Disruptive Technology that Could Transform Government-Citizen Relationships


David Raths at GovTech: “William Gibson, the science fiction writer who coined the term ‘cyberspace,’ once said: ‘The future is already here — it’s just not very evenly distributed.’ That may be exactly the way to look at the selection of disruptive technologies we have chosen to highlight in eight critical areas of government, ranging from public safety to health to transportation….

PUBLIC SAFETY: WEARABLE TECH IS TRANSFORMING EMERGENCY RESPONSE

The wearable technology market is expected to grow from $20 billion in 2015 to almost $70 billion in 2025, according to research firm IDTechEx. As commercial applications bloom, more will find their way into the public sector and emergency response.

This year has seen an increase in the number of police departments using body cameras. And already under development are wireless devices that monitor a responder’s breathing, heart rate and blood pressure, as well as potentially harmful environmental conditions, and relay concerns back to incident command.

But rather than sitting back and waiting for the market to develop, the U.S. Department of Homeland Security is determined to spur innovation in the field. DHS’ research and development arm is funding a startup accelerator program called Emerge, managed by the Center for Innovative Technology (CIT), a Virginia-based nonprofit. Two accelerators, in Texas and Illinois, will work with 10 to 15 startups this year to develop wearable products and adapt them for first responder use….

HEALTH & HUMAN SERVICES: ‘HOT-SPOTTING’ FOR POPULATION HEALTH MANAGEMENT

A hot health-care trend is population health management: using data to improve health at a community level as well as an individual level. The growth in sophistication of GIS tools has allowed public health researchers to more clearly identify and start addressing health resource disparities.

Dr. Jeffrey Brenner, a Camden, N.J.-based physician, uses data gathered in a health information exchange (HIE) to target high-cost individuals. The Camden Coalition of Healthcare Providers uses the HIE data to identify high-cost “hot spots” — high-rise buildings where a large number of hospital emergency room “super users” live. By identifying and working with these individuals on patient-centered care coordination issues, the coalition has been able to reduce emergency room use and in-patient stays….
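The coalition's approach can be illustrated with a toy sketch: flag emergency-room "super users" from anonymized visit records, then group them by geocoded address to surface candidate hot spots. The data, field names, and thresholds below are invented for illustration; they are not the Camden Coalition's actual method or schema.

```python
from collections import Counter, defaultdict

# Hypothetical, anonymized HIE records: one (patient_id, address) pair per ER visit.
visits = [
    ("p1", "101 Main St"), ("p1", "101 Main St"), ("p1", "101 Main St"),
    ("p2", "101 Main St"), ("p2", "101 Main St"), ("p2", "101 Main St"),
    ("p3", "55 Oak Ave"),  ("p4", "101 Main St"),
]

def find_hot_spots(visits, super_user_threshold=3, min_super_users=2):
    # A "super user" is a patient with at least `super_user_threshold` ER visits.
    visit_counts = Counter(pid for pid, _ in visits)
    super_users = {pid for pid, n in visit_counts.items() if n >= super_user_threshold}

    # Group super users by geocoded address to surface candidate hot spots:
    # buildings where several super users live.
    by_address = defaultdict(set)
    for pid, addr in visits:
        if pid in super_users:
            by_address[addr].add(pid)

    return {addr: users for addr, users in by_address.items()
            if len(users) >= min_super_users}

print(find_hot_spots(visits))  # flags "101 Main St", home to super users p1 and p2
```

In practice the grouping would run over geocoded HIE claims data rather than a hand-written list, but the shape of the analysis, counting per patient and then clustering by location, is the same.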

PARKS & RECREATION: TRACKING TREES FOR A BETTER FUTURE

A combination of advances in mobile data collection systems and geocoding lets natural resources and parks agencies be more proactive about collecting tree data, managing urban forests and quantifying their value, as forests become increasingly important resources in an era of climate change.

Philadelphia Parks and Recreation has added approximately 2 million trees to its database in the past few years. It plans to create a digital management system for all of them. Los Angeles City Parks uses the Davey Tree Expert Co.’s Web-based TreeKeeper management software to manage existing tree inventories and administer work orders. The department can also more easily look at species balance to manage against pests, disease and drought….

CORRECTIONS: VIDEO-BASED TOOLS TRANSFORM PRISONS AND JAILS

Videoconferencing is disrupting business as usual in U.S. jails and prisons in two ways: One is the rising use of telemedicine to reduce inmate health-care costs and to increase access to certain types of care for prisoners. The other is video visitation between inmates and families.

A March 2015 report by Southern California Public Radio noted that the federal court-appointed receiver overseeing inmate health care in California is reviewing telemedicine capabilities to reduce costly overtime billing by physicians and nurses at prisons. In one year, overtime has more than doubled for this branch of corrections, from more than $12 million to nearly $30 million….

FINANCE & BUDGETING: DATA PORTALS OFFER TRANSPARENCY AT UNPRECEDENTED LEVELS

The transparency and open data movements have hit the government finance sector in a big way and promise to be an area of innovation in the years ahead.

A partnership between Ohio Treasurer Josh Mandel and the finance visualization startup OpenGov will result in one of the most sweeping statewide transparency efforts to date.

The initiative offers 3,900-plus local governments — from townships, cities and counties to school districts and more — a chance to place revenues and expenditures online free of charge through the state’s budget transparency site OhioCheckbook.com. Citizens will be able to track local government revenues and expenditures via interactive graphs that illustrate not only a bird’s-eye view of a budget, but also the granular details of check-by-check spending….

DMV: DRIVERS’ LICENSES: THERE WILL SOON BE AN APP FOR THAT

The laminated driver’s license you keep in your wallet may eventually give way to an app on your smartphone, and that change may have wider significance for how citizens interact digitally with their government. Legislatures in at least three states have seen bills introduced authorizing their transportation departments to begin piloting digital drivers’ licenses….

TRANSPORTATION & MASS TRANSIT: BIG BREAKTHROUGHS ARE JUST AROUND THE CORNER

Nothing is likely to be more disruptive to transportation, mass transit and urban planning than the double whammy of connected vehicle technology and autonomous vehicles.

The U.S. Department of Transportation expects great things from the connected vehicles of the future — and that future may be just around the corner. Vehicle-to-infrastructure communication capabilities and anonymous information from passengers’ wireless devices relayed through dedicated short-range connections could provide transportation agencies with improved traffic, transit and parking data, making it easier to manage transportation systems and improve traffic safety….(More)”

Scientists Are Hoarding Data And It’s Ruining Medical Research


Ben Goldacre at Buzzfeed: “We like to imagine that science is a world of clean answers, with priestly personnel in white coats, emitting perfect outputs, from glass and metal buildings full of blinking lights.

The reality is a mess. A collection of papers published on Wednesday — on one of the most commonly used medical treatments in the world — show just how bad things have become. But they also give hope.

The papers are about deworming pills that kill parasites in the gut, at extremely low cost. In developing countries, battles over the usefulness of these drugs have become so contentious that some people call them “The Worm Wars.”…

This “deworm everybody” approach has been driven by a single, hugely influential trial published in 2004 by two economists, Edward Miguel and Michael Kremer. This trial, done in Kenya, found that deworming whole schools improved children’s health, school performance, and school attendance. What’s more, these benefits apparently extended to children in schools several miles away, even when those children didn’t get any deworming tablets (presumably, people assumed, by interrupting worm transmission from one child to the next).

A decade later, in 2013, these two economists did something that very few researchers have ever done. They handed over their entire dataset to independent researchers on the other side of the world, so that their analyses could be checked in public. What happened next has every right to kick through a revolution in science and medicine….

This kind of statistical replication is almost vanishingly rare. A recent study set out to find all well-documented cases in which the raw data from a randomized trial had been reanalysed. It found just 37, out of many thousands. What’s more, only five were conducted by entirely independent researchers, people not involved in the original trial.

These reanalyses were more than mere academic fun and games. The ultimate outcomes of the trials changed, with terrifying frequency: One-third of them were so different that the take-home message of the trial shifted.

This matters. Medical trials aren’t conducted out of an abstract philosophical interest, for the intellectual benefit of some rarefied class in ivory towers. Researchers do trials as a service, to find out what works, because they intend to act on the results. It matters that trials get an answer that is not just accurate, but also reliable.

So here we have an odd situation. Independent reanalysis can improve the results of clinical trials, and help us not go down blind alleys, or give the wrong treatment to the wrong people. It’s pretty cheap, compared to the phenomenal administrative cost of conducting a trial. And it spots problems at an alarmingly high rate.

And yet, this kind of independent check is almost never done. Why not? Partly, it’s resources. But more than that, when people do request raw data, all too often the original researchers duck, dive, or simply ignore requests….

Two years ago I published a book on problems in medicine. Front and center in this howl was “publication bias,” the problem of clinical trial results being routinely and legally withheld from doctors, researchers, and patients. The best available evidence — from dozens of studies chasing results for completed trials — shows that around half of all clinical trials fail to report their results. The same is true of both industry trials and academic trials. What’s more, trials with positive results are about twice as likely to post results, so we see a biased half of the literature.

This is a cancer at the core of evidence-based medicine. When half the evidence is withheld, doctors and patients cannot make informed decisions about which treatment is best. When I wrote about this, various people from the pharmaceutical industry cropped up to claim that the problem was all in the past. So I befriended some campaigners, we assembled a group of senior academics, and started the AllTrials.net campaign with one clear message: “All trials must be registered, with their full methods and results reported.”

Dozens of academic studies had been published on the issue, and that alone clearly wasn’t enough. So we started collecting signatures, and we now have more than 85,000 supporters. At the same time we sought out institutional support. Eighty patient groups signed up in the first month, with hundreds more since then. Some of the biggest research funders, and even government bodies, have now signed up.

This week we’re announcing support from a group of 85 pension funds and asset managers, representing more than 3.5 trillion euros in funds, who will be asking the pharma companies they invest in to make plans to ensure that all trials — past, present, and future — report their results properly. Next week, after two years of activity in Europe, we launch our campaign in the U.S….(More)”

Innovation Experiments: Researching Technical Advance, Knowledge Production and the Design of Supporting Institutions


Paper by Kevin J. Boudreau and Karim Lakhani: “This paper discusses several challenges in designing field experiments to better understand how organizational and institutional design shapes innovation outcomes and the production of knowledge. We proceed to describe the field experimental research program carried out by our Crowd Innovation Laboratory at Harvard University to clarify how we have attempted to address these research design challenges. This program has simultaneously solved important practical innovation problems for partner organizations, like NASA and Harvard Medical School, while contributing research advances, particularly in relation to innovation contests and tournaments….(More)”

White House to make public records more public


Lisa Rein at the Washington Post: “The law that’s supposed to keep citizens in the know about what their government is doing is about to get more robust.

This week, seven agencies — including the Environmental Protection Agency and the Office of the Director of National Intelligence — launched a new effort to put online the records they distribute to requesters under the Freedom of Information Act (FOIA).

So if a journalist, nonprofit group or corporation asks for the records, what they see, the public also will see. Documents still will be redacted where necessary to protect what the government decides is sensitive information, an area that’s often disputed but won’t change with this policy.

The Obama administration’s new Open Government initiative began quietly on the agencies’ Web sites days after FOIA’s 49th anniversary. It’s a response to years of pressure from open-government groups and lawmakers to boost public access to records of government decisions, deliberations and policies.

The “release to one is release to all” policy will start as a six-month pilot at the EPA, the Office of the Director of National Intelligence, the Millennium Challenge Corporation and within some offices at the Department of Homeland Security, the Defense Department, the Justice Department and the National Archives and Records Administration….(More)”

Democratising the Data Revolution


Jonathan Gray at Open Knowledge: “What will the “data revolution” do? What will it be about? What will it count? What kinds of risks and harms might it bring? Whom and what will it serve? And who will get to decide?

Today we are launching a new discussion paper on “Democratising the Data Revolution”, which is intended to advance thinking and action around civil society engagement with the data revolution. It looks beyond the disclosure of existing information, towards more ambitious and substantive forms of democratic engagement with data infrastructures.

It concludes with a series of questions about what practical steps institutions and civil society organisations might take to change what is measured and how, and how these measurements are put to work.

You can download the full PDF report here, or continue to read on in this blog post.

What Counts?

How might civil society actors shape the data revolution? In particular, how might they go beyond the question of what data is disclosed towards looking at what is measured in the first place? To kickstart discussion around this topic, we will look at three kinds of intervention: changing existing forms of measurement, advocating new forms of measurement and undertaking new forms of measurement.

Changing Existing Forms of Measurement

Rather than just focusing on the transparency, disclosure and openness of public information, civil society groups can argue for changing what is measured with existing data infrastructures. One example of this is recent campaigning around company ownership in the UK. Advocacy groups wanted to unpick networks of corporate ownership and control in order to support their campaigning and investigations around tax avoidance, tax evasion and illicit financial flows.

While the UK company register recorded information about “nominal ownership”, it did not include information about so-called “beneficial ownership”, or who ultimately benefits from the ownership and control of companies. Campaigners undertook an extensive programme of activities to advocate for changes and extensions to existing data infrastructures – including via legislation, software systems, and administrative protocols.

Advocating New Forms of Measurement

As well as changing or recalibrating existing forms of measurement, campaigners and civil society organisations can make the case for the measurement of things which were not previously measured. For example, over the past several decades social and political campaigning has resulted in new indicators about many different issues – such as gender inequality, health, work, disability, pollution or education. In such cases activists aimed to establish a given indicator as important and relevant for public institutions, decision makers, and broader publics – in order to, for example, inform policy development or resource allocation.

Undertaking New Forms of Measurement

Historically, many civil society organisations and advocacy groups have collected their own data to make the case for action on issues that they work on – from human rights abuses to endangered species….(More)”

The case for data ethics


Steven Tiell at Accenture: “Personal data is the coin of the digital realm, which for business leaders creates a critical dilemma. Companies are being asked to gather more types of data faster than ever to maintain a competitive edge in the digital marketplace; at the same time, however, they are being asked to provide pervasive and granular control mechanisms over the use of that data throughout the data supply chain.

The stakes couldn’t be higher. If organizations, or the platforms they use to deliver services, fail to secure personal data, they expose themselves to tremendous risk—from eroding brand value and the hard-won trust of established vendors and customers to ceding market share, from violating laws to costing top executives their jobs.

To distinguish their businesses in this marketplace, leaders should be asking themselves two questions. What are the appropriate standards and practices our company needs to have in place to govern the handling of data? And how can our company make strong data controls a value proposition for our employees, customers and partners?

Defining effective compliance activities to support legal and regulatory obligations can be a starting point. However, mere compliance with existing regulations—which are, for the most part, focused on privacy—is insufficient. Respect for privacy is a byproduct of high ethical standards, but it is only part of the picture. Companies need to embrace data ethics, an expansive set of practices and behaviors grounded in a moral framework for the betterment of a community (however defined).

RAISING THE BAR

Why ethics? When communities of people—in this case, the business community at large—encounter new influences, the way they respond to and engage with those influences becomes the community’s shared ethics. Individuals who behave in accordance with these community norms are said to be moral, and those who are exemplary are able to gain the trust of their community.

Over time, as ethical standards within a community shift, the bar for trustworthiness is raised on the assumption that participants in civil society must, at a minimum, adhere to the rule of law. And thus, to maintain moral authority and a high degree of trust, actors in a community must constantly evolve to adopt the highest ethical standards.

Actors in the big data community, where security and privacy are at the core of relationships with stakeholders, must adhere to a high ethical standard to gain this trust. This requires them to go beyond privacy law and existing data control measures. It will also reward those who practice strong ethical behaviors and a high degree of transparency at every stage of the data supply chain. The most successful actors will become the platform-based trust authorities, and others will depend on these platforms for disclosure, sharing and analytics of big data assets.

Data ethics becomes a value proposition only once controls and capabilities are in place to granularly manage data assets at scale throughout the data supply chain. It is also beneficial when a community shares the same behavioral norms and taxonomy to describe the data itself, the ethical decision points along the data supply chain, and how those decisions lead to beneficial or harmful impacts….(More)”

Open Innovation, Open Science, Open to the World


Speech by Carlos Moedas, EU Commissioner for Research, Science and Innovation: “On 25 April this year, an earthquake of magnitude 7.3 hit Nepal. To get real-time geographical information, the response teams used an online mapping tool called Open Street Map. Open Street Map has created an entire online map of the world using local knowledge, GPS tracks and donated sources, all provided on a voluntary basis. It is open license for any use.

Open Street Map was created in 2004 by a 24-year-old computer science student at University College London; it now has 2 million users and has been used for many digital humanitarian and commercial purposes, from the earthquakes in Haiti and Nepal to the Ebola outbreak in West Africa.

This story is one of many that demonstrate that we are moving into a world of open innovation and user innovation. A world where the digital and physical are coming together. A world where new knowledge is created through global collaborations involving thousands of people from across the world and from all walks of life.

Ladies and gentlemen, over the next two days I would like us to chart a new path for European research and innovation policy. A new strategy that is fit for purpose for a world that is open, digital and global. And I would like to set out at the start of this important conference my own ambitions for the coming years….

Open innovation is about involving far more actors in the innovation process, from researchers, to entrepreneurs, to users, to governments and civil society. We need open innovation to capitalise on the results of European research and innovation. This means creating the right ecosystems, increasing investment, and bringing more companies and regions into the knowledge economy. I would like to go further and faster towards open innovation….

I am convinced that excellent science is the foundation of future prosperity, and that openness is the key to excellence. We are often told that it takes many decades for scientific breakthroughs to find commercial application.

Let me tell you a story which shows the opposite. Graphene was first isolated in the laboratory by Profs. Geim and Novoselov at the University of Manchester in 2003 (Nobel Prizes 2010). The development of graphene has since benefitted from major EU support, including ERC grants for Profs. Geim and Novoselov. So I am proud to show you one of the new graphene products that will soon be available on the market.

This light bulb uses the unique thermal dissipation properties of graphene to achieve greater energy efficiencies and a longer lifetime than LED bulbs. It was developed by a spin-out company from the University of Manchester, called Graphene Lighting, and is expected to go on sale by the end of the year.

But we must not be complacent. If we look at indicators of the most excellent science, we find that Europe is not top of the rankings in certain areas. Our ultimate goal should always be to promote excellence not only through ERC and Marie Skłodowska-Curie but throughout the entire H2020.

For such an objective we have to move forward on two fronts:

First, we are preparing a call for a European Science Cloud project in order to explore the possibility of creating a cloud for our scientists. We need more open access to research results and the underlying data. Open access publication is already a requirement under Horizon 2020, but we now need to look seriously at open data…

When innovators like LEGO start fusing real bricks with digital magic, when citizens conduct their own R&D through online community projects, when doctors start printing live tissues for patients … Policymakers must follow suit…(More)”

In Brazil, missing persons posters automatically print in nearby homes


Springwise: “In Brazil, 200,000 people go missing each year and posters are still one of the most effective ways of locating them. Now, Mães da Sé NGO — the nonprofit dedicated to helping families find missing people — has partnered with HP for the Print for Help campaign. Using HP’s ePrint technology, which facilitates mobile and email printing, the campaign will help spread the word about missing individuals by automatically producing posters on nearby home printers.

By registering printers with citizens’ email addresses and zip codes, Mães da Sé is able to email and automatically print posters in the area where a person went missing. Citizens can then assist with the search by displaying the poster in busy areas of their neighborhood, creating a network of helpers. Streamlining and speeding up poster printing maximizes the search effort in the first few hours, which are crucial….(More)”.
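The matching logic behind such a campaign might look like the following sketch. The registry fields, the CEP-prefix proximity rule, and the dispatch function are all assumptions for illustration; the one piece taken from the source is that HP ePrint printers accept documents by email and print them automatically.

```python
# Hypothetical printer registry built from citizen sign-ups: each
# ePrint-enabled printer has an email address and a Brazilian zip code (CEP).
printer_registry = [
    {"email": "home1@eprint.example", "zip": "01310-100"},
    {"email": "home2@eprint.example", "zip": "01310-200"},
    {"email": "home3@eprint.example", "zip": "04001-000"},
]

def printers_near(registry, incident_zip, prefix_len=5):
    # Crude proximity rule for illustration: match on the first digits of
    # the CEP, which roughly groups printers by neighborhood.
    prefix = incident_zip[:prefix_len]
    return [p["email"] for p in registry if p["zip"][:prefix_len] == prefix]

def dispatch_poster(registry, incident_zip, poster_pdf):
    # Emailing a document to an ePrint address prints it automatically;
    # here we just return the (address, attachment) pairs that would be sent.
    return [(email, poster_pdf)
            for email in printers_near(registry, incident_zip)]

# A person goes missing near CEP 01310-940: the two registered printers
# in that neighborhood each receive (and print) the poster.
print(dispatch_poster(printer_registry, "01310-940", "missing_person_123.pdf"))
```

A production system would presumably use real geocoding rather than a zip-prefix match, but the core idea, routing one document to many registered printers filtered by location, is this simple.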

Introducing the Governance Data Alliance


“The overall assumption of the Governance Data Alliance is that governance data can contribute to improved sustainable economic and human development outcomes and democratic accountability in all countries. The contribution that governance data will make to those outcomes will of course depend on a whole range of issues that will vary across contexts; development processes, policy processes, and the role that data plays vary considerably. Nevertheless, there are some core requirements that need to be met if data is to make a difference, and articulating them can provide a framework to help us understand and improve the impact that data has on development and accountability across different contexts.

We also collectively make another implicit (and important) assumption: that the current state of affairs is vastly insufficient when it comes to the production and usage of high-quality governance data. In other words, the status quo needs to be significantly improved upon. Data gathered from participants in the April 2014 design session help to paint that picture in granular terms. Data production remains highly irregular and ad hoc; data usage does not match data production in many cases (e.g. users want data that don’t exist and do not use data that is currently produced); production costs remain high and inconsistent across producers despite possibilities for economies of scale; and feedback loops between governance data producers and governance data users are either non-existent or rarely employed. We direct readers to http://dataalliance.globalintegrity.org for a fuller treatment of those findings.

Three requirements need to be met if governance data is to lead to better development and accountability outcomes, whether those outcomes are about core “governance” issues such as levels of inclusion, or about service delivery and human development outcomes that may be shaped by the quality of governance. Those requirements are:

  • The availability of governance data.
  • The quality of governance data, including its usability and salience.
  • The informed use of governance data.

(Or to use the metaphor of markets, we face a series of market failures: supply of data is inconsistent and not uniform; user demand cannot be efficiently channeled to suppliers to redirect their production to address those deficiencies; and transaction costs abound through non-existent data standards and lack of predictability.)

If data are not available about those aspects of governance that are expected to have an impact on development outcomes and democratic accountability, no progress will be made. The risk is that data about key issues will be lacking, or that there will be gaps in coverage, whether country coverage, time periods covered, or sectors, or that data sets produced by different actors may not be comparable. This might come about for reasons including the following: a lack of knowledge – amongst producers, and amongst producers and users – about what data is needed and what data is available; high costs, and limited resources to invest in generating data; and, institutional incentives and structures (e.g. lack of autonomy, inappropriate mandate, political suppression of sensitive data, organizational dysfunction – relating, for instance, to National Statistical Offices) that limit the production of governance data….

What A Governance Data Alliance Should Do (Or, Making the Market Work)

During the several months of creative exploration around possibilities for a Governance Data Alliance, dozens of activities were identified as possible solutions (in whole or in part) to the challenges identified above. This note identifies what we believe to be the most important and immediate activities that an Alliance should undertake, knowing that other activities can and should be rolled into an Alliance work plan in the out years as the initiative matures and early successes (and failures) are achieved and digested.

A brief summary of the proposals that follow:

  1. Design and implement a peer-to-peer training program between governance data producers to improve the quality and salience of existing data.
  2. Develop a lightweight data standard to be adopted by producer organizations to make it easier for users to consume governance data.
  3. Mine the 2014 Reform Efforts Survey to understand who actually uses which governance data, currently, around the world.
  4. Leverage the 2014 Reform Efforts Survey “plumbing” to field customized follow-up surveys to better assess what data users seek in future governance data.
  5. Pilot (on a regional basis) coordinated data production amongst producer organizations to fill coverage gaps, reduce redundancies, and respond to actual usage and user preferences….(More)”
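To make proposal 2 concrete, a "lightweight" standard could be as little as a handful of required metadata fields plus a validation check that producers run before publishing. The field names below are hypothetical, sketched for illustration; they are not the Alliance's actual schema.

```python
# Illustrative only: the minimal metadata record a lightweight
# governance-data standard might require of every published dataset.
REQUIRED_FIELDS = {"title", "producer", "country", "period", "license", "url"}

def validate_record(record):
    # Return the set of required fields the record is missing
    # (empty set means the record conforms).
    return REQUIRED_FIELDS - record.keys()

record = {
    "title": "Example Governance Indicator Survey",
    "producer": "Example Producer Organization",
    "country": "KE",
    "period": "2015",
    "license": "CC-BY-4.0",
    "url": "https://example.org/data.csv",
}

print(validate_record(record))  # empty set: the record is complete
```

Even a check this small lowers transaction costs for users: any dataset carrying these fields can be discovered, filtered by country and period, and reused under a known license without contacting the producer.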

Advancing Collaboration Theory: Models, Typologies, and Evidence


New book edited by John C. Morris and Katrina Miller-Stevens: “The term collaboration is widely used but not clearly understood or operationalized. However, collaboration is playing an increasingly important role between and across public, nonprofit, and for-profit sectors. Collaboration has become a hallmark in both intragovernmental and intergovernmental relationships. As collaboration scholarship rapidly emerges, it diverges into several directions, resulting in confusion about what collaboration is and what it can be used to accomplish. This book provides much needed insight into existing ideas and theories of collaboration, advancing a revised theoretical model and accompanying typologies that further our understanding of collaborative processes within the public sector.

Organized into three parts, each chapter presents a different theoretical approach to public problems, valuing the collective insights that result from honoring many individual perspectives. Case studies in collaboration, split across three levels of government, offer additional perspectives on unanswered questions in the literature. Contributions are made by authors from a variety of backgrounds, including an attorney, a career educator, a federal executive, a human resource administrator, a police officer, a self-employed entrepreneur, as well as scholars of public administration and public policy. Drawing upon the individual experiences offered by these perspectives, the book emphasizes the commonalities of collaboration. It is from this common ground, the shared experiences forged among seemingly disparate interactions that advances in collaboration theory arise.

Advancing Collaboration Theory offers a unique compilation of collaborative models and typologies that enhance the existing understanding of public sector collaboration….(More)”