Data Stewards: Data Leadership to Address the Challenges of the 21st Century


The GovLab at the NYU Tandon School of Engineering is pleased to announce the launch of its Data Stewards website — a new portal for connecting organizations across sectors that seek to promote responsible data leadership that can address the challenges of the 21st century — developed with generous support from the William and Flora Hewlett Foundation.

Increasingly, the private sector is collaborating with the public sector and researchers on ways to use private-sector data and analytical expertise for public good. With these new practices of data collaboration comes the need to reimagine the roles and responsibilities that steer the process of using this data, and the insights it can generate, to address society’s biggest questions and challenges: Data Stewards.

Today, establishing and sustaining these new collaborative and accountable approaches requires significant, time-consuming effort and investment of resources from both data holders on the supply side and the institutions that represent the demand side. By establishing Data Stewardship as a function — recognized within the private sector as a valued responsibility — the practice of Data Collaboratives can become more predictable, scalable, sustainable and de-risked.

Together with BrightFront Group and Adapt, we are:

  • Exploring the needs and priorities of current private sector Data Stewards who act as change agents within their firms. Responsible for determining what, when, how and with whom to share private data for public good, these individuals are critical catalysts for ensuring insights are turned into action.
  • Identifying and connecting existing Data Stewards across sectors and regions to create an online and in-person community for exchanging knowledge and best practices.
  • Developing methodologies, tools and frameworks to use data more responsibly, systematically and efficiently to decrease the transaction cost, time and energy currently needed to establish Data Collaboratives.

To learn more about the Data Stewards Initiative, including new insights, ideas, tools and information about the Data Steward of the Year Award program, visit datastewards.net.

If you are a Data Steward, or would like to join a community of practice to learn from your peers, please contact datastewards@thegovlab.org to join the Network of Data Stewards.

For more information about The GovLab, visit thegovlab.org.

Using Blockchain Technology to Create Positive Social Impact


Randall Minas in Healthcare Informatics: “…Healthcare is yet another area where blockchain can make a substantial impact. Blockchain technology could be used to enable the WHO and CDC to better monitor disease outbreaks over time by creating distributed “ledgers” that are both secure and updated hundreds of times per day. Issued in near real-time, these updates would alert healthcare professionals to spikes in local cases almost immediately. Additionally, using blockchain would allow accurate diagnosis and streamline the isolation of clusters of cases as quickly as possible. Providing blocks of real-time disease information—especially in urban areas—would be invaluable.

In the United States, disease updates are provided in a Morbidity and Mortality Weekly Report (MMWR) from the CDC. This weekly report provides tables of current disease trends for hospitals and public health officials. Another disease reporting mechanism is the National Outbreak Reporting System (NORS), launched in 2009. NORS’ web-based tool provides outbreak data through 2016 and is accessible to the general public. There are two current weaknesses in the NORS reporting system and both can be addressed by blockchain technology.

The first issue lies in the number of steps required to accurately report each outbreak. A health department reports an outbreak to the NORS system; the CDC checks it for accuracy, analyzes the data, and then provides a summary via the MMWR. If blockchain were instantiated as the technology through which NORS data is reported, every health department in the country could have preliminary data on disease trends at its fingertips without having to wait for the next MMWR publication.

The second issue is the inherent cybersecurity vulnerability of using a web-based platform to monitor disease reporting. As we have seen with cyberattacks both at home and abroad, cybersecurity vulnerabilities underlie most of our modern-day computing infrastructure. Blockchain was designed to be secure: it is decentralized across many computer networks and, because it functions as a digital ledger, the previous data (or “blocks”) in the chain are difficult to alter.
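The tamper-evidence property the author relies on can be sketched in a few lines: each new block stores the hash of the previous one, so rewriting an earlier disease report invalidates everything that follows it. The sketch below is illustrative only; the report fields and class names are assumptions, not part of any existing NORS or CDC system.

```python
import hashlib
import json
import time


class OutbreakLedger:
    """Minimal append-only hash chain showing why earlier blocks are hard to alter."""

    FIELDS = ("index", "timestamp", "report", "prev_hash")

    def __init__(self):
        self.chain = []

    def _hash(self, block: dict) -> str:
        # Hash covers the block's contents and the previous block's hash,
        # chaining every block to the one before it.
        payload = json.dumps({k: block[k] for k in self.FIELDS}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def add_report(self, report: dict) -> dict:
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        block = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "report": report,  # e.g. {"region": "...", "disease": "...", "cases": 12}
            "prev_hash": prev_hash,
        }
        block["hash"] = self._hash(block)
        self.chain.append(block)
        return block

    def verify(self) -> bool:
        """Recompute each hash; tampering with any earlier block breaks the chain."""
        for i, block in enumerate(self.chain):
            if block["hash"] != self._hash(block):
                return False
            if i > 0 and block["prev_hash"] != self.chain[i - 1]["hash"]:
                return False
        return True


ledger = OutbreakLedger()
ledger.add_report({"region": "County A", "disease": "influenza", "cases": 12})
ledger.add_report({"region": "County B", "disease": "influenza", "cases": 7})
assert ledger.verify()

# Altering an earlier report is immediately detectable.
ledger.chain[0]["report"]["cases"] = 1200
assert not ledger.verify()
```

A production system would add consensus across many health-department nodes plus access controls; the sketch demonstrates only the tamper-evidence that makes earlier blocks difficult to alter.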

Just as malware has been used to gain access to electricity and water infrastructure, the NORS platform could be hacked; instituting blockchain technology would limit the potential damage of such malware, given the inherent security of the technology. If this does not sound important, imagine the damage and ensuing panic that could be caused by a compromised NORS reporting a widespread Ebola outbreak.

The use of blockchain in monitoring epidemic outbreaks might not only apply to fast-spreading outbreaks like the flu, but also to epidemics that have lasted for decades. Since blockchain allows an unchangeable snapshot of data over time and can be anonymous, partner organizations could provide HIV test results to an individual’s “digital ledger” with a date of the test and the results.

Individuals could then exchange their HIV status securely, in an application, before engaging in high-risk behaviors. Since many municipalities provide free or low-cost, anonymous HIV testing, the use of blockchain would allow disease monitoring and exchange of status in a secure and trusted manner. The LGBTQ community and other high-risk communities could use an application to securely exchange HIV status with potential partners. With widespread adoption of this status-exchange system, an individual’s high-risk exposure could be limited, further reducing the spread of the epidemic.

While much of the creative application around blockchain has focused on supply chain-like models, including distribution of renewable energy and local sourcing of goods, it is important also to think innovatively about how blockchain can be used outside of supply chain and accounting.

In healthcare, blockchain has been discussed frequently in relation to electronic health records (EHRs), yet even that could be underappreciating the technology’s potential. Leaders in the blockchain arena should invest in application development for epidemic monitoring and disease control using blockchain technology. …(More)”.

Bringing The Public Back In: Can the Comment Process be Fixed?


Remarks of Commissioner Jessica Rosenworcel, US Federal Communications Commission: “…But what we are facing now does not reflect what has come before.  Because it is apparent the civic infrastructure we have for accepting public comment in the rulemaking process is not built for the digital age.  As the Administrative Conference of the United States acknowledges, while the basic framework for rulemaking from 1946 has stayed the same, “the technological landscape has evolved dramatically.”

Let’s call that an understatement.  Though this problem may seem small in the scheme of things, the impact is big.  Administrative decisions made in Washington affect so much of our day-to-day life.  They involve everything from internet openness to retirement planning to the availability of loans and the energy sources that power our homes and businesses.  So much of the decision making that affects our future takes place in the administrative state.

The American public deserves a fair shot at participating in these decisions.  Expert agencies are duty bound to hear from everyone, not just those who can afford to pay for expert lawyers and lobbyists.  The framework from the Administrative Procedure Act is designed to serve the public—by seeking their input—but increasingly they are getting shut out.  Our agency internet systems are ill-equipped to handle the mass automation and fraud that already is corrupting channels for public comment.  It’s only going to get worse.  The mechanization and weaponization of the comment-filing process has only just begun.

We need to do something about it.  Because ensuring the public has a say in what happens in Washington matters.  Because trust in public institutions matters.  A few months ago Edelman released its annual Trust Barometer and reported that only a third of Americans trust the government—a 14 percentage point decline from last year.

Fixing that decline is worth the effort.  We can start with finding ways that give all Americans—no matter who they are or where they live—a fighting chance at making Washington listen to what they think.

We can’t give in to the easy cynicism that results when our public channels are flooded with comments from dead people, stolen identities, batches of bogus filings, and commentary that originated from Russian e-mail addresses.  We can’t let this deluge of fake filings further delegitimize Washington decisions and erode public trust.

No one said digital age democracy was going to be easy.  But we’ve got to brace ourselves and strengthen our civic infrastructure to withstand what is underway.  This is true at regulatory agencies—and across our political landscape.  Because if you look for them you will find uneasy parallels between the flood of fake comments in regulatory proceedings and the barrage of posts on social media that was part of a conspicuous campaign to influence our last election.  There is a concerted effort to exploit our openness.  It deserves a concerted response….(More)”

The world’s first neighbourhood built “from the internet up”


The Economist: “Quayside, an area of flood-prone land stretching for 12 acres (4.8 hectares) on Toronto’s eastern waterfront, is home to a vast, pothole-filled parking lot, low-slung buildings and huge soyabean silos—a crumbling vestige of the area’s bygone days as an industrial port. Many consider it an eyesore but for Sidewalk Labs, an “urban innovation” subsidiary of Google’s parent company, Alphabet, it is an ideal location for the world’s “first neighbourhood built from the internet up”.

Sidewalk Labs is working in partnership with Waterfront Toronto, an agency representing the federal, provincial and municipal governments that is responsible for developing the area, on a $50m project to overhaul Quayside. It aims to make it a “platform” for testing how emerging technologies might ameliorate urban problems such as pollution, traffic jams and a lack of affordable housing. Its innovations could be rolled out across an 800-acre expanse of the waterfront—an area as large as Venice.

Sidewalk Labs is planning pilot projects across Toronto this summer to test some of the technologies it hopes to employ at Quayside; this is partly to reassure residents. If its detailed plan is approved later this year (by Waterfront Toronto and also by various city authorities), it could start work at Quayside in 2020.

That proposal contains ideas ranging from the familiar to the revolutionary. There will be robots delivering packages and hauling away rubbish via underground tunnels; a thermal energy grid that does not rely on fossil fuels; modular buildings that can shift from residential to retail use; adaptive traffic lights; and snow-melting sidewalks. Private cars would be banned; a fleet of self-driving shuttles and robotaxis would roam freely. Google’s Canadian headquarters would relocate there.

Undergirding Quayside would be a “digital layer” with sensors tracking, monitoring and capturing everything from how park benches are used to levels of noise to water use by lavatories. Sidewalk Labs says that collecting, aggregating and analysing such volumes of data will make Quayside efficient, liveable and sustainable. Data would also be fed into a public platform through which residents could, for example, allow maintenance staff into their homes while they are at work.

Similar “smart city” projects, such as Masdar in the United Arab Emirates or South Korea’s Songdo, have spawned lots of hype but are not seen as big successes. Many experience delays because of shifting political and financial winds, or because those overseeing their construction fail to engage locals in the design of communities, says Deland Chan, an expert on smart cities at Stanford University. Dan Doctoroff, the head of Sidewalk Labs, who was deputy to Michael Bloomberg when the latter was mayor of New York City, says that most projects flop because they fail to cross what he terms “the urbanist-technologist divide”.

That divide, between tech types and city-planning specialists, will also need to be bridged before Sidewalk Labs can stick a shovel in the soggy ground at Quayside. Critics of the project worry that in a quest to become a global tech hub, Toronto’s politicians may give it too much freedom. Sidewalk Labs’s proposal notes that the project needs “substantial forbearances from existing [city] laws and regulations”….(More)”.

Government to establish a ‘National Data Commissioner’


Rohan Pearce at Computerworld – Australia: “A new position of the ‘National Data Commissioner’ will be established as part of a $65 million, four-year open data push by the federal government.

The creation of the new position is part of the government’s response to the Productivity Commission inquiry into the availability and use of public and private data by individuals and organisations.

The government in November revealed that it would legislate a new Consumer Data Right as part of its response to the PC’s recommendations. The government said that this will allow individuals to access data relating to their banking, energy, phone and Internet usage, potentially making it easier to compare and switch between service providers.

The Office of the Australian Information Commissioner and the Australian Competition and Consumer Commission will have oversight of the Consumer Data Right.

The government said today it would introduce a new data sharing and release framework to streamline the way government data is made available for use by researchers and public and private sector organisations.

The framework’s aim will be to promote the greater use of data and drive related economic and innovation benefits as well as to “Build trust with the Australian community about the government’s use of data”.

The government said it would push a risk-based approach to releasing publicly funded data sets.

The National Data Commissioner will be supported by a National Data Advisory Council. The council “will advise the National Data Commissioner on ethical data use, technical best practice, and industry and international developments.”…(More)”.

Behavioral Economics: Are Nudges Cost-Effective?


Carla Fried at UCLA Anderson Review: “Behavioral science does not suffer from a lack of academic focus. A Google Scholar search for the term delivers more than three million results.

While there is an abundance of research into how human nature can muck up our decision making process and the potential for well-placed nudges to help guide us to better outcomes, the field has kept rather mum on a basic question: Are behavioral nudges cost-effective?

That’s an ever more salient question as the art of the nudge is increasingly being woven into public policy initiatives. In 2009, the Obama administration set up a nudge unit within the White House Office of Information and Regulatory Affairs, and a year later the U.K. government launched its own unit. Harvard’s Cass Sunstein, co-author of the book Nudge, headed the U.S. effort. His co-author, the University of Chicago’s Richard Thaler — who won the 2017 Nobel Prize in Economics — helped develop the U.K.’s Behavioural Insights Team. Nudge units are now humming away in other countries, including Germany and Singapore, as well as at the World Bank, various United Nations agencies and the Organisation for Economic Co-operation and Development (OECD).

Given the interest in the potential for behavioral science to improve public policy outcomes, a team of nine experts, including UCLA Anderson’s Shlomo Benartzi, Sunstein and Thaler, set out to explore the cost-effectiveness of behavioral nudges relative to more traditional forms of government interventions.

In addition to conducting their own experiments, the researchers looked at published research that addressed four areas where public policy initiatives aim to move the needle to improve individuals’ choices: saving for retirement, applying to college, energy conservation and flu vaccinations.

For each topic, they culled studies that focused on both nudge approaches and more traditional mandates such as tax breaks, education and financial incentives, and calculated cost-benefit estimates for both types of studies. Research used in this study was published between 2000 and 2015. All cost estimates were inflation-adjusted…
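The comparison reduces to a simple ratio computed the same way for nudges and for traditional levers: outcome gained per dollar spent. A minimal sketch of that calculation follows, using made-up numbers rather than figures from the study.

```python
from dataclasses import dataclass


@dataclass
class Intervention:
    name: str
    cost_per_person: float    # inflation-adjusted program cost, in dollars
    effect_per_person: float  # e.g. additional retirement saving induced, in dollars


def effectiveness_per_dollar(i: Intervention) -> float:
    """Outcome gained per dollar spent -- higher means more cost-effective."""
    return i.effect_per_person / i.cost_per_person


# Hypothetical illustration (not the paper's data):
nudge = Intervention("automatic-enrollment mailing", cost_per_person=2.0, effect_per_person=100.0)
tax_incentive = Intervention("matching tax incentive", cost_per_person=40.0, effect_per_person=600.0)

for i in (nudge, tax_incentive):
    print(f"{i.name}: ${effectiveness_per_dollar(i):.0f} of extra saving per $1 spent")
# In this made-up example the nudge yields $50 per $1 spent versus $15 for the
# incentive -- the kind of head-to-head comparison the study formalizes.
```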

The study itself should serve as a nudge for governments to consider adding nudging to their policy toolkits, as this approach consistently delivered a high return on investment, relative to traditional mandates and policies….(More)”.

Algorithmic Impact Assessment (AIA) framework


Report by AI Now Institute: “Automated decision systems are currently being used by public agencies, reshaping how criminal justice systems work via risk assessment algorithms and predictive policing, optimizing energy use in critical infrastructure through AI-driven resource allocation, and changing our employment and educational systems through automated evaluation tools and matching algorithms. Researchers, advocates, and policymakers are debating when and where automated decision systems are appropriate, including whether they are appropriate at all in particularly sensitive domains.

Questions are being raised about how to fully assess the short and long term impacts of these systems, whose interests they serve, and if they are sufficiently sophisticated to contend with complex social and historical contexts. These questions are essential, and developing strong answers has been hampered in part by a lack of information and access to the systems under deliberation. Many such systems operate as “black boxes” – opaque software tools working outside the scope of meaningful scrutiny and accountability. This is concerning, since an informed policy debate is impossible without the ability to understand which existing systems are being used, how they are employed, and whether these systems cause unintended consequences. The Algorithmic Impact Assessment (AIA) framework proposed in this report is designed to support affected communities and stakeholders as they seek to assess the claims made about these systems, and to determine where – or if – their use is acceptable….

KEY ELEMENTS OF A PUBLIC AGENCY ALGORITHMIC IMPACT ASSESSMENT

1. Agencies should conduct a self-assessment of existing and proposed automated decision systems, evaluating potential impacts on fairness, justice, bias, or other concerns across affected communities;

2. Agencies should develop meaningful external researcher review processes to discover, measure, or track impacts over time;

3. Agencies should provide notice to the public disclosing their definition of “automated decision system,” existing and proposed systems, and any related self-assessments and researcher review processes before the system has been acquired;

4. Agencies should solicit public comments to clarify concerns and answer outstanding questions; and

5. Governments should provide enhanced due process mechanisms for affected individuals or communities to challenge inadequate assessments or unfair, biased, or otherwise harmful system uses that agencies have failed to mitigate or correct….(More)”.
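One hypothetical way to make these five elements concrete is a structured disclosure record that an agency publishes per system. The field names below are illustrative assumptions, not a schema taken from the report.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class AlgorithmicImpactAssessment:
    """Illustrative record an agency might publish for one automated decision system."""
    system_name: str
    vendor: str
    purpose: str
    affected_communities: List[str]
    # 1. Self-assessment of potential impacts (fairness, justice, bias)
    self_assessment_summary: str
    # 2. External researcher review process
    external_review_process: str
    # 3. Public notice published before the system is acquired?
    notice_published_before_acquisition: bool
    # 4. Public comments received and tracked
    public_comments: List[str] = field(default_factory=list)
    # 5. Due process channel for challenging inadequate assessments or harmful uses
    due_process_contact: str = ""


# Hypothetical example record:
aia = AlgorithmicImpactAssessment(
    system_name="pretrial risk assessment tool",
    vendor="ExampleVendor Inc.",
    purpose="score likelihood of failure to appear",
    affected_communities=["defendants", "their families"],
    self_assessment_summary="disparate error rates found across groups; mitigation plan attached",
    external_review_process="annual audit access granted to accredited researchers",
    notice_published_before_acquisition=True,
    due_process_contact="aia-appeals@agency.example.gov",
)
```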

Can we solve wicked problems?


Paper by Gianluca Elia and Alessandro Margherita describing “A conceptual framework and a collective intelligence system to support problem analysis and solution design for complex social issues…Wicked problems are complex and multifaceted issues that have no single solution, and are perceived by different stakeholders through contrasting views. Examples in the social context include climate change, poverty, energy production, sanitation, sustainable cities, pollution and homeland security.

Extant research has sought to support open discussion and collaborative decision making in wicked scenarios, but complexities arise from the difficulty of leveraging multiple contributions, from both experts and non-experts, through a structured approach.

With this in view, we present a conceptual framework for the study of wicked problem solving as a complex and multi-stakeholder process. Afterwards, we describe an integrated system of tools and associated operational guidelines aimed at supporting collective problem analysis and solution design. The main value of the article is to highlight the relevance of collective approaches in the endeavor of wicked problem resolution, and to provide an integrated framework of activities, actors and purposeful tools….(More)”.

Artificial Intelligence and the Need for Data Fairness in the Global South


Medium blog by Yasodara Cordova: “…The data collected by industry represents AI opportunities for governments to improve their services through innovation. Data-based intelligence promises to increase the efficiency of resource management by improving transparency, logistics, social welfare distribution — and virtually every government service. E-government enthusiasm took off with the realization of the possible applications, such as using AI to fight corruption by automating the fraud-tracking capabilities of cost-control tools. Controversially, the AI enthusiasm has spread to the distribution of social benefits, optimization of tax oversight and control, credit scoring systems, crime prediction systems, and other applications based on personal and sensitive data collection, especially in countries that do not have comprehensive privacy protections.

There are so many potential applications, society may operate very differently in ten years when the “datafixation” has advanced beyond citizen data and into other applications such as energy and natural resource management. However, many countries in the Global South are not being given necessary access to their countries’ own data.

Useful data are everywhere, but only some can take advantage. Beyond smartphones, data can be collected from IoT components in common spaces. Not restricted to urban spaces, data collection includes rural technology like sensors installed in tractors. However, even when the information relates to issues of public importance in developing countries — like data taken from road networks or vital resources like water and land — it stays hidden under contract rules, and public citizens cannot access, and therefore benefit from, it. This arrangement keeps the public uninformed about their country’s operations. The data collection and distribution frameworks are not built towards healthy partnerships between industry and government, preventing countries from realizing the potential outlined in the previous paragraph.

The data necessary to the development of better cities, public policies, and common interest cannot be leveraged if kept in closed silos, yet access often costs more than is justifiable. Data are a primordial resource to all stages of new technology, especially tech adoption and integration, so the necessary long term investment in innovation needs a common ground to start with. The mismatch between the pace of the data collection among big established companies and small, new, and local businesses will likely increase with time, assuming no regulation is introduced for equal access to collected data….

Currently, data independence remains restricted to discussions on the technological infrastructure that supports data extraction. Privacy discussions focus on personal data rather than the digital accumulation of strategic data in closed silos — a necessary discussion not yet addressed. The national interest of data is not being addressed in a framework of economic and social fairness. Access to data, from a policy-making standpoint, needs to find a balance between the extremes of public, open access and limited, commercial use.

A final, but important note: the vast majority of social media act like silos. APIs play an important role in corporate business models, where industry controls the data it collects without reward, let alone user transparency. Negotiation of the specification of APIs to make data a common resource should be considered, for such an effort may align with the citizens’ interest….(More)”.