Toward an Open Data Demand Assessment and Segmentation Methodology


Stefaan Verhulst and Andrew Young at IADB: “Across the world, significant time and resources are being invested in making government data accessible to all with the broad goal of improving people’s lives. Evidence of open data’s impact – on improving governance, empowering citizens, creating economic opportunity, and solving public problems – is emerging and is largely encouraging. Yet much of the potential value of open data remains untapped, in part because we often do not understand who is using open data or, more importantly, who is not using open data but could benefit from the insights it may generate. By identifying, prioritizing, segmenting, and engaging with the actual and future demand for open data in a systemic and systematic way, practitioners can ensure that open data is more targeted. Understanding and meeting the demand for open data can increase overall impact and return on investment of public funds.

The GovLab, in partnership with the Inter-American Development Bank, and with the support of the French Development Agency, developed the Open Data Demand and Assessment Methodology to provide open data policymakers and practitioners with an approach for identifying, segmenting, and engaging with demand. This process specifically seeks to empower data champions within public agencies who want to enhance their data’s ability to improve people’s lives….(More)”.

Saying yes to State Longitudinal Data Systems: building and maintaining cross-agency relationships


Report by the National Skills Coalition: “In order to provide actionable information to stakeholders, state longitudinal data systems use administrative data that state agencies collect through administering programs. Thus, state longitudinal data systems must maintain strong working relationships with the state agencies collecting necessary administrative data. These state agencies can include K-12 and higher education agencies, workforce agencies, and those administering social service programs such as the Supplemental Nutrition Assistance Program or Temporary Assistance for Needy Families.

When state longitudinal data systems have strong relationships with agencies, agencies willingly and promptly share their data with the system, engage with data governance when needed, approve research requests in a timely manner, and continue to cooperate with the system over the long term. If state agencies do not participate in their state’s longitudinal data system, the work of the system is put in jeopardy. States may find that research and performance reporting stall or stop outright.

Kentucky and Virginia have been able to build and maintain support for their systems among state agencies. Their examples demonstrate how states can effectively utilize their state longitudinal data systems….(More)”.

Mapping the challenges and opportunities of artificial intelligence for the conduct of diplomacy


DiploFoundation: “This report provides an overview of the evolution of diplomacy in the context of artificial intelligence (AI). AI has emerged as a prominent topic on the international agenda, impacting numerous aspects of our political, social, and economic lives. It is clear that AI will remain a permanent feature of international debates and will continue to shape societies and international relations.

It is impossible to ignore the challenges – and opportunities – AI is bringing to the diplomatic realm. Its relevance as a topic for diplomats and others working in international relations will only increase….(More)”.

New Urban Centres Database sets new standards for information on cities at global scale


EU Science Hub: “Data analysis highlights very diverse development patterns and inequalities across cities and world regions.

Building on the Global Human Settlement Layer (GHSL), the new database provides more detailed information on the cities’ location and size as well as characteristics such as greenness, night-time light emission, population size, built-up areas exposed to natural hazards, and travel time to the capital city.

For several of these attributes, the database contains information recorded over time, dating as far back as 1975. 

Responding to a lack of consistent data, or data only limited to large cities, the Urban Centre Database now makes it possible to map, classify and count all human settlements in the world in a standardised way.
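To make the kind of record this implies concrete, here is a minimal sketch in Python of what a single city entry could look like, using the attributes listed above. The field names and values are illustrative assumptions, not the GHSL’s actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of one Urban Centre Database record. Field names are
# illustrative only; they do not reproduce the GHSL's actual schema.
@dataclass
class UrbanCentreRecord:
    name: str
    lat: float                          # centroid latitude
    lon: float                          # centroid longitude
    population: dict                    # population estimate per epoch
    built_up_km2: dict                  # built-up area (km^2) per epoch
    greenness_index: float              # average vegetation cover
    night_light_emission: float         # average night-time radiance
    hazard_exposed_built_up_km2: float  # built-up area exposed to natural hazards
    travel_time_to_capital_min: float   # travel time to the capital city

# Invented example illustrating the time dimension back to 1975: population
# here grows faster than built-up area, the pattern the JRC associates with
# lower-income countries (see the quotation below).
example = UrbanCentreRecord(
    name="Example City", lat=0.0, lon=0.0,
    population={1975: 500_000, 1990: 900_000, 2015: 2_000_000},
    built_up_km2={1975: 60.0, 1990: 75.0, 2015: 110.0},
    greenness_index=0.35, night_light_emission=12.4,
    hazard_exposed_built_up_km2=8.2, travel_time_to_capital_min=95.0,
)
```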

An analysis of the data reveals very different development patterns in different parts of the world.

“The data shows that in the low-income countries, high population growth has resulted in only moderate increases in the built-up areas, while in the high-income countries, moderate population growth has resulted in very large increases in the built-up areas. In practice, cities have grown more in size in richer countries than in poorer countries, where the populations are growing faster”, said JRC researcher Thomas Kemper.

According to JRC scientists, around 75% of the global population now live in cities, towns or suburbs….

The Urban Centre Database provides new open data supporting the monitoring of the UN Sustainable Development Goals, the UN’s New Urban Agenda and the Sendai Framework for Disaster Risk Reduction.

The main findings based on the Urban Centre Database are summarised in a new edition of the Atlas of the Human Planet, published together with the database….(More)”.

Survey: Majority of Americans Willing to Share Their Most Sensitive Personal Data


Center for Data Innovation: “Most Americans (58 percent) are willing to allow third parties to collect at least some sensitive personal data, according to a new survey from the Center for Data Innovation.

While many surveys measure public opinions on privacy, few ask consumers about their willingness to make tradeoffs, such as sharing certain personal information in exchange for services or benefits they want. In this survey, the Center asked respondents whether they would allow a mobile app to collect their biometrics or location data for purposes such as making it easier to sign into an account or getting free navigational help, and it asked whether they would allow medical researchers to collect sensitive data about their health if it would lead to medical cures for their families or others. Only one-third of respondents (33 percent) were unwilling to let mobile apps collect either their biometrics or location data under any of the described scenarios. And overall, nearly 6 in 10 respondents (58 percent) were willing to let a third party collect at least one piece of sensitive personal data, such as biometric, location, or medical data, in exchange for a service or benefit….(More)”.

How Data Sharing Can Improve Frontline Worker Development


Digital Promise: “Frontline workers, or the workers who interact directly with customers and provide services in industries like retail, healthcare, food service, and hospitality, help make up the backbone of today’s workforce.

However, frontline workforce talent development presents numerous challenges. Frontline workers may not be receiving the education and training they need to advance in their careers and sustain gainful employment. They also likely do not have access to data regarding their own skills and learning, and do not know what skills employers seek in quality workers.

Today, Digital Promise, a nonprofit authorized by Congress to support comprehensive research and development of programs to advance innovation in education, launched “Tapping Data for Frontline Talent Development,” a new, interactive report that shows how the seamless and secure sharing of data is key to creating more effective learning and career pathways for frontline service workers.

The research revealed that the current learning ecosystem that serves frontline workers—which includes stakeholders like education and training providers, funders, and employers—is complex and siloed, and removes agency from the worker.

Although many types of data are collected, in today’s system much of the data is duplicative and rarely used to assess impact and long-term outcomes. The processes and systems in the ecosystem do not support the flow of data between stakeholders or to frontline workers.

And yet, data sharing systems and collaborations are beginning to emerge as providers, funders, and employers recognize the power in data-driven decision-making and the benefits to data sharing. Not only can data sharing help to improve programs and services, it can create more personalized interventions for education providers supporting frontline workers, and it can also improve talent pipelines for employers.

In addition to providing three case studies with valuable examples of employers, a community, and a state focused on driving change based on data, this new report identifies key recommendations that have the potential to move the current system toward a more data-driven, collaborative, worker-centered learning ecosystem, including:

  1. Creating awareness and demand among stakeholders
  2. Ensuring equity and inclusion for workers/learners through access and awareness
  3. Creating data sharing resources
  4. Advocating for data standards
  5. Advocating for policies and incentives
  6. Spurring the creation of technology systems that enable data sharing/interoperability (a sketch of what such a shared record might look like follows this list)
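To illustrate what recommendations 4 and 6 could enable in practice, below is a minimal, hypothetical sketch of a standardized learner record that providers, funders, and employers could exchange. The schema is invented for illustration; it is not drawn from the Digital Promise report.

```python
import json

# Hypothetical shared learner record of the kind a common data standard
# (recommendation 4) and interoperable systems (recommendation 6) would
# allow stakeholders to exchange. The schema is illustrative only.
learner_record = {
    "worker_id": "anon-123",  # de-identified, keeping agency with the worker
    "consent": {"share_with_employers": True, "share_with_providers": True},
    "credentials": [
        {"name": "Food Safety Certificate",
         "issuer": "Training Provider A", "year": 2018},
    ],
    "skills": ["customer service", "inventory management"],
}

# A training provider can export this record and an employer's system can
# import it, avoiding the duplicative re-collection described above.
print(json.dumps(learner_record, indent=2))
```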

We invite you to read our new report today for more information, and sign up for updates on this important work….(More)”

Whatever happened to evidence-based policy making?


Speech by Professor Gary Banks: “One of the challenges in talking about EBPM (evidence-based policy making), which I had not fully appreciated last time, was that it means different things to different people, especially academics. As a result, disagreements, misunderstandings and controversies (or faux controversies) have abounded. And these may have contributed to the demise of the expression, if not the concept.

For example, some have interpreted the term EBPM so literally as to insist that the word “based” be replaced by “influenced”, arguing that policy decisions are rarely based on evidence alone. That of course is true, but few using the term (myself included) would have thought otherwise. And I am sure no-one in an audience such as this, especially in our nation’s capital, believes policy decisions could derive solely from evidence — or even rational analysis!

If you’ll pardon a quotation from my earlier address: “Values, interests, personalities, timing, circumstance and happenstance – in short, democracy – determine what actually happens” (EBPM: What is it? How do we get it?). Indeed, it is precisely because of such multiple influences that “evidence” has a potentially significant role to play.

So, adopting the position from Alice in Wonderland, I am inclined to stick with the term EBPM, which I choose to mean an approach to policy-making that makes systematic provision for evidence and analysis. Far from the deterministic straw man depicted in certain academic articles, it is an approach that seeks to achieve policy decisions that are better informed in a substantive sense, accepting that they will nevertheless ultimately be – and in a democracy need to be – political in nature.

A second and more significant area of debate concerns the meaning and value of “evidence” itself. There are a number of strands involved.

Evidentiary elitism?

One relates to methodology, and can be likened to the differences between the thresholds for a finding of guilt under civil and criminal law (“balance of probabilities” versus “beyond reasonable doubt”).

Some analysts have argued that, to be useful for policy, evidence must involve rigorous, unbiased research techniques, the “gold standard” for which are “randomized controlled trials”. The “randomistas”, to use the term that headlines Andrew Leigh’s new book (Leigh, 2018), claim that only such a methodology is able to truly tell us “what works”.

However, adopting this exacting standard from the medical research world would leave policy makers with an excellent tool of limited application. Its forte is testing a specific policy or program relative to business as usual, akin to drug tests involving a placebo for a control group. And there are some inspiring examples of insights gained. But for many areas of public policy the technique is not practicable. Even where it is, it requires that a case has to some extent already been made. And while it can identify the extent to which a particular program “works”, it is less useful for understanding why, or whether something else might work even better.
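To make the logic of the technique concrete, here is a minimal sketch of a randomized trial comparison: random assignment to the program or to business as usual, then a difference in mean outcomes. All numbers are invented for illustration.

```python
import random
import statistics

random.seed(42)

def outcome(treated: bool) -> float:
    # Hypothetical outcomes: the treated mean is set slightly higher,
    # standing in for a genuine program effect of about two points.
    return random.gauss(52.0 if treated else 50.0, 10.0)

participants = list(range(1000))
random.shuffle(participants)  # the randomization step
treatment, control = participants[:500], participants[500:]

treated_outcomes = [outcome(True) for _ in treatment]
control_outcomes = [outcome(False) for _ in control]

effect = statistics.mean(treated_outcomes) - statistics.mean(control_outcomes)
print(f"Estimated program effect: {effect:+.2f}")
# As the speech notes, this tells us whether the program moved the outcome
# relative to business as usual, not why, nor whether an untested
# alternative would have worked better.
```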

That is not to say that any evidence will do. Setting the quality bar too low is the bigger problem in practice, and the notion of a hierarchy of methodologies is helpful. However, no such analytical tools are self-sufficient for policy-making purposes and, in my view, are best thought of as components of a “cost-benefit framework” – one that enables comparisons of different options, employing those estimation techniques that are most fit for purpose. Though challenging to populate fully with monetized data, CBA provides a coherent conceptual basis for assessing the net social impacts of different policy choices – which is what EBPM must aspire to as its contribution to (political) policy decisions….(More)”.
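As a rough illustration of the cost-benefit framework the speech favours, the sketch below ranks hypothetical policy options by the discounted sum of their monetized benefits minus costs. The figures and discount rate are invented for illustration.

```python
def net_present_value(benefits, costs, discount_rate=0.05):
    """Discounted sum of annual (benefit - cost) flows."""
    return sum((b - c) / (1 + discount_rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

options = {
    # five years of annual monetized benefits and costs (hypothetical)
    "Option A (status quo)":    ([0, 0, 0, 0, 0],       [0, 0, 0, 0, 0]),
    "Option B (pilot program)": ([0, 40, 60, 80, 90],   [100, 20, 20, 20, 20]),
    "Option C (full rollout)":  ([0, 30, 90, 140, 160], [250, 40, 40, 40, 40]),
}

for name, (benefits, costs) in options.items():
    print(f"{name}: NPV = {net_present_value(benefits, costs):.1f}")
```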

Blockchain’s Occam problem


Report by Matt Higginson, Marie-Claude Nadeau, and Kausik Rajgopal: “Blockchain has yet to become the game-changer some expected. A key to finding the value is to apply the technology only when it is the simplest solution available.

Over recent years, blockchain has been extolled as a revolution in business technology. In the nine years since its launch, companies, regulators, and financial technologists have spent countless hours exploring its potential. The resulting innovations have started to reshape business processes, particularly in accounting and transactions.

Amid intense experimentation, industries from financial services to healthcare and the arts have identified more than 100 blockchain use cases. These range from new land registries to know-your-customer (KYC) applications and smart contracts that enable actions from product processing to share trading. The most impressive results have seen blockchains used to store information, cut out intermediaries, and enable greater coordination between companies, for example in relation to data standards….
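As a rough illustration of the “store information” use case, the sketch below shows the hash-chaining at the core of a blockchain ledger: each block commits to the hash of its predecessor, so tampering with any stored record is detectable. It is a toy model under simplifying assumptions; real systems add consensus, signatures, and distribution across parties.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash a block's canonical JSON form.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: dict) -> None:
    # Each new block stores the hash of the previous one, linking the chain.
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": previous, "record": record})

chain = []
append_block(chain, {"parcel": "lot-17", "owner": "A"})  # e.g. a land-registry entry
append_block(chain, {"parcel": "lot-17", "owner": "B"})  # a later transfer

# Verification: recompute each link; editing any earlier block fails here.
for i in range(1, len(chain)):
    assert chain[i]["prev_hash"] == block_hash(chain[i - 1]), "ledger tampered with"
print("ledger verified")
```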

There is a clear sense that blockchain is a potential game-changer. However, there are also emerging doubts. A particular concern, given the amount of money and time spent, is that little of substance has been achieved. Of the many use cases, a large number are still at the idea stage, while others are in development but with no output. The bottom line is that despite billions of dollars of investment, and nearly as many headlines, evidence for a practical, scalable use of blockchain is thin on the ground.

Infant technology

From an economic theory perspective, the stuttering blockchain development path is not entirely surprising. It is an infant technology that is relatively unstable, expensive, and complex. It is also unregulated and selectively distrusted. Classic lifecycle theory suggests the evolution of any industry or product can be divided into four stages: pioneering, growth, maturity, and decline (exhibit). Stage 1 is when the industry is getting started, or a particular product is brought to market. This is ahead of proven demand and often before the technology has been fully tested. Sales tend to be low and return on investment is negative. Stage 2 is when demand begins to accelerate, the market expands and the industry or product “takes off.”

Exhibit: Blockchain is struggling to emerge from the pioneering stage.

Across its many applications, blockchain arguably remains stuck at stage 1 in the lifecycle (with a few exceptions). The vast majority of proofs of concept (POCs) are in pioneering mode (or being wound up) and many projects have failed to get to Series C funding rounds.

One reason for the lack of progress is the emergence of competing technologies. In payments, for example, it makes sense that a shared ledger could replace the current highly intermediated system. However, blockchains are not the only game in town. Numerous fintechs are disrupting the value chain. Of nearly $12 billion invested in US fintechs last year, 60 percent was focused on payments and lending. SWIFT’s global payments innovation initiative (GPI), meanwhile, is addressing initial pain points through higher transaction speeds and increased transparency, building on bank collaboration….(More)” (See also: Blockchange)

A Study of the Implications of Advanced Digital Technologies (Including AI Systems) for the Concept of Responsibility Within a Human Rights Framework


Report by Karen Yeung: “This study was commissioned by the Council of Europe’s Committee of experts on human rights dimensions of automated data processing and different forms of artificial intelligence (MSI-AUT). It was prompted by concerns about the potential adverse consequences of advanced digital technologies (including artificial intelligence (‘AI’)), particularly their impact on the enjoyment of human rights and fundamental freedoms. This draft report seeks to examine the implications of these technologies for the concept of responsibility, and this includes investigating where responsibility should lie for their adverse consequences. In so doing, it seeks to understand (a) how human rights and fundamental freedoms protected under the ECHR may be adversely affected by the development of AI technologies and (b) how responsibility for those risks and consequences should be allocated. 

Its methodological approach is interdisciplinary, drawing on concepts and academic scholarship from the humanities, the social sciences and, to a more limited extent, from computer science. It concludes that, if we are to take human rights seriously in a hyperconnected digital age, we cannot allow the power of our advanced digital technologies and systems, and those who develop and implement them, to be accrued and exercised without responsibility. Nations committed to protecting human rights must therefore ensure that those who wield and derive benefits from developing and deploying these technologies are held responsible for their risks and consequences. This includes obligations to ensure that there are effective and legitimate mechanisms that will operate to prevent and forestall violations of human rights that these technologies may threaten, and to attend to the health of the larger collective and shared socio-technical environment in which human rights and the rule of law are anchored….(More)”.

Data Policy in the Fourth Industrial Revolution: Insights on personal data


Report by the World Economic Forum: “Development of comprehensive data policy necessarily involves trade-offs. Cross-border data flows are crucial to the digital economy. The use of data is critical to innovation and technology. However, to engender trust, we need to have appropriate levels of protection in place to ensure privacy, security and safety. Over 120 laws in effect across the globe today provide differing levels of protection for data, but few anticipated…

Data Policy in the Fourth Industrial Revolution: Insights on personal data, a paper by the World Economic Forum in collaboration with the Ministry of Cabinet Affairs and the Future, United Arab Emirates, examines the relationship between risk and benefit, recognizing the impact of culture, values and social norms. This work is a start toward developing a comprehensive data policy toolkit and knowledge repository of case studies for policy makers and data policy leaders globally….(More)”.