Disinformation Rated As Significant a Problem As Gun Violence and Terrorism

Report by the Institute for Public Relations: “Sixty-three percent of Americans view disinformation—deliberately biased and misleading information—as a ‘major’ problem in society, on par with gun violence (63%) and terrorism (66%), according to the 2019 Institute for Public Relations Disinformation in Society Report.

The 2019 IPR Disinformation in Society Report surveyed 2,200 adults to determine the prevalence of disinformation, who is responsible for sharing disinformation, the level of trust in different information sources, and the parties responsible for combatting disinformation.

“One surprising finding was how significant of a problem both Republicans and Democrats rated disinformation,” said Dr. Tina McCorkindale, APR, President and CEO of the Institute for Public Relations. “Unfortunately, only a few organizations outside of the media literacy and news space devote resources to help fix it, including many of the perceived culprits responsible for spreading disinformation.”

More than half (51%) of the respondents said they encounter disinformation at least once a day, while 78% said they see it at least once a week. Four in five adults (80%) said they are confident in their ability to recognize false news and information. Additionally, nearly half of Americans (47%) said they “often” or “always” go to other sources to see if news and information are accurate….(More)”.

Transparency in the EU Institutions – An Overview

Paper by Gianluca Sgueo: “The concept of transparency in the public sector has existed, in various forms, for centuries. Academics, however, agree that transparency should be qualified as a modern concept. The wave of government reforms that occurred in the 1950s and the 1960s fostered the culture of transparent, accessible and accountable bureaucracies. In the 1990s, following the spread of digital technologies, terms like “Government 2.0” and “open government” were coined to describe the use that public administrations made of the Internet and other digital tools in order to foster civic engagement, improve transparency, and enhance the efficiency of government services.

Transparency has come to the fore again over the past few years. National and supranational regulators (including the European Union) have placed transparency among the priorities in their regulatory agendas. Enhanced transparency in decision-making is considered to be a solution to the decline of trust in the public sector, a limit to the negative impact of conspiracy theories and fake news, and also a way to revitalise civic engagement.

EU institutions promote transparency across different lines of action. Examples include the ongoing debates on reforming the legislative procedure of the Union, regulating lobbying activities, making data available in open format and digitalising services. Also relevant is the role of the European Ombudsman in promoting a culture of transparency at the EU level.

Studies suggest that transparency and participation in public governance are having a positive impact on the accountability of EU institutions, and hence on citizens’ perceptions of their activities. The present briefing offers an overview of the actions that EU institutions are implementing to foster transparency, analysing the potential benefits and briefly discussing the possible drawbacks…(More)”.

Smarter Select Committees

Theo Bass at Nesta: “This report outlines how digital tools and methods can help select committees restore public trust in democracy, reinvigorate public engagement in Parliament and enhance the work of committees themselves.

Since their establishment in 1979, select committees have provided one of our most important democratic functions. At their best, committees gather available evidence, data and insight; tap into public experiences and concerns; provide a space for thoughtful deliberation; and help parliament make better decisions. However, the 40th anniversary of select committees presents an important opportunity to re-examine this vital parliamentary system to ensure they are fit for the 21st century.

Since 2012 select committees have committed to public engagement as a ‘core task’ of their work, but their approach has not been systematic and they still struggle to reach beyond the usual suspects, or find ways to gather relevant knowledge quickly and effectively. With public trust in democracy deteriorating, the imperative to innovate, improve legitimacy and find new ways to involve people in national politics is stronger than ever. This is where digital innovation can help.

If used effectively, digital tools and methods offer select committees the opportunity to be more transparent and accessible to a wider range of people, improving relevance and impact. Like any good public engagement, this needs careful design, without which digital participation risks being distorting and unhelpful, amplifying the loudest or least informed voices.

To achieve success, stronger ambition and commitment by senior staff and MPs, as well as experimentation and learning through trial and improvement will be essential. We recommend that the UK Parliament commits to running at least five pilots for digital participation, which we outline in more detail in the final section of this report….(More)”.

Crowdsourcing and Crisis Mapping in Complex Emergencies

Guidance paper by Andrew Skuse: “…examines the use of crowdsourcing and crisis mapping during complex emergencies. Crowdsourcing is a process facilitated by new information and communication technologies (ICTs), social media platforms and dedicated software programs. It literally seeks the help of ‘the crowd’, volunteers or the general public, to complete a series of specific tasks such as data collection, reporting, document contribution and so on. Crowdsourcing is important in emergency situations because it allows for a critical link to be forged between those affected by an emergency and those who are responding to it. Crowdsourcing is often used by news organisations to gather information, i.e. citizen journalism, as well as by organisations concerned with emergencies and humanitarian aid, i.e. International Committee of the Red Cross, the Standby Task Force and CrisisCommons. Here, crowdsourced data on voting practices and electoral violence, as well as the witnessing of human rights contraventions are helping to improve accountability and transparency in fragile or conflict-prone states. Equally, crowdsourcing facilitates the sharing of individual and collective experiences, the gathering of specialized knowledge, the undertaking of collective mapping tasks and the engagement of the public through ‘call-outs’ for information…(More)”.

Open Data Retrospective

Laura Bacon at Luminate: “Our global philanthropic organisation – previously the Government & Citizen Engagement (GCE) initiative at Omidyar Network, now Luminate – has been active in the open data space for over a decade. In that time, we have invested more than $50m in organisations and platforms that are working to advance open data’s potential, including Open Data Institute, IMCO, Open Knowledge, ITS Rio, Sunlight, GovLab, Web Foundation, Open Data Charter, and Open Government Partnership.

Ahead of our transition from GCE to Luminate last year, we wanted to take a step back and assess the field in order to cultivate a richer understanding of the evolution of open data—including its critical developments, drivers of change, and influential actors. This research would help inform our own strategy and provide valuable insight that we can share with the broader open data ecosystem.

First, what is open data? Open data is data that can be freely used, shared, and built-upon by anyone, anywhere, for any purpose. At its best, open government data can empower citizens, improve governments, create opportunities, and help solve public problems. Have you used a transport app to find out when the next bus will arrive? Or a weather app to look up a forecast? When using a real estate website to buy or rent a home, have you also reviewed its proximity to health, education, and recreational facilities or checked out neighborhood crime rates? If so, your life has been impacted by open data. 

The Open Data Retrospective

We commissioned Dalberg, a global strategic advisory firm, to conduct an Open Data Retrospective to explore: ‘how and why did the open data field evolve globally over the past decade?’ as well as ‘where is the field today?’ With the concurrent release of the report “The State of Open Data” – led by IDRC and Open Data for Development initiative – we thought this would be a great time to make public the report we’d commissioned. 

You can see Dalberg’s open data report here, and its affiliated data here. Please note, this presentation is a modification of the report. Several sections and slides have been removed for brevity and/or confidentiality. Therefore, some details about particular organisations and strategies are not included in this deck.

Evolution and impact

Dalberg’s report covers the trajectory of the open data field and characterised it as: inception (pre-2008), systematisation (2009-2010), expansion (2011-2015), and reevaluation (2016-2018). This characterisation varies by region and sector, but generally captures the evolution of the open data movement….(More)”.

The Age of Digital Interdependence

Report of the High-level Panel on Digital Cooperation: “The immense power and value of data in the modern economy can and must be harnessed to meet the SDGs, but this will require new models of collaboration. The Panel discussed potential pooling of data in areas such as health, agriculture and the environment to enable scientists and thought leaders to use data and artificial intelligence to better understand issues and find new ways to make progress on the SDGs. Such data commons would require criteria for establishing relevance to the SDGs, standards for interoperability, rules on access and safeguards to ensure privacy and security.

Anonymised data – information that is rendered anonymous in such a way that the data subject is not or no longer identifiable – about progress toward the SDGs is generally less sensitive and controversial than the use of personal data of the kind companies such as Facebook, Twitter or Google may collect to drive their business models, or facial and gait data that could be used for surveillance. However, personal data can also serve development goals, if handled with proper oversight to ensure its security and privacy.

For example, individual health data is extremely sensitive – but many people’s health data, taken together, can allow researchers to map disease outbreaks, compare the effectiveness of treatments and improve understanding of conditions. Aggregated data from individual patient cases was crucial to containing the Ebola outbreak in West Africa. Private and public sector healthcare providers around the world are now using various forms of electronic medical records. These help individual patients by making it easier to personalise health services, but the public health benefits require these records to be interoperable.

There is scope to launch collaborative projects to test the interoperability of data, standards and safeguards across the globe. The World Health Assembly’s consideration of a global strategy for digital health in 2020 presents an opportunity to launch such projects, which could initially be aimed at global health challenges such as Alzheimer’s and hypertension.

Improved digital cooperation on a data-driven approach to public health has the potential to lower costs, build new partnerships among hospitals, technology companies, insurance providers and research institutes and support the shift from treating diseases to improving wellness. Appropriate safeguards are needed to ensure the focus remains on improving health care outcomes. With testing, experience and necessary protective measures as well as guidelines for the responsible use of data, similar cooperation could emerge in many other fields related to the SDGs, from education to urban planning to agriculture…(More)”.

100 Radical Innovation Breakthroughs for the future

The Radical Innovation Breakthrough Inquirer for the European Commission: “This report provides insights on 100 emerging developments that may exert a strong impact on global value creation and offer important solutions to societal needs. We identified this set of emerging developments through a carefully designed procedure that combined machine learning algorithms and human evaluation. After successive waves of selection and refinement, the resulting 100 emerging topics were subjected to several assessment procedures, including expert consultation and analysis of related patents and publications.

Having analysed the potential importance of each of these innovations for Europe, their current maturity and the relative strength of Europe in related R&D, we can make some general policy recommendations that follow.

However, it is important to note that our recommendations are based on the extremes of the distributions, and thus not all RIBs are named under the recommendations. Yet, the totality of the set of Radical Innovation Breakthrough (RIB) and Radical Societal Breakthrough (RSB) descriptions and their recent progress directions constitute an important collection of intelligence material that can inform strategic planning in research and innovation policy, industry and enterprise policy, and local development policy….(More)”.

The New York Times has a course to teach its reporters data skills, and now they’ve open-sourced it

Joshua Benton at Nieman Labs: “The New York Times wants more of its journalists to have those basic data skills, and now it’s releasing the curriculum they’ve built in-house out into the world, where it can be of use to reporters, newsrooms, and lots of other people too.

Here’s Lindsey Rogers Cook, an editor for digital storytelling and training at the Times, and the sort of person who is willing to have “spreadsheets make my heart sing” appear under her byline:

Even with some of the best data and graphics journalists in the business, we identified a challenge: data knowledge wasn’t spread widely among desks in our newsroom and wasn’t filtering into news desks’ daily reporting.

Yet fluency with numbers and data has become more important than ever. While journalists once were fond of joking that they got into the field because of an aversion to math, numbers now form the foundation for beats as wide-ranging as education, the stock market, the Census, and criminal justice. More data is released than ever before — there are nearly 250,000 datasets on data.gov alone — and increasingly, government, politicians, and companies try to twist those numbers to back their own agendas…

We wanted to help our reporters better understand the numbers they get from sources and government, and give them the tools to analyze those numbers. We wanted to increase collaboration between traditional and non-traditional journalists…And with more competition than ever, we wanted to empower our reporters to find stories lurking in the hundreds of thousands of databases maintained by governments, academics, and think tanks. We wanted to give our reporters the tools and support necessary to incorporate data into their everyday beat reporting, not just in big and ambitious projects.

….You can access the Times’ training materials here. Some of what you’ll find:

  • An outline of the data skills the course aims to teach. It’s all run on Google Docs and Google Sheets; class starts with the uber-basics (mean! median! sum!), crosses the bridge of pivot tables, and then heads into data cleaning and more advanced formulas.
  • The full day-by-day outline of the Times’ three-week course, which of course you’re free to use or reshape to your newsroom’s needs.
  • It’s not just about cells, columns, and rows — the course also includes more journalism-based information around ethical questions, how to use data effectively inside a story’s narrative, and how best to work with colleagues in the graphic department.
  • Cheat sheets! If you don’t have time to dig too deeply, they’ll give a quick hit of information: one, two, three, four, five.
  • Data sets that you use to work through the beginner, intermediate, and advanced stages of the training, including such journalism classics as census data, campaign finance data, and BLS data. But don’t be a dummy and try to write real news stories off these spreadsheets; the Times cautions in bold: “NOTE: We have altered many of these datasets for instructional purposes, so please download the data from the original source if you want to use it in your reporting.”
  • “How Not To Be Wrong,” which seems like a useful thing….(More)”
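The course itself runs on Google Sheets, but the starter concepts it covers translate directly to code. As a rough illustration only — the incident figures below are invented, not drawn from the Times’ materials — here is how the “uber-basics” and a simple pivot table might look in Python:

```python
from collections import defaultdict
from statistics import mean, median

# Hypothetical incident counts, standing in for the kind of
# small dataset a reporter might receive from an agency.
rows = [
    {"borough": "Queens",   "year": 2018, "incidents": 120},
    {"borough": "Queens",   "year": 2019, "incidents": 95},
    {"borough": "Bronx",    "year": 2018, "incidents": 80},
    {"borough": "Bronx",    "year": 2019, "incidents": 110},
    {"borough": "Brooklyn", "year": 2019, "incidents": 60},
]
counts = [r["incidents"] for r in rows]

# The uber-basics the course starts with: sum, mean, median
total = sum(counts)    # 465
avg = mean(counts)     # 93
mid = median(counts)   # 95

# A pivot table: total incidents by borough and year
pivot = defaultdict(dict)
for r in rows:
    borough, year = r["borough"], r["year"]
    pivot[borough][year] = pivot[borough].get(year, 0) + r["incidents"]

print(total, avg, mid)
print(dict(pivot))
```

In a newsroom you would more likely reach for a spreadsheet or pandas, but the operations — aggregating, averaging, and cross-tabulating — are the same ones the curriculum builds toward.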

Return on Data

Paper by Noam Kolt: “Consumers routinely supply personal data to technology companies in exchange for services. Yet, the relationship between the utility (U) consumers gain and the data (D) they supply — “return on data” (ROD) — remains largely unexplored. Expressed as a ratio, ROD = U / D. While lawmakers strongly advocate protecting consumer privacy, they tend to overlook ROD. Are the benefits of the services enjoyed by consumers, such as social networking and predictive search, commensurate with the value of the data extracted from them? How can consumers compare competing data-for-services deals?

Currently, the legal frameworks regulating these transactions, including privacy law, aim primarily to protect personal data. They treat data protection as a standalone issue, distinct from the benefits which consumers receive. This article suggests that privacy concerns should not be viewed in isolation, but as part of ROD. Just as companies can quantify return on investment (ROI) to optimize investment decisions, consumers should be able to assess ROD in order to better spend and invest personal data. Making data-for-services transactions more transparent will enable consumers to evaluate the merits of these deals, negotiate their terms and make more informed decisions. Pivoting from the privacy paradigm to ROD will both incentivize data-driven service providers to offer consumers higher ROD, as well as create opportunities for new market entrants….(More)”.
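The paper’s central ratio, ROD = U / D, can be sketched in a few lines. The figures below are purely illustrative assumptions — in practice, quantifying consumer utility and the value of supplied data is exactly the hard problem the paper raises:

```python
def return_on_data(utility: float, data_value: float) -> float:
    """Compute ROD = U / D for a single data-for-services deal."""
    if data_value <= 0:
        raise ValueError("data value must be positive")
    return utility / data_value

# Comparing two hypothetical deals: dollars of consumer utility
# gained per dollar of personal-data value supplied.
deal_a = return_on_data(utility=30.0, data_value=10.0)  # ROD = 3.0
deal_b = return_on_data(utility=40.0, data_value=20.0)  # ROD = 2.0

# Deal B delivers more raw utility, but deal A offers the
# higher return on the data the consumer hands over.
assert deal_a > deal_b
```

The point of the comparison mirrors the paper’s argument: looking only at utility (40 vs. 30) picks deal B, while the ratio view favors deal A.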

Federal Data Strategy: Use Cases

US Federal Data Strategy: “For the purposes of the Federal Data Strategy, a “Use Case” is a data practice or method that leverages data to support an articulable federal agency mission or public interest outcome. The Federal Data Strategy sought use cases from the public that solve problems or demonstrate solutions that can help inform the four strategy areas: Enterprise Data Governance; Use, Access, and Augmentation; Decision-making and Accountability; and Commercialization, Innovation, and Public Use. The Federal Data Strategy team was in part informed by these submissions, which are posted below…..(More)”.