The State of Open Data Portals in Latin America


Michael Steinberg at Center for Data Innovation: “Many Latin American countries publish open data—government data made freely available online in machine-readable formats and without license restrictions. However, there is a tremendous amount of variation in the quantity and type of datasets governments publish on national open data portals—central online repositories for open data that make it easier for users to find data. Despite the wide variation among the countries, the most popular datasets tend to be those that either provide transparency into government operations or offer information that citizens can use directly. As governments continue to update and improve their open data portals, they should take steps to ensure that they are publishing the datasets most valuable to their citizens.

To better understand this variation, we collected information about open data portals in 20 Latin American countries including Argentina, Bolivia, Brazil, Chile, Colombia, Costa Rica, Ecuador, Mexico, Panama, Paraguay, Peru, and Uruguay. Not all Latin American countries have an open data portal, but even if they do not operate a unified portal, some governments may still have open data. Four Latin American countries—Belize, Guatemala, Honduras, and Nicaragua—do not have open data portals. One country—El Salvador—does not have a government-run open data portal, but does have a national open data portal (datoselsalvador.org) run by volunteers….

There are many steps Latin American governments can take to improve open data in their country. Those nations without open data portals should create them, and those who already have them should continue to update them and publish more datasets to better serve their constituents. One way to do this is to monitor the popular datasets on other countries’ open data portals, and where applicable, ensure the government produces similar datasets. Those running open data portals should also routinely monitor search queries to see what users are looking for, and if they are looking for datasets that have not yet been posted, work with the relevant government agencies to make these datasets available.
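The monitoring the article recommends can be partly automated where a portal exposes an API. The sketch below is illustrative only: it assumes a CKAN-based portal (Argentina's datos.gob.ar, for example, runs CKAN) and uses CKAN's standard `package_search` action; the function names are my own, not from the article.

```python
# Sketch: querying a CKAN-based open data portal for datasets matching a topic,
# e.g. to compare what similar portals publish. Assumes the portal runs CKAN
# and exposes the standard /api/3/action/package_search endpoint.
import json
import urllib.parse
import urllib.request


def build_search_url(portal: str, query: str = "*:*", rows: int = 5) -> str:
    """Build a CKAN package_search URL for the given portal and query."""
    params = urllib.parse.urlencode({"q": query, "rows": rows})
    return f"{portal}/api/3/action/package_search?{params}"


def dataset_titles(response: dict) -> list[str]:
    """Extract dataset titles from a CKAN package_search JSON response."""
    return [pkg["title"] for pkg in response["result"]["results"]]


def fetch_titles(portal: str, query: str = "*:*", rows: int = 5) -> list[str]:
    """Fetch matching dataset titles from a live portal (requires network)."""
    with urllib.request.urlopen(build_search_url(portal, query, rows)) as resp:
        return dataset_titles(json.load(resp))
```

Running the same query (say, `fetch_titles("https://datos.gob.ar", "salud")`) against several countries' portals would give a rough, automatable view of which topics each government covers.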

In summary, there are stark differences in the amount of data published, the format of the data, and the most popular datasets in open data portals in Latin America. However, in every country there is an appetite for data that either provides public accountability for government functions or supplies helpful information to citizens…(More)”.

The Right of Access to Public Information


Book by Hermann-Josef Blanke and Ricardo Perlingeiro: “This book presents a comparative study on access to public information in the context of the main legal orders worldwide. The international team of authors analyzes transparency and freedom-of-information legislation with regard to the scope of the right to access, limitations of this right inherent in the respective national laws, the procedure, the relationship with domestic legislation on administrative procedure, as well as judicial protection. It particularly focuses on the Brazilian law of access to information, which is interpreted as a benchmark for regulations in other Latin-American states….(More)”

Index: Collective Intelligence


By Hannah Pierce and Audrie Pirkl

The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on collective intelligence and was originally published in 2017.

The Collective Intelligence Universe

  • Amount of money that Reykjavik’s Better Neighbourhoods program has provided each year to crowdsourced citizen projects since 2012: € 2 million (Citizens Foundation)
  • Number of active U.S. government challenges through which people can currently submit solutions for their communities: 778 (Challenge.gov)
  • Percent of U.S. arts organizations that used social media to crowdsource ideas in 2013, from programming decisions to seminar scheduling details: 52% (Pew Research)
  • Number of Wikipedia members who have contributed to a page in the last 30 days: over 120,000 (Wikipedia Page Statistics)
  • Number of languages that the multinational crowdsourced Letters for Black Lives has been translated into: 23 (Letters for Black Lives)
  • Number of comments in a Reddit thread that established a more comprehensive timeline of the theater shooting in Aurora than the media: 1,272 (Reddit)
  • Number of physicians that are members of SERMO, a platform to crowdsource medical research: 800,000 (SERMO)
  • Number of citizen scientist projects registered on SciStarter: over 1,500 (Collective Intelligence 2017 Plenary Talk: Darlene Cavalier)
  • Entrants to NASA’s 2009 TopCoder Challenge: over 1,800 (NASA)

Infrastructure

  • Number of submissions for Blockholm (a digital platform that allows citizens to build “Minecraft” ideas on vacant city lots) within the first six months: over 10,000 (OpenLearn)
  • Number of people engaged with The Participatory Budgeting Project in the U.S.: over 300,000 (Participatory Budgeting Project)
  • Amount of money allocated to community projects through this initiative: $238,000,000

Health

  • Percentage of internet-using adults in the U.S. with chronic health conditions who have gone online to connect with others suffering from similar conditions: 23% (Pew Research)
  • Number of posts to Patient Opinion, a UK based platform for patients to provide anonymous feedback to healthcare providers: over 120,000 (Nesta)
    • Percent of NHS health trusts utilizing the posts to improve services in 2015: 90%
    • Stories posted per month: nearly 1,000 (The Guardian)
  • Number of tumors reported to the English National Cancer Registration each year: over 300,000 (Gov.UK)
  • Number of users of an open source artificial pancreas system: 310 (Collective Intelligence 2017 Plenary Talk: Dana Lewis)

Government

  • Number of submissions from 40 countries to the 2017 Open (Government) Contracting Innovation Challenge: 88 (The Open Data Institute)
  • Public-service complaints received each day via Indonesian digital platform Lapor!: over 500 (McKinsey & Company)
  • Number of registered users of Unicef Uganda’s weekly SMS poll, U-Report: 356,468 (U-Report)
  • Number of reports regarding government corruption in India submitted to IPaidaBribe since 2011: over 140,000 (IPaidaBribe)

Business

  • Reviews posted since Yelp’s creation in 2004: 121 million reviews (Statista)
  • Percent of Americans in 2016 who trust online customer reviews as much as personal recommendations: 84% (BrightLocal)
  • Number of companies and their subsidiaries mapped through the OpenCorporates platform: 60 million (Omidyar Network)

Crisis Response

Public Safety

  • Number of sexual harassment reports submitted from 50 cities in India and Nepal to SafeCity, a crowdsourcing site and mobile app: over 4,000 (SafeCity)
  • Number of people that used Facebook’s Safety Check, a feature that is being used in a new disaster mapping project, in the first 24 hours after the terror attacks in Paris: 4.1 million (Facebook)

Examining the Mistrust of Science


Proceedings of a National Academies Workshop: “The Government-University-Industry Research Roundtable held a meeting on February 28 and March 1, 2017, to explore trends in public opinion of science, examine potential sources of mistrust, and consider ways that cross-sector collaboration between government, universities, and industry may improve public trust in science and scientific institutions in the future. The keynote address on February 28 was given by Shawn Otto, co-founder and producer of the U.S. Presidential Science Debates and author of The War on Science.

“There seems to be an erosion of the standing and understanding of science and engineering among the public,” Otto said. “People seem much more inclined to reject facts and evidence today than in the recent past. Why could that be?” Otto began exploring that question after the candidates in the 2008 presidential election declined an invitation to debate science-driven policy issues and instead chose to debate faith and values.

“Wherever the people are well-informed, they can be trusted with their own government,” wrote Thomas Jefferson. Now, some 240 years later, science is so complex that it is difficult even for scientists and engineers to understand the science outside of their particular fields. Otto argued,

“The question is, are people still well-enough informed to be trusted with their own government? Of the 535 members of Congress, only 11—less than 2 percent—have a professional background in science or engineering. By contrast, 218—41 percent—are lawyers. And lawyers approach a problem in a fundamentally different way than a scientist or engineer. An attorney will research both sides of a question, but only so that he or she can argue against the position that they do not support. A scientist will approach the question differently, not starting with a foregone conclusion and arguing towards it, but examining both sides of the evidence and trying to make a fair assessment.”

According to Otto, anti-science positions are now acceptable in public discourse, in Congress, state legislatures and city councils, in popular culture, and in presidential politics. Discounting factually incorrect statements does not necessarily reshape public opinion in the way some assume it will. What is driving this change? “Science is never partisan, but science is always political,” said Otto. “Science takes nothing on faith; it says, ‘show me the evidence and I’ll judge for myself.’ But the discoveries that science makes either confirm or challenge somebody’s cherished beliefs or vested economic or ideological interests. Science creates knowledge—knowledge is power, and that power is political.”…(More)”.

Big Data: A Twenty-First Century Arms Race


Report by Atlantic Council and Thomson Reuters: “We are living in a world awash in data. Accelerated interconnectivity, driven by the proliferation of internet-connected devices, has led to an explosion of data—big data. A race is now underway to develop new technologies and implement innovative methods that can handle the volume, variety, velocity, and veracity of big data and apply it smartly to provide decisive advantage and help solve major challenges facing companies and governments.

For policy makers in government, big data and associated technologies, like machine learning and artificial intelligence, have the potential to drastically improve their decision-making capabilities. How governments use big data may be a key factor in improved economic performance and national security. This publication looks at how big data can maximize the efficiency and effectiveness of government and business, while minimizing modern risks. Five authors explore big data across three cross-cutting issues: security, finance, and law.

Chapter 1, “The Conflict Between Protecting Privacy and Securing Nations,” Els de Busser
Chapter 2, “Big Data: Exposing the Risks from Within,” Erica Briscoe
Chapter 3, “Big Data: The Latest Tool in Fighting Crime,” Benjamin Dean
Chapter 4, “Big Data: Tackling Illicit Financial Flows,” Tatiana Tropina
Chapter 5, “Big Data: Mitigating Financial Crime Risk,” Miren Aparicio….Read the Publication (PDF)

Public Data Is More Important Than Ever–And Now It’s Easier To Find


Meg Miller at Co.Design: “Public data, in theory, is meant to be accessible to everyone. But in practice, even finding it can be near impossible, to say nothing of figuring out what to do with it once you do. Government data websites are often clunky and outdated, and some data is still trapped on physical media–like CDs or individual hard drives.

Tens of thousands of these CDs and hard drives, full of data on topics from Arkansas amusement parks to fire incident reporting, have arrived at the doorstep of the New York-based start-up Enigma over the past four years. The company has obtained thousands upon thousands more datasets by way of Freedom of Information Act (FOIA) requests. Enigma specializes in open data: gathering it, curating it, and analyzing it for insights into a client’s industry, for example, or for public service initiatives.

Enigma also shares its 100,000 datasets with the world through an online platform called Public—the broadest collection of public data that is open and searchable by everyone. Public has been around since Enigma launched in 2013, but today the company is introducing a redesigned version of the site that’s fresher and more user-friendly, with easier navigation and additional features that allow users to drill further down into the data.

But while the first iteration of Public was mostly concerned with making Enigma’s enormous trove of data—which it was already gathering and reformatting for client work—accessible to the public, the new site focuses more on linking that data in new ways. For journalists, researchers, and data scientists, the tool will offer more sophisticated ways of making sense of the data that they have access to through Enigma….

…the new homepage also curates featured datasets and collections to encourage discoverability. For example, an Enigma-curated collection of U.S. sanctions data from the U.S. Treasury Department’s Office of Foreign Assets Control (OFAC) shows data on the restrictions on entities or individuals that American companies can and can’t do business with in an effort to achieve specific national security or foreign policy objectives. A new round of sanctions against Russia has been in the news lately, as an effort by President Trump to loosen restrictions on blacklisted businesses and individuals in Russia was overruled by the Senate last week. Enigma’s curated data selection on U.S. sanctions could help journalists contextualize recent events with data that shows changes in sanctions lists over time by presidential administration, for instance–or they could compare the U.S. sanctions list to the European Union’s….(More).

Regulation of Big Data: Perspectives on Strategy, Policy, Law and Privacy


Paper by Pompeu Casanovas, Louis de Koker, Danuta Mendelson and David Watts: “…presents four complementary perspectives stemming from governance, law, ethics, and computer science. Big, Linked, and Open Data constitute complex phenomena whose economic and political dimensions require a plurality of instruments to enhance and protect citizens’ rights. Some conclusions are offered in the end to foster a more general discussion.

This article contends that the effective regulation of Big Data requires a combination of legal tools and other instruments of a semantic and algorithmic nature. It commences with a brief discussion of the concept of Big Data and views expressed by Australian and UK participants in a study of Big Data use from a law enforcement and national security perspective. The second part of the article highlights the UN Special Rapporteur on the Right to Privacy’s interest in the themes and the focus of their new program on Big Data. UK law reforms regarding authorisation of warrants for the exercise of bulk data powers are discussed in the third part. Reflecting on these developments, the paper closes with an exploration of the complex relationship between law and Big Data and the implications for regulation and governance of Big Data….(More)”.

Computational Propaganda Worldwide


Executive Summary: “The Computational Propaganda Research Project at the Oxford Internet Institute, University of Oxford, has researched the use of social media for public opinion manipulation. The team involved 12 researchers across nine countries who, altogether, interviewed 65 experts and analyzed tens of millions of posts on seven different social media platforms during scores of elections, political crises, and national security incidents. Each case study analyzes qualitative, quantitative, and computational evidence collected between 2015 and 2017 from Brazil, Canada, China, Germany, Poland, Taiwan, Russia, Ukraine, and the United States.

Computational propaganda is the use of algorithms, automation, and human curation to purposefully distribute misleading information over social media networks. We find several distinct global trends in computational propaganda.

  • Social media are significant platforms for political engagement and crucial channels for disseminating news content. Social media platforms are the primary media over which young people develop their political identities.
    • In some countries this is because some companies, such as Facebook, are effectively monopoly platforms for public life.
    • In several democracies the majority of voters use social media to share political news and information, especially during elections.
    • In countries where only small proportions of the public have regular access to social media, such platforms are still fundamental infrastructure for political conversation among the journalists, civil society leaders, and political elites.
  • Social media are actively used as a tool for public opinion manipulation, though in diverse ways and on different topics.
    • In authoritarian countries, social media platforms are a primary means of social control. This is especially true during political and security crises.
    • In democracies, social media are actively used for computational propaganda either through broad efforts at opinion manipulation or targeted experiments on particular segments of the public.
  • In every country we found civil society groups trying, but struggling, to protect themselves and respond to active misinformation campaigns….(More)”.

The lost genius of the Post Office


Kevin R. Kosar at Politico: “…When Americans think about the most innovative agency in the government, they think about the Pentagon or NASA. But throughout much of its history, that title could just as easily have fallen to the Post Office, which was a hotbed of new, interesting, sometimes crazy ideas as it sought to accomplish a seemingly simple task: deliver mail quickly and cheaply. The Post Office experimented with everything from stagecoaches to airplanes—even pondered sending mail cross-country on a missile. For decades, the agency integrated new technologies and adapted to changing environments, underpinning its ability to deliver billions of pieces of mail every year, from the beaches of Miami to the banks of Alaska, for just cents per letter.

We think a lot about how innovation arises, but not enough about how it gets quashed. And the USPS is a great example of both. Today, what was once a locus of innovation has become a tired example of bureaucratic inertia and government mismanagement. The agency always faced an uphill battle, with frequent political interference from Congress, and the ubiquity of the internet has changed how Americans communicate in unforeseeable ways. But its descent into its current state was not foretold. A series of misguided rules and laws have clipped the Post Office’s wings, turning one of the great inventors of the government into yet another clunky bureaucracy. As a new administration once again takes up the cause of “reinventing government,” it’s worth considering what made the Post Office one of the most inventive parts of the nation’s infrastructure—and what factors have dragged it down.

IN A SENSE, innovation was baked into the Post Office from the beginning. America’s national postal service precedes the founding: It was born in July 1775, a year before the Declaration of Independence was ratified. During the American Revolution, the U.S. postal system’s duty was to deliver communications between Congress and the military commanders fighting the British. And for the first postmaster general, Congress appointed an inveterate tinkerer, Benjamin Franklin. He rigged up a system of contractors to haul mail by horse and on foot. It worked….

OVERSHADOWING ALL THE invention, however, was the creeping sclerosis of the Post Office as an institution. As a monopoly, it was insulated from competitive pressures, allowing inefficiency to creep into its operations and management. Worse, political interests had sunk deep, with Congress setting postage rates too low and too frequently trying to dictate the location of post offices and mail-sorting facilities.

Political pressures had been a challenge for the department from the start. President George Washington criticized Postmaster General Ebenezer Hazard when he tried to save the department money by switching mail carriers from stagecoaches to lone horse-riders. Private companies, eager to sell products or services to the department, lobbied Congress for postal contracts. Lawmakers inserted hacks into postal jobs. Everybody wanted something from the Post Office Department, and Congress proved all too happy to satisfy these political pressures….

At the same time, technology rapidly was catching up to the Post Office. The first threat was actually a miss: Although the electronic fax arrived in the early 1970s, it did not eat into the USPS’ business. So when cellular-phone technology arrived in the late 1980s and the internet erupted in the mid-1990s, USPS officials mostly shrugged. Annual revenues climbed, and the USPS’ employee cohort rose to nearly 800,000 before the end of the 20th century….

Private-sector companies may soon eat even more of the Postal Service’s lunch, or a good portion of it. Amazon is building a delivery network of its own, with lockers instead of post office boxes, and experimenting with drones. Uber also has nosed into the delivery business, and other companies are experimenting with autonomous delivery vehicles and robots….

The agency continues to be led by longtime postal people rather than those who move fluidly through the increasingly digitized world; Congress also has not been much help. The postal reform bill currently moving before Congress might sound like the right idea, but its fixes are superficial: It would force the USPS to create an “innovation officer,” an official with little authority to bring about genuine change at the agency, and wouldn’t do much to dislodge the entrenched political interests from the basic structure of the USPS. Which means the Postal Service—once one of the most impressive and fast-moving information networks ever devised—may end up as a lesson in how not to meet the future….(More)”

Handbook of Cyber-Development, Cyber-Democracy, and Cyber-Defense


“Living Reference Work” edited by Elias G. Carayannis, David F. J. Campbell, and Marios Panagiotis Efthymiopoulos: “This volume covers a wide spectrum of issues relating to economic and political development enabled by information and communication technology (ICT). Showcasing contributions from researchers, industry leaders and policymakers, this Handbook provides a comprehensive overview of the challenges and opportunities created by technological innovations that are profoundly affecting the dynamics of economic growth, promotion of democratic principles, and the protection of individual, national, and regional rights. Of particular interest is the influence of ICT on the generation and dissemination of knowledge, which, in turn, empowers citizens and accelerates change across all strata of society. Each essay features literature reviews and key references; definition of critical terms and concepts, case examples; implications for practice, policy and theory; and discussion of future directions. Representing such fields as management, political science, economics, law, psychology and education, the authors cover such timely topics as health care, energy and environmental policy, banking and finance, disaster recovery, investment in research and development, homeland security and diplomacy in the context of ICT and its economic, political and social impact…(More)”