Case Studies of Government Use of Big Data in Latin America: Brazil and Mexico


Chapter by Roberto da Mota Ueti, Daniela Fernandez Espinosa, Laura Rafferty, and Patrick C. K. Hung in Big Data Applications and Use Cases: “Big Data is changing our world, with masses of information stored in huge servers spread across the planet. This new technology is changing not only companies but governments as well. Mexico and Brazil, two of the most influential countries in Latin America, are entering a new era and, as a result, facing challenges in all aspects of public policy. Using Big Data, the Brazilian Government is trying to decrease spending and make better use of public money by linking public information with the data on citizens stored in public services. With new reforms in education, finance and telecommunications, the Mexican Government is taking on a bigger role in efforts to channel the country’s economic policy into an improvement in the quality of life of its inhabitants. Technology plays an important role for developing countries that are trying to make a difference in areas such as reducing inequality or regulating the use of economic resources. The good use of Big Data, a new technology for managing large quantities of information, can be crucial for the Mexican Government to reach the goals set under Peña Nieto’s administration. This article focuses on how the Brazilian and Mexican Governments are managing the emerging technologies of Big Data and how they include them in social and industrial projects to enhance the growth of their economies. The article also discusses the benefits of these uses of Big Data and the possible problems related to security and privacy of information….(More)”

Regulatory Transformations: An Introduction


Chapter by Bettina Lange and Fiona Haines in the book Regulatory Transformations: “Regulation is no longer the prerogative of either states or markets. Increasingly, citizens in association with businesses catalyse regulation, which marks the rise of a social sphere in regulation. Around the world, in San Francisco, Melbourne, Munich and Mexico City, citizens have sought to transform how and to what end economic transactions are conducted. For instance, ‘carrotmob’ initiatives use positive economic incentives, provided not by a state legal system but by a collective of civil society actors, in order to change business behaviour. In contrast to ‘negative’ consumer boycotts, ‘carrotmob’ events use ‘buycotts’. They harness competition between businesses as the lever for changing how and for what purpose business transactions are conducted. Through new social media, ‘carrotmobs’ mobilize groups of citizens to purchase goods at a particular time in a specific shop. The business that promises to spend the greatest percentage of its takings on, for instance, environmental improvements, such as switching to a supplier of renewable energy, will be selected for an organized shopping spree and will benefit financially from the extra income it receives from the ‘carrotmob’ event. ‘Carrotmob’ campaigns chime with other fundamental challenges to conventional economic activity, such as the shared use of consumer goods through citizens’ collective consumption, which questions traditional conceptions of private property….(More; Other Chapters)”

 

Offshore Leaks Database


“This ICIJ database contains information on almost 320,000 offshore entities that are part of the Panama Papers and the Offshore Leaks investigations. The data covers nearly 40 years up to the end of 2015 and links to people and companies in more than 200 countries and territories.

DISCLAIMER

There are legitimate uses for offshore companies and trusts. We do not intend to suggest or imply that any persons, companies or other entities included in the ICIJ Offshore Leaks Database have broken the law or otherwise acted improperly. Many people and entities have the same or similar names. We suggest you confirm the identities of any individuals or entities located in the database based on addresses or other identifiable information. If you find an error in the database, please get in touch with us….(More)”

Citizen scientists aid Ecuador earthquake relief


Mark Zastrow at Nature: “After a magnitude-7.8 earthquake struck Ecuador’s Pacific coast on 16 April, a new ally joined the international relief effort: a citizen-science network called Zooniverse.

On 25 April, Zooniverse launched a website that asks volunteers to analyse rapidly snapped satellite imagery of the disaster, which led to more than 650 reported deaths and 16,000 injuries. The aim is to help relief workers on the ground to find the most heavily damaged regions and identify which roads are passable.

Several crisis-mapping programmes with thousands of volunteers already exist — but it can take days to train satellites on the damaged region and to transmit data to humanitarian organizations, and results have not always proven useful. The Ecuador quake marked the first live public test for an effort dubbed the Planetary Response Network (PRN), which promises to be both more nimble than previous efforts, and to use more rigorous machine-learning algorithms to evaluate the quality of crowd-sourced analyses.

The network relies on imagery from the satellite company Planet Labs in San Francisco, California, which uses an array of shoebox-sized satellites to map the planet. In order to speed up the crowd-sourced process, it uses the Zooniverse platform to distribute the tasks of spotting features in satellite images. Machine-learning algorithms employed by a team at the University of Oxford, UK, then classify the reliability of each volunteer’s analysis and weight their contributions accordingly.
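The reliability-weighting step described above can be illustrated with a simple weighted vote: each volunteer's label for an image counts in proportion to an estimated reliability score. This is a minimal, hypothetical Python sketch of the general technique, not the Planetary Response Network's actual algorithm; the volunteer names and weights are invented.

```python
from collections import defaultdict

def weighted_vote(labels, reliability):
    """Combine volunteer labels for one image, weighting each vote
    by that volunteer's estimated reliability score.

    labels: list of (volunteer, label) pairs for a single image.
    reliability: dict mapping volunteer -> weight in [0, 1].
    """
    scores = defaultdict(float)
    for volunteer, label in labels:
        # Volunteers with no track record yet get a neutral weight.
        scores[label] += reliability.get(volunteer, 0.5)
    # Return the label with the highest total weight.
    return max(scores, key=scores.get)

# Three volunteers assess the same satellite tile.
votes = [("ana", "damaged"), ("ben", "intact"), ("cruz", "damaged")]
weights = {"ana": 0.9, "ben": 0.6, "cruz": 0.4}
print(weighted_vote(votes, weights))  # → damaged (0.9 + 0.4 outweighs 0.6)
```

In a real system the weights themselves would be learned, for example by scoring each volunteer against images with known ground truth.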

Rapid-fire data

Within 2 hours of the Ecuador test project going live with a first set of 1,300 images, each photo had been checked at least 20 times. “It was one of the fastest responses I’ve seen,” says Brooke Simmons, an astronomer at the University of California, San Diego, who leads the image processing. Steven Reece, who heads the Oxford team’s machine-learning effort, says that results — a “heat map” of damage with possible road blockages — were ready in another two hours.

In all, more than 2,800 Zooniverse users contributed to analysing roughly 25,000 square kilometres of imagery centred around the coastal cities of Pedernales and Bahia de Caraquez. That is where the London-based relief organization Rescue Global — which requested the analysis the day after the earthquake — currently has relief teams on the ground, including search dogs and medical units….(More)”

Supply and demand of open data in Mexico: A diagnostic report on the government’s new open data portal


Report by Juan Ortiz Freuler: “Following a promising and already well-established trend, in February 2014 the Office of the President of Mexico launched its open data portal (datos.gob.mx). This diagnostic, carried out between July and September of 2015, is designed to brief international donors and stakeholders such as members of the Open Government Partnership Steering Committee, to provide the reader with contextual information for understanding the state of supply and demand for open data from the portal, and to outline the specific challenges the Mexican government is facing in its quest to implement the policy. The insights offered through data processing and interviews with key stakeholders indicate the need to promote: i) a sense of ownership of datos.gob.mx by the user community, but particularly by the officials in charge of implementing the policy within each government unit; ii) the development of tools and mechanisms to increase the quality of the data provided through the portal; and iii) civic hacking of the portal to promote innovation and a sense of appropriation that would increase the policy’s long-term resilience to partisan and leadership change….(More)”

See also: Underlying data: http://bit.ly/dataMXEng1 | Spanish here: http://bit.ly/DataMxCastell | Underlying data: http://bit.ly/dataMX2

UN-Habitat Urban Data Portal


Data Driven Journalism: “UN-Habitat has launched a new web portal featuring a wealth of city data based on its repository of research on urban trends.

Launched during the 25th Governing Council, the Urban Data Portal allows users to explore data from 741 cities in 220 countries, and compare these for 103 indicators such as slum prevalence and city prosperity.

Image: A comparison of share in national urban population and average annual rate of urban population change for San Salvador, El Salvador, and Asuncion, Paraguay.

The urban indicators data available are analyzed, compiled and published by UN-Habitat’s Global Urban Observatory, which supports governments, local authorities and civil society organizations to develop urban indicators, data and statistics.

Leveraging GIS technology, the Observatory collects data by taking aerial photographs, zooming into particular areas, and then sending in survey teams to answer any remaining questions about the area’s urban development.

The Portal also contains data collected by national statistics authorities, via household surveys and censuses, with analysis conducted by leading urbanists in UN-HABITAT’s State of the World’s Cities and the Global Report on Human Settlements report series.

For the first time, these datasets are available for use under an open licence agreement, and can be downloaded in straightforward database formats like CSV and JSON….(More)”
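Formats like CSV and JSON matter because they can be consumed with nothing beyond a language's standard library. A minimal Python sketch, using an invented two-row extract; the column names here are illustrative, not the portal's actual schema:

```python
import csv
import io
import json

# A miniature stand-in for a downloaded city-indicator extract
# (hypothetical columns, not the Urban Data Portal's real field names).
csv_text = """city,country,slum_prevalence_pct
San Salvador,El Salvador,28.9
Asuncion,Paraguay,17.6
"""

# CSV: each row becomes a dict keyed by the header line.
rows = list(csv.DictReader(io.StringIO(csv_text)))

# JSON: the same records serialized in the portal's other download format.
as_json = json.dumps(rows, indent=2)

print(rows[0]["city"])                     # → San Salvador
print(json.loads(as_json)[1]["country"])   # → Paraguay
```

The round trip through `json.dumps`/`json.loads` shows why such "straightforward database formats" lower the barrier to reuse: no proprietary tooling is required at either end.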

Mexico City is crowdsourcing its new constitution using Change.org in a democracy experiment


Ana Campoy at Quartz: “Mexico City just launched a massive experiment in digital democracy. It is asking its nearly 9 million residents to help draft a new constitution through social media. The crowdsourcing exercise is unprecedented in Mexico—and pretty much everywhere else.

Residents can petition for issues to be included in the constitution through Change.org (link in Spanish), and make their case in person if they gather more than 10,000 signatures. They can also annotate proposals by the constitution drafters via PubPub, an editing platform (Spanish) similar to Google Docs.

The idea, in the words of the mayor, Miguel Angel Mancera, is to “bestow the constitution project (Spanish) with a democratic, progressive, inclusive, civic and plural character.”

There’s a big catch, however. The constitutional assembly—the body that has the final word on the new city’s basic law—is under no obligation to consider any of the citizen input. And then there are the practical difficulties of collecting and summarizing the myriad of views dispersed throughout one of the world’s largest cities.

That makes Mexico City’s public-consultation experiment a big test of the people’s digital power, one being watched around the world.

Fittingly, the idea of crowdsourcing a constitution came about in response to an attempt to limit people power.
For decades, city officials had fought to get out from under the thumb of the federal government, which had the final word on decisions such as who should be the city’s chief of police. This year, finally, they won a legal change that turns the Distrito Federal (federal district), similar to the US’s District of Columbia, into Ciudad de México (Mexico City), a more autonomous entity, more akin to a state. (Confusingly, it’s just part of the larger urban area also colloquially known as Mexico City, which spills into neighboring states.)

However, trying to retain some control, the Mexican congress decided that only 60% of the delegates to the city’s constitutional assembly would be elected by popular vote. The rest will be assigned by the president, congress, and Mancera, the mayor. Mancera is also the only one who can submit a draft constitution to the assembly.

Mancera’s response was to create a committee of some 30 citizens (Spanish), including politicians, human-rights advocates, journalists, and even a Paralympic gold medalist, to write his draft. He also called for the development of mechanisms to gather citizens’ “aspirations, values, and longing for freedom and justice” so they can be incorporated into the final document.

The mechanisms, embedded in an online platform (Spanish) that offers various ways to weigh in, were launched at the end of March and will collect inputs until September 1. The drafting group has until the middle of that month to file its text with the assembly, which has to approve the new constitution by the end of January.
An experiment with few precedents

Mexico City didn’t have a lot of examples to draw on, since not a lot of places have experience with crowdsourcing laws. In the US, a few local lawmakers have used Wiki pages and GitHub to draft bills, says Marilyn Bautista, a lecturer at Stanford Law School who has researched the practice. Iceland, with a population some 27 times smaller than Mexico City’s, famously had its citizens contribute to its constitution with input from social media. The effort failed after the new constitution got stuck in parliament.

In Mexico City, where many citizens already feel left out, the first big hurdle is to convince them it’s worth participating….

Then comes the task of making sense of the cacophony that will likely emerge. Some of the input can be very easily organized; the results of the survey, for example, are being graphed in real time. But there could be thousands of documents and comments on the Change.org petitions and the editing platform.

Ideas are grouped into 18 topics, such as direct democracy, transparency and economic rights. They are prioritized based on the amount of support they’ve garnered and how relevant they are, said Bernardo Rivera, an adviser for the city. Drafters get a weekly delivery of summarized citizen petitions….

Image: An essay about human rights on the PubPub platform. (PubPub)

The most elaborate part of the system is PubPub, an open publishing platform similar to Google Docs, which is based on a project originally developed by MIT’s Media Lab. The drafters are supposed to post essays on how to address constitutional issues, and potentially, the constitution draft itself, once there is one. Only they—or whoever they authorize—will be able to reword the original document.

User comments and edits are recorded on a side panel, with links to the portion of text they refer to. Another screen records every change, so everyone can track which suggestions have made it into the text. Members of the public can also vote comments up or down, or post their own essays….(More)”

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptable detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”

The era of development mutants


Giulio Quaggiotto at Nesta: “If you were looking for the cutting edge of the development sector, where would you go these days? You would probably look at startups like Premise who have predicted food trends 25 days faster than national statistics in Brazil, or GiveDirectly who are pushing the boundaries on evidence – from RCTs to new ways of mapping poverty – to fast track the adoption of cash transfers.

Or perhaps you might draw your attention to PetaJakarta who are experimenting with new responses to crises by harnessing human sensor networks. You might be tempted to consider Airbnb’s Disaster Response programme as an indicator of an emerging alternative infrastructure for disaster response (and perhaps raising questions about the political economy of this all).

And could Bitnation’s Refugee Emergency programme in response to the European refugee crisis be the possible precursor of future solutions for transnational issues – among the development sector’s hardest challenges? Are the business models of One Acre Fund, which provides services for smallholder farmers, or Floodtags, which analyses citizen data during floods for water and disaster managers, an indicator of future pathways to scale – that elusive development unicorn?

If you want to look at the future of procuring solutions for the development sector, should you be looking at initiatives like Citymart, which works with municipalities across the world to rethink traditional procurement and unleash the expertise and innovation capabilities of their citizens? By the same token, projects like Pathogen Box, Poverty Stoplight or Patient Innovation point to a brave new world where lead-user innovation and harnessing ‘sticky’ local knowledge becomes the norm, rather than the exception. You would also be forgiven for thinking that social movements across the world are the place to look for signs of future mechanisms for harnessing collective intelligence – Kawal Pemilu’s “citizen experts” self-organising around the Indonesian elections in 2014 is a textbook case study in this department.

The list could go on and on: welcome to the era of development mutants. While established players in the development sector are engrossed in soul-searching and their fitness for purpose is being scrutinised from all quarters, a whole new set of players is emerging, unfettered by legacy and borrowing from a variety of different disciplines. They point to a potentially different future – indeed, many potentially different futures – for the sector…..

But what if we wanted to invert this paradigm? How could we move from denial to fruitful collaboration with the ‘edgeryders’ of the development sector and accelerate its transformation?

Adopting new programming principles

Based on our experience working with development organisations, we believe that partnering with the mutants involves two types of shifts for traditional players: at the programmatic and the operational level. At the programmatic level, our work on the ground led us to articulate the following emerging principles:

  1. Mapping what people have, not what they need: even though approaches like jugaad and positive deviance have been around for a long time, unfortunately the default starting point for many development projects is still mapping needs, not assets. Inverting this paradigm allows for potentially disruptive project design and partnerships to emerge. (Signs of the future: Patient Innovation, Edgeryders, Community Mirror, Premise)

  2. Getting ready for multiple futures: When distributed across an organisation and not limited to a centralised function, the discipline of scanning the horizon for emergent solutions that contradict the dominant paradigm can help move beyond the denial phase and develop new interfaces to collaborate with the mutants. Here the link between analysis (to understand not only what is probable, but also what is possible) and action is critical – otherwise this remains purely an academic exercise. (Signs of the future: OpenCare, Improstuctures, Seeds of Good Anthropocene, Museum of the Future)

  3. Running multiple parallel experiments: According to Dave Snowden, in order to intervene in a complex system “you need multiple parallel experiments and they should be based on different and competing theories/hypotheses”. Unfortunately, many development projects are still based on linear narratives and assumptions such as “if only we run an awareness raising campaign citizens will change their behaviour”. Turning linear narratives into hypotheses to be tested (without becoming religious on a specific approach) opens up the possibility to explore the solution landscape and collaborate with non-obvious partners that bring new approaches to the table. (Signs of the future: Chukua Hakua, GiveDirectly, Finnish PM’s Office of Experiments, Ideas42, Cognitive Edge)

  4. Embracing obliquity: A deep, granular understanding of local assets and dynamics along with system mapping (see point 5 below) and pairing behavioural experts with development practitioners can help identify entry points for exploring new types of intervention based on obliquity principles. Mutants are often faster in adopting this approach and partnering with them is a way to bypass organisational inertia and explore nonlinear interventions. (Signs of the future: Sardex, social prescriptions, forensic architecture)

  5. From projects to systems: development organisations genuinely interested in developing new partnerships need to make the shift from the project logic to system investments. This involves, among other things, shifting the focus from providing solutions to helping every actor in the system to develop a higher level of consciousness about the issues they are facing and to take better decisions over time. It also entails partnering with mutants to explore entirely new financial mechanisms. (Signs of the future: Lankelly Chase, Indonesia waste banks, Dark Matter Labs)

Adopting new interfaces for working with the mutants

Harvard Business School professor Carliss Baldwin argued that most bureaucracies these days have a ‘non-contractible’ problem: they don’t know where smart people are, or how to evaluate how good they are. Most importantly, most smart people don’t want to work for them because they find them either too callous, unrewarding or slow (or a combination of all of these)….(More)”

What Should We Do About Big Data Leaks?


Paul Ford at the New Republic: “I have a great fondness for government data, and the government has a great fondness for making more of it. Federal elections financial data, for example, with every contribution identified, connected to a name and address. Or the results of the census. I don’t know if you’ve ever had the experience of downloading census data but it’s pretty exciting. You can hold America on your hard drive! Meditate on the miracles of zip codes, the way the country is held together and addressable by arbitrary sets of digits.

You can download whole books, in PDF format, about the foreign policy of the Reagan Administration as it related to Russia. Negotiations over which door the Soviet ambassador would use to enter a building. Gigabytes and gigabytes of pure joy for the ephemeralist. The government is the greatest creator of ephemera ever.

Consider the Financial Crisis Inquiry Commission, or FCIC, created in 2009 to figure out exactly how the global economic pooch was screwed. The FCIC has made so much data, and has done an admirable job (caveats noted below) of arranging it. So much stuff. There are reams of treasure on a single FCIC web site, hosted at Stanford Law School: Hundreds of MP3 files, for example, with interviews with Jamie Dimon of JPMorgan Chase and Lloyd Blankfein of Goldman Sachs. I am desperate to find time to write some code that automatically extracts random audio snippets from each and puts them on top of a slow ambient drone with plenty of reverb, so that I can relax to the dulcet tones of the financial industry explaining away its failings. (There’s a Paul Krugman interview that I assume is more critical.)

The recordings are just the beginning. They’ve released so many documents, and with the documents, a finding aid that you can download in handy PDF format, which will tell you where to, well, find things, pointing to thousands of documents. That aid alone is 1,439 pages.

Look, it is excellent that this exists, in public, on the web. But it also presents a very contemporary problem: What is transparency in the age of massive database drops? The data is available, but locked in MP3s and PDFs and other documents; it’s not searchable in the way a web page is searchable, not easy to comment on or share.
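The gap Ford describes, between data that is merely available and data that is searchable, is essentially the gap an inverted index closes: map every term to the documents that contain it, and a query becomes a set intersection. A toy Python sketch of the idea; the document ids and text are invented, and a real corpus would first need text extracted from its MP3s and PDFs.

```python
import re
from collections import defaultdict

def build_index(docs):
    """Build a minimal inverted index: term -> set of document ids."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in re.findall(r"[a-z]+", text.lower()):
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every term in the query."""
    term_sets = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*term_sets) if term_sets else set()

# Hypothetical snippets standing in for a document dump.
docs = {
    "cable-001": "Negotiations over the embassy entrance continued.",
    "memo-042": "Interview transcript: mortgage securitization risks.",
    "memo-043": "Mortgage negotiations and interview notes.",
}
idx = build_index(docs)
print(sorted(search(idx, "mortgage interview")))  # → ['memo-042', 'memo-043']
```

Tools like DocumentCloud, mentioned below, layer exactly this kind of full-text search (plus OCR and annotation) over piles of PDFs so that non-specialists can query them like web pages.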

Consider the WikiLeaks release of State Department cables. They were exhausting, there were so many of them, they were in all caps. Or the trove of data Edward Snowden gathered on a USB drive, or Chelsea Manning on CD. And the Ashley Madison leak, spread across database files and logs of credit card receipts. The massive and sprawling Sony leak, complete with whole email inboxes. And with the just-released Panama Papers, we see two exciting new developments: First, the consortium of media organizations that managed the leak actually came together and collectively, well, branded the papers, down to a hashtag (#panamapapers), informational website, etc. Second, the size of the leak itself—2.5 terabytes!—became a talking point, even though an exact description of what was contained within those terabytes was harder to understand. This, said the consortium of journalists that notably did not include The New York Times, The Washington Post, etc., is the big one. Stay tuned. And we are. But the fact remains: These artifacts are not accessible to any but the most assiduous amateur conspiracist; they’re the domain of professionals with the time and money to deal with them. Who else could be bothered?

If you watched the movie Spotlight, you saw journalists at work, pawing through reams of documents, going through, essentially, phone books. I am an inveterate downloader of such things. I love what they represent. And I’m also comfortable with many-gigabyte corpora spread across web sites. I know how to fetch data, how to consolidate it, and how to search it. I share this skill set with many data journalists, and these capacities have, in some ways, become the sole province of the media. Organs of journalism are among the only remaining cultural institutions that can fund investigations of this size and tease the data apart, identifying linkages and thus constructing informational webs that can, with great effort, be turned into narratives, yielding something like what we call “a story” or “the truth.” 

Spotlight was set around 2001, and it features a lot of people looking at things on paper. The problem has changed greatly since then: The data is everywhere. The media has been forced into a new cultural role, that of the arbiter of the giant and semi-legal database. ProPublica, a nonprofit that does a great deal of data gathering and data journalism and then shares its findings with other media outlets, is one example; it funded a project called DocumentCloud with other media organizations that simplifies the process of searching through giant piles of PDFs (e.g., court records, or the results of Freedom of Information Act requests).

At some level the sheer boredom and drudgery of managing these large data leaks make them immune to casual interest; even the Ashley Madison leak, which I downloaded, was basically an opaque pile of data and really quite boring unless you had some motive to poke around.

If this is the age of the citizen journalist, or at least the citizen opinion columnist, it’s also the age of the data journalist, with the news media acting as product managers of data leaks, making the information usable, browsable, attractive. There is an uneasy partnership between leakers and the media, just as there is an uneasy partnership between the press and the government, which would like some credit for its efforts, thank you very much, and wouldn’t mind if you gave it some points for transparency while you’re at it.

Pause for a second. There’s a glut of data, but most of it comes to us in ugly formats. What would happen if the things released in the interest of transparency were released in actual transparent formats?…(More)”