Citizen-Generated Data and Governments: Towards a Collaborative Model


Civicus: “…we’re very happy today to launch “Citizen-Generated Data and Governments: Towards a Collaborative Model”.

This piece explores the idea that governments could host and publish citizen-generated data (CGD) themselves, and whether this could mean that data is applied more widely and in a more sustainable way. It was inspired by a recent meeting in Buenos Aires with Argentine civil society organizations and government representatives, hosted by the City of Buenos Aires Innovation and Open Government Lab (Laboratorio de innovación y Gobierno Abierto de la Ciudad de Buenos Aires).


The meeting was organized to explore how people within government think about citizen-generated data, and discuss what would be needed for them to consider it as a valid method of data generation. One of the most novel and exciting ideas that surfaced was the potential for government open data portals, such as that managed by the Buenos Aires Innovation Lab, to host and publish CGD.

We wrote this report to explore this issue further, looking at existing models of data collaboration and outlining our first thoughts on the benefits and obstacles this kind of model might face. We welcome feedback from those with deeper expertise into different aspects of citizen-generated data, and look forward to refining these thoughts in the future together with the broader community…(More)”

How open company data was used to uncover the powerful elite benefiting from Myanmar’s multi-billion dollar jade industry


OpenCorporates: “Today, we’re pleased to release a white paper on how OpenCorporates data was used to uncover the powerful elite benefiting from Myanmar’s multi-billion dollar jade industry, in a ground-breaking report from Global Witness. This investigation is an important case study in how open company data and identifiers are critical tools for uncovering corruption and the links between companies and the real people benefiting from it.

This white paper shows not only that it was critical that OpenCorporates held this information (much of it was removed from the official register during the investigation), but also that its availability as machine-readable data, via an API (data service), in a form that could be programmatically combined with other data, was essential to discovering the hidden connections between the key actors and the jade industry. Global Witness was able to analyse this data with the help of Open Knowledge.
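The point about "programmatically combinable" data can be sketched in miniature: records keyed by a shared company identifier (jurisdiction plus company number) can be joined automatically to link people to assets. Everything below — company names, numbers, officers and licences — is invented for illustration; the real OpenCorporates records are far richer.

```python
# Joining two machine-readable datasets on a shared company identifier,
# the way an investigator might link licence holders to the people behind
# them. All records here are invented for illustration.

registry = [
    {"jurisdiction": "mm", "number": "100-A", "name": "Example Gems Ltd",
     "officers": ["U Hypothetical", "Daw Placeholder"]},
    {"jurisdiction": "mm", "number": "200-B", "name": "Sample Mining Co",
     "officers": ["U Hypothetical"]},
]

licences = [
    {"jurisdiction": "mm", "number": "100-A", "licence": "Jade block 17"},
    {"jurisdiction": "mm", "number": "200-B", "licence": "Jade block 42"},
]

def link_officers_to_licences(registry, licences):
    """Return {officer_name: [licences]} by joining on the company identifier."""
    by_id = {(c["jurisdiction"], c["number"]): c for c in registry}
    links = {}
    for lic in licences:
        company = by_id.get((lic["jurisdiction"], lic["number"]))
        if company is None:
            continue  # licence holder missing from the register
        for officer in company["officers"]:
            links.setdefault(officer, []).append(lic["licence"])
    return links

print(link_officers_to_licences(registry, licences))
# {'U Hypothetical': ['Jade block 17', 'Jade block 42'], 'Daw Placeholder': ['Jade block 17']}
```

With a snapshot of the register preserved as data, this join still works even after entries are removed from the official source — which is what made the preserved, machine-readable copy so valuable in the investigation.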

In this white paper, we make recommendations about the collection and publishing of statutory company information as open data to facilitate the creation of a hostile environment for corruption by providing a rigorous framework for public scrutiny and due diligence.

You can find the white paper here or read it on Medium.”

How Big Data Could Open The Financial System For Millions Of People


But that’s changing as the poor start leaving data trails on the Internet and on their cell phones. Now that data can be mined for what it says about someone’s creditworthiness, likeliness to repay, and all that hardcore stuff lenders want to know.

“Every time these individuals make a phone call, send a text, browse the Internet, engage social media networks, or top up their prepaid cards, they deepen the digital footprints they are leaving behind,” says a new report from the Omidyar Network. “These digital footprints are helping to spark a new kind of revolution in lending.”

The report, called “Big Data, Small Credit,” looks at the potential to expand credit access by analyzing mobile and smartphone usage data, utility records, Internet browsing patterns and social media behavior….

“In the last few years, a cluster of fast-emerging and innovative firms has begun to use highly predictive technologies and algorithms to interrogate and generate insights from these footprints,” the report says.
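As a toy illustration of the kind of footprint-based scoring the report describes — and only that: the features, weights and bias below are entirely made up, and real lenders' models are proprietary and far more sophisticated — a handful of digital-footprint signals can be combined into a 0–1 score with a logistic function:

```python
# A deliberately simple scoring sketch: a few digital-footprint features
# pushed through a logistic function. All weights are invented for
# illustration and have no relation to any real credit model.
import math

WEIGHTS = {
    "topup_regularity": 2.0,    # how consistently prepaid credit is topped up (0-1)
    "months_of_history": 0.1,   # length of observable usage history, in months
    "contacts_diversity": 1.0,  # breadth of the calling/social network (0-1)
}
BIAS = -3.0

def footprint_score(features):
    """Map footprint features to a 0-1 'likeliness to repay' style score."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

applicant = {"topup_regularity": 0.9, "months_of_history": 18, "contacts_diversity": 0.6}
print(round(footprint_score(applicant), 3))  # → 0.769
```

The substance of the report's claim is not the arithmetic but the inputs: someone with no bank account can still accumulate months of observable behaviour that a model like this can read.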

“Though these are early days, there is enough to suggest that hundreds of millions of mass-market consumers may not have to remain ‘invisible’ to formal, unsecured credit for much longer.”…(More)

Using data to improve the environment


Sir Philip Dilley at the UK Environment Agency: “We live in a data rich world. As an engineer I know the power of data in the design and implementation of new urban spaces, iconic buildings and the infrastructure on which we all depend.

Data is also a powerful force in helping us to protect the environment, and it can be mined from a variety of sources.

Since Victorian times, naturalists have collected data on the natural world. At the Environment Agency we continue to rely on local enthusiasts to track rainfall, which feeds into and supports local projections of flood risk. But the advent of computing power and the Government’s move to open data mean we can now all use data in new and exciting ways. The result is a more informed approach to improving the environment and protecting people.

For the last 17 years the Environment Agency has used lasers in planes to map and scan the English landscape from above to help us carry out work such as flood modelling (data now available for everyone to use). The same information has been used to track changing coastal habitats and to help us use the power of nature to adapt to a changing climate.

We’ve used our LIDAR height data together with aerial photography to inform the location and design of major coastal realignment sites. The award-winning Medmerry project, which created 183 hectares of new coastal habitat and protects 348 properties from flooding, was based on this data-led approach.

Those who live near rivers, or who use them for sport and recreation, know the importance of getting up-to-date information on river flows. We already provide online services so the public can see current warnings and river level information, but opening our data means everyone can get bespoke information through a single postcode or location search.
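The "one location search" idea amounts to filtering published station records by proximity. A minimal sketch of that lookup follows; the station names, coordinates and readings are invented (the Environment Agency's real-time data service publishes comparable records as JSON).

```python
# Find the nearest river-level monitoring station to a point - the kind of
# bespoke location query that open monitoring data makes possible.
# All station records below are invented sample data.
import math

stations = [
    {"name": "Upstream Bridge", "lat": 51.50, "lon": -0.12, "level_m": 1.4},
    {"name": "Mill Weir",       "lat": 51.45, "lon": -0.20, "level_m": 2.1},
    {"name": "Old Lock",        "lat": 51.60, "lon": -0.30, "level_m": 0.8},
]

def nearest_station(lat, lon, stations):
    """Nearest station by planar distance with a latitude correction
    on longitude - adequate at city scale."""
    def dist(s):
        return math.hypot(s["lat"] - lat,
                          (s["lon"] - lon) * math.cos(math.radians(lat)))
    return min(stations, key=dist)

here = nearest_station(51.49, -0.13, stations)
print(here["name"], here["level_m"])  # Upstream Bridge 1.4
```

A postcode search is the same query with one extra step: geocode the postcode to a latitude/longitude first, then run the proximity filter.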

We are not the only ones seeing the power of environmental data. Data entrepreneurs know how to get accurate and easily accessible information to the public, and that means we can all make informed choices. FloodAlerts provides a graphical representation of flood warnings with localised updates every 15 minutes, and the Flood Risk Finder app provides flood risk profiles for any property in England; both use data made available for public use by the Environment Agency.

Our bathing waters data directs those who like to swim, surf or paddle with vital information on water quality. The Safer Seas Service app alerts water users when water quality is reduced at beaches and our bathing water data is also used by the Marine Conservation Society’s Good Beach Guide….(More)”

What is Citizensourcing?


Citizensourcing is the crowdsourcing practice applied by governments with the goal of tapping into the collective intelligence of the citizens. Through citizensourcing, governments can collect ideas, suggestions and opinions from their citizens — thereby creating a permanent feedback loop of communication.

Cities are powerhouses of collective intelligence. Thanks to modern technologies, the time has come to unlock the wisdom of the crowd.

Yesterday

The current means of engaging citizens in public policy have been in place since the 18th century: town hall meetings, in-person visits, phone calls and bureaucratic forms for submitting an idea. All of these forms of engagement are time-consuming, ineffective and expensive.

Great ideas and valuable feedback get lost, because these forms of engagement take too much effort for both citizens and cities. Moreover, communication happens in private between city government and citizens: citizens cannot communicate with each other about how they want to improve their city.

Today

Advances in technology have restructured the way societies are organised; we’re living in a digital age in which citizens are connected over networks. This creates unprecedented opportunities for cities to get closer to their citizens and serve them better. In recent years, we’ve seen several cities trying to build a strong online presence on social media channels.

Yet, they have discovered that communicating with their citizens over Twitter and Facebook is far from optimal. Messages get lost in the information overload that characterises those platforms, resulting in a lack of structured communication.

Tomorrow

Imagine that your town hall meetings could be held online: open 24/7 and accessible from every possible device. Citizensourcing on a dedicated platform is an inexpensive way for cities to get valuable input in the form of ideas, feedback and opinions from their citizens.

Whereas only a very small proportion of citizens engage in time-consuming offline participation, an online platform allows you to multiply your reach tenfold. You reach an audience of citizens you couldn’t reach before, which makes an online platform a much-needed complement to the existing offline channels in every city.

When citizens can share their ideas in an easy and fun way and get rewarded for their valuable input, that’s when the wisdom of the crowd gets truly unlocked.

The most direct benefit for cities is clear: crowdsourcing new urban ideas drives superior innovations. At least as important as offering a new channel for proposals is that engagement leads to a better understanding of citizens’ different needs…..

Several early success stories already show the enormous potential:

  • The Colombian city of Medellín has its own crowdsourcing platform, MiMedellín, on which citizens share solutions to urban problems the city faces. It has turned out to be a big success: with more than 2,300 ideas collected, the government is already developing policies with the help of citizens’ creativity.
  • In the Icelandic capital, Reykjavik, the city council succeeded in having its citizensourcing website Better Reykjavik used by over 60% of citizens. Since Reykjavik implemented the platform, it has spent €1.9 million on developing more than 200 projects based on ideas from citizens.
  • Paris held a participatory budgeting process, called ‘Madame Mayor, I have an idea’, that brought forward wonderful projects. To name one: after receiving well over 20,000 votes, the city government announced it would invest €2 million in vertical garden projects. Other popular ideas included gardens in schools, neighbourhood recycling centers and co-working spaces for students and entrepreneurs….(More)”

Toward a manifesto for the ‘public understanding of big data’


Mike Michael and Deborah Lupton in Public Understanding of Science: “….we sketch a ‘manifesto’ for the ‘public understanding of big data’. On the one hand, this entails such public understanding of science and public engagement with science and technology–tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data’s trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article….(More)”

Privacy Bridges: EU and US Privacy Experts in Search of Transatlantic Privacy Solutions


IVIR and MIT: “The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the First, Fourth and Fifth Amendments as well as federal and state statutes, consumer protection law and common law. The ultimate goal of effective privacy protection is shared. However, current friction between the two legal systems poses challenges to realizing privacy and the free flow of information across the Atlantic. The recent expansion of online surveillance practices underlines these challenges.

Over nine months, the group prepared a consensus report outlining a menu of privacy “bridges” that can be built to bring the European Union and the United States closer together. The efforts are aimed at providing a framework of practical options that advance strong, globally-accepted privacy values in a manner that respects the substantive and procedural differences between the two jurisdictions….

(More)”

Open data, open mind: Why you should share your company data with the world


Mark Samuels at ZDnet: “If information really is the lifeblood of modern organisations, then CIOs could create huge benefits from opening their data to new, creative pairs of eyes. Research from the consultancy McKinsey suggests that seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data: that is, taking previously proprietary data (often starting with public sector data) and opening up access.

So, should your business consider giving outsiders access to insider information? ZDNet speaks to three experts.

More viewpoints can mean better results

Former Tullow Oil CIO Andrew Marks says debates about the potential openness of data in a private sector context are likely to be dominated by one major concern: information security.

“It’s a perfectly reasonable debate until people start thinking about privacy,” he says. “Putting information at risk, both in terms of customer data and competitive advantage, will be a risk too far for many senior executives.”

But what if CIOs could allay c-suite peers’ concerns and create a new opportunity? Marks points to the Goldcorp Challenge, which saw the mining specialist share its proprietary geological data to allow outside experts to pick likely spots for mining. The challenge, which included prize money of $575,000, helped identify more than 110 sites, 50 per cent of which were previously unknown to the company. The value of gold found through the competition exceeded $6bn. Marks wonders whether other firms could take similarly brave steps.
“There is a period of time when information is very sensitive,” he says. “Once the value of data starts to become finite, then it might be beneficial for businesses to open the doors and to let outsiders play with the information. That approach, in terms of gamification, might lead to the creation of new ideas and innovations.”…

Marks says these projects help prove that, when it comes to data, more is likely to mean different – and possibly better – results. “Whether using big data algorithms or the human touch, the more viewpoints you bring together, the more you increase the chances of success and reduce risk,” he says.

“There is, therefore, always likely to be value in seeking an alternative perspective. Opening access to data means your firm is going to get more ideas, but CIOs and other senior executives need to think very carefully about what such openness means for the business, and the potential benefits.”….Some leading firms are already taking steps towards openness. Take Christina Scott, chief product and information officer at the Financial Times, who says the media organisation has used data analysts to help push the benefits of information-led insight across the business.

Her team has democratised data in order to make sure that all parts of the organisation can get the information they need to complete their day-to-day jobs. Scott says the approach is best viewed as an open data strategy, but within the safe confines of the existing enterprise firewall. While the tactic is internally focused currently, Scott says the FT is keen to find ways to make the most of external talent in the future.

“We’re starting to consider how we might open data beyond the organisation, too,” she says. “Our data holds a lot of value and insight, including across the metadata we’ve created. So it would be great to think about how we could use that information in a more open way.” Part of the FT’s business includes trade-focused magazines. Scott says opening the data could provide new insight to its B2B customers across a range of sectors. In fact, the firm has already dabbled at a smaller scale.

“We’ve run hackathons, where we’ve exposed our APIs and given people the chance to come up with some new ideas,” she says. “But I don’t think we’ve done as much work on open data as we could. And I think that’s the direction in which better organisations are moving. They recognise that not all innovation is going to happen within the company.”…

CIO Omid Shiraji is another IT expert who recognises that there is a general move towards a more open society. Any executive who expects to work within a tightly defined enterprise firewall is living in cloud cuckoo land, he argues. More to the point, they will miss out on big advantages.
“If you can expose your sources to a range of developers, you can start to benefit from massive innovation,” he says. “You can get really big benefits from opening your data to external experts who can focus on areas that you don’t have the capability to develop internally.”

Many IT leaders would like to open data to outside experts, suggests Shiraji. For CIOs who are keen to expose their sources, he suggests letting small-scale developers take a close look at in-house data silos in an attempt to discover what relationships might exist and what advantages could accrue….(More)”

Introducing Government as a Platform


Peter Williams, Jan Gravesen and Trinette Brownhill in Government Executive: “Governments around the world are facing competitive pressures and expectations from their constituents that are prompting them to innovate and dissolve age-old structures. Many governments have introduced a digital strategy in which at least one of the goals is aimed at bringing their organizations closer to citizens and businesses.

To achieve this, ideally IT and data in government would not be constrained by the different functional towers that make up the organization, as is often the case. They would not be constrained by complex, monolithic application design philosophies and lengthy implementation cycles, nor would development be constrained by the assumption that all activity has to be executed by the government itself.

Instead, applications would be created rapidly and cheaply, and modules would be shared as reusable blocks of code and integrated data. It would be relatively straightforward to integrate data from multiple departments to enable a focus on the complex needs of, say, a single parent who is diabetic and a student. Delivery would be facilitated in the manner best required, or preferred, by the citizen. Third parties would also be able to access these modules of code and data to build higher value government services that multiple agencies would then buy into. The code would run on a cloud infrastructure that maximizes the efficiency in which processing resources are used.

GaaP is an organized set of ideas and principles that allows organizations to approach these ideals. It allows governments to institute more efficient sharing of IT resources, as well as to unlock data and functionality via application programming interfaces so that third parties can build higher-value citizen services. In doing so, security plays a crucial role in protecting the privacy of constituents and enterprise assets.

We see increasingly well-established examples of GaaP services in many parts of the world. The notion has significantly influenced strategic thinking in the UK, Australia, Denmark, Canada and Singapore. In particular, it has evolved in a deliberate way in the UK’s Government Digital Service, building on the Blairite notion of “joined up government”; in Australia’s e-government strategy and its myGov program; and as a significant influencer in Singapore’s entire approach to building its “smarter nation” infrastructure.

Collaborative Government

GaaP assumes a transformational shift in efficiency, effectiveness and transparency, in which agencies move toward a collaborative government and away from today’s siloed approach. That collaboration may be among agencies, but also with other entities (nongovernmental organizations, the private sector, citizens, etc.).

GaaP’s focus on collaboration enables public agencies to move away from their traditional towered approach to IT and increasingly make use of shared and composable services offered by a common – usually a virtualized, cloud-enabled – platform. This leads to more efficient use of development resources, platforms and IT support. We are seeing examples of this already with a group of townships in New York state and also with two large Spanish cities that are embarking on this approach.

While efficient resource and service sharing is central to the idea of GaaP, it is not sufficient. The idea is that GaaP must allow app developers, irrespective of whether they are citizens, private organizations or other public agencies, to develop new value-added services using published government data and APIs. In this sense, the platform becomes a connecting layer between public agencies’ systems and data on the one hand, and private citizens, organizations and other public agencies on the other.

In its most fundamental form, GaaP is able to:

  • Consume data and government services from existing departmental systems.
  • Consume syndicated services from platform-as-a-service or software-as-a-service providers in the public marketplace.
  • Securely unlock these data and services and allow third parties – citizens, private organizations or other agencies – to combine services and data into higher-order services or more citizen-centric or business-centric services.

It is the openness, the secure interoperability, and the ability to compose new services on the basis of existing services and data that define the nature of the platform.
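The "compose new services from existing services and data" idea can be sketched in miniature. Here two stand-in agency endpoints are plain functions so the example stays self-contained; in a real GaaP setting each would be an authenticated HTTP API, and every service name and record below is hypothetical.

```python
# Two stand-in "agency" services and a third-party composition over them.
# In a real GaaP deployment each function would be an authenticated HTTP
# API behind the platform. All identifiers and records are invented.

def address_service(citizen_id):
    """Agency A: address register lookup (stand-in for an API call)."""
    return {"c-001": {"postcode": "AB1 2CD", "borough": "Exampleton"}}.get(citizen_id)

def benefits_service(citizen_id):
    """Agency B: benefits entitlement lookup (stand-in for an API call)."""
    return {"c-001": {"entitled": True, "scheme": "school-meals"}}.get(citizen_id)

def school_meals_finder(citizen_id):
    """A hypothetical third-party service composed from both agency APIs:
    a higher-order, citizen-centric answer neither agency offers alone."""
    address = address_service(citizen_id)
    benefits = benefits_service(citizen_id)
    if not address or not benefits or not benefits["entitled"]:
        return None
    return {"borough": address["borough"], "scheme": benefits["scheme"]}

print(school_meals_finder("c-001"))
# {'borough': 'Exampleton', 'scheme': 'school-meals'}
```

The composition layer never duplicates either agency's data; it holds only the logic that joins them, which is exactly the role the platform's published APIs are meant to enable.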

The Challenges

At one time, the challenge of creating a GaaP structure would have been technology: Today, it is governance….(More)”

Big data problems we face today can be traced to the social ordering practices of the 19th century.


Hamish Robertson and Joanne Travaglia in LSE’s The Impact Blog: “This is not the first ‘big data’ era but the second. The first was the explosion in data collection that occurred from the early 19th century – Hacking’s ‘avalanche of numbers’, precisely situated between 1820 and 1840. This was an analogue big data era, different from our current digital one but characterized by some very similar problems and concerns. Contemporary problems of data analysis and control include a variety of accepted factors that make them ‘big’, generally including size, complexity and technology issues. We also suggest that digitisation is a central process in this second big data era, one that seems obvious but which also appears to have reached a new threshold. Until a decade or so ago, ‘big data’ looked just like a digital version of conventional analogue records and systems, ones whose management had become normalised through statistical and mathematical analysis. Now, however, we see a level of concern and anxiety similar to that faced in the first big data era.

This situation brings with it a socio-political dimension of interest to us, one in which our understanding of people and our actions on individuals, groups and populations are deeply implicated. The collection of social data had a purpose – understanding and controlling the population in a time of significant social change. To achieve this, new kinds of information and new methods for generating knowledge were required. Many ideas, concepts and categories developed during that first data revolution remain intact today, some uncritically accepted more now than when they were first developed. In this piece we draw out some connections between these two data ‘revolutions’ and the implications for the politics of information in contemporary society. It is clear that many of the problems in this first big data age and, more specifically, their solutions persist down to the present big data era….Our question then is how do we go about re-writing the ideological inheritance of that first data revolution? Can we or will we unpack the ideological sequelae of that past revolution during this present one? The initial indicators are not good in that there is a pervasive assumption in this broad interdisciplinary field that reductive categories are both necessary and natural. Our social ordering practices have influenced our social epistemology. We run the risk in the social sciences of perpetuating the ideological victories of the first data revolution as we progress through the second. The need for critical analysis grows apace not just with the production of each new technique or technology but with the uncritical acceptance of the concepts, categories and assumptions that emerged from that first data revolution. That first data revolution proved to be a successful anti-revolutionary response to the numerous threats to social order posed by the incredible changes of the nineteenth century, rather than the Enlightenment emancipation that was promised. (More)”

This is part of a wider series on the Politics of Data. For more on this topic, also see Mark Carrigan’s Philosophy of Data Science interview series and the Discover Society special issue on the Politics of Data (Science).