Open government and citizen engagement: From theory to action

Camilo Romero Galeano at Apolitical: “…According to the 2016 Corruption Perception Index analysing the behaviour of 178 countries, 69% of countries evaluated again raised the alarm about what has been referred to as ‘the cancer of the public service’.

The scandals of misappropriation of public funds, illicit enrichment of public officials, the slippery labyrinths of procurement and all kinds of practices that challenge ethics in the public service are daily news around the world.

Colombia and the department of Nariño suffer from the same problems. Bad practices of traditional politics and chiefdoms have ended up destroying the trust that citizens once had in political institutions. Corruption and its devastating effects always end up undermining people’s dignity.

With this as the current state of affairs, and in our capacity as a subnational government, we have designed hand in hand with the citizens of Nariño a new government program. It is based on an approach to innovation called “New Government” that relies on three pillars: open government, social innovation, and collaborative economy.

The new program has been endorsed by more than 300,000 voters and subsequently concretised in our roadmap for the territory: “Nariño, Heart of the World”. The creation of this policy document brought together 31,700 participants and involved travelling across the 13 subregions that comprise Nariño’s 64 municipalities.

In this way, citizen participation has become an essential tool in the fight against corruption.

Our open government strategy is called GANA — Gobierno Abierto de Nariño (in English, “Win — Open Government of Nariño”). The strategy takes a step forward in ensuring cabinet officials become transparent and publicly declare private assets. Citizens can now find out the financial conditions in which public officials begin and finish their administrative periods. Each one of us….(More)”

Can tracking people through phone-call data improve lives?

Amy Maxmen in Nature: “After an earthquake tore through Haiti in 2010, killing more than 100,000 people, aid agencies spread across the country to work out where the survivors had fled. But Linus Bengtsson, a graduate student studying global health at the Karolinska Institute in Stockholm, thought he could answer the question from afar. Many Haitians would be using their mobile phones, he reasoned, and those calls would pass through phone towers, which could allow researchers to approximate people’s locations. Bengtsson persuaded Digicel, the biggest phone company in Haiti, to share data from millions of call records from before and after the quake. Digicel replaced the names and phone numbers of callers with random numbers to protect their privacy.

Bengtsson’s idea worked. The analysis wasn’t completed or verified quickly enough to help people in Haiti at the time, but in 2012, he and his collaborators reported that the population of Haiti’s capital, Port-au-Prince, dipped by almost one-quarter soon after the quake, and slowly rose over the next 11 months. That result aligned with an intensive, on-the-ground survey conducted by the United Nations.
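The tower-based method described above can be sketched in miniature. The sketch below is illustrative only: the record layout (caller, tower, day), the anonymisation scheme, and the function names are assumptions for the example, not Digicel’s actual pipeline. It pseudonymises call records, then estimates the relative change in a region’s population by counting distinct subscribers whose most frequent call tower falls in that region.

```python
import random
from collections import defaultdict

def pseudonymize(records):
    """Replace caller identifiers with random codes, keeping the
    same code for repeated calls by the same subscriber (as Digicel
    did before sharing its data)."""
    mapping = {}
    out = []
    for caller, tower, day in records:
        if caller not in mapping:
            mapping[caller] = f"anon-{random.randrange(10**9):09d}"
        out.append((mapping[caller], tower, day))
    return out

def population_change(records, region_of_tower, region, before, after):
    """Estimate the relative population change in `region` between two
    days by counting distinct subscribers whose modal call tower on
    each day lies in that region."""
    def residents(day):
        towers = defaultdict(lambda: defaultdict(int))
        for caller, tower, d in records:
            if d == day:
                towers[caller][tower] += 1
        return {c for c, t in towers.items()
                if region_of_tower[max(t, key=t.get)] == region}
    pre, post = residents(before), residents(after)
    return (len(post) - len(pre)) / len(pre)
```

On a toy dataset where one of four subscribers moves out of a region between two days, `population_change` reports a one-quarter drop, the same kind of estimate the Port-au-Prince study produced at scale.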

Humanitarians and researchers were thrilled. Telecommunications companies scrutinize call-detail records to learn about customers’ locations and phone habits and to improve their services. Researchers suddenly realized that this sort of information might help them to improve lives. Even basic population statistics are murky in low-income countries, where expensive household surveys are infrequent. And because many people there don’t have smartphones, credit cards and other technologies that leave behind a digital trail, the remote-tracking methods used in richer countries are too patchy to be useful.

Since the earthquake, scientists working under the rubric of ‘data for good’ have analysed calls from tens of millions of phone owners in Pakistan, Bangladesh, Kenya and at least two dozen other low- and middle-income nations. Humanitarian groups say that they’ve used the results to deliver aid. And researchers have combined call records with other information to try to predict how infectious diseases travel, and to pinpoint locations of poverty, social isolation, violence and more (see ‘Phone calls for good’)….(More)”.

Does Aid Effectiveness Differ per Political Ideologies?

Paper by Vincent Tawiah, Barnes Evans and Abdulrasheed Zakari: “Despite the extensive empirical literature on aid effectiveness, existing studies have not addressed directly how political ideology affects the use of foreign aid in the recipient country. This study, therefore, uses a unique dataset of 12 democratic countries in Africa to investigate the impact of political ideologies on aid effectiveness. Our results indicate that each political party uses aid differently in pursuit of its political-ideological orientation. Further analyses suggest that rightist capitalist parties are likely to use aid to improve the private sector environment. Leftist socialist parties, on the other hand, use aid effectively on pro-poor projects such as short-term poverty reduction, mass education and health services. Our additional analysis along the lines of colonisation shows that the difference in the use of aid by political parties is much stronger in former French colonies than in former British colonies. The study provides insight into how recipient governments are likely to use foreign aid….(More)”.

Data to the rescue

Podcast by Kenneth Cukier: “Access to the right data can be as valuable in humanitarian crises as water or medical care, but it can also be dangerous. Misused or in the wrong hands, the same information can put already vulnerable people at further risk. Kenneth Cukier hosts this special edition of Babbage examining how humanitarian organisations use data and what they can learn from the profit-making tech industry. This episode was recorded live from Wilton Park, in collaboration with the United Nations OCHA Centre for Humanitarian Data…(More)”.

Data Collaboration for the Common Good: Enabling Trust and Innovation Through Public-Private Partnerships

World Economic Forum Report: “As the digital technologies of the Fourth Industrial Revolution continue to drive change throughout all sectors of the global economy, a unique moment exists to create a more inclusive, innovative and resilient society. Central to this change is the use of data. Data is abundantly available, but if improperly used it will be the source of dangerous and unwelcome results.

When data is shared, linked and combined across sectoral and institutional boundaries, a multiplier effect occurs. Connecting one bit with another unlocks new insights and understandings that often weren’t anticipated. Yet, due to commercial limits and liabilities, the full value of data is often unrealized. This is particularly true when it comes to using data for the common good. While public-private data collaborations represent an unprecedented opportunity to address some of the world’s most urgent and complex challenges, they have generally been small and limited in impact. An entangled set of legal, technical, social, ethical and commercial risks has created an environment where the incentives for innovation have stalled. Additionally, the widening lack of trust among individuals and institutions creates even more uncertainty. After nearly a decade of anticipation about the promise of public-private data collaboration – with relatively few examples of success at global scale – a pivotal moment has arrived to encourage progress and move forward….(More)”

Pitfalls of Aiming to Empower the Bottom from the Top: The Case of Philippine Participatory Budgeting

Paper by Joy Aceron: “… explains why and how a reform program that opened up spaces for participatory budgeting was ultimately unable to result in pro-citizen power shifts that transformed governance. The study reviews the design and implementation of Bottom-Up Budgeting (BuB), the nationwide participatory budgeting (PB) program in the Philippines, which ran from 2012 to 2016 under the Benigno Aquino government. The findings underscore the importance of institutional design to participatory governance reforms. BuB’s goal was to transform local government by providing more space for civil society organizations (CSOs) to co-identify projects with the government and to take part in the budgeting process, but it did not strengthen CSO or grassroots capacity to hold their Local Government Units (LGUs) accountable.

The BuB design had features that delivered positive gains towards citizen empowerment, including: (1) providing equal seats for CSOs in the Local Poverty Reduction Action Team (LPRAT), which is formally mandated to select proposed projects (in contrast to the pre-existing Local Development Councils (LDCs), which have only 25 percent CSO representation); (2) CSOs identified their LPRAT representatives themselves (as opposed to local chief executives choosing CSO representatives, as in the LDCs); and (3) LGUs were mandated to follow participatory requirements to receive additional funding. However, several aspects of the institutional design shifted power from local governments to the central government. This had a “centralizing effect”…

This study argues that because of these design problems, BuB fell short in achieving its main political reform agenda of empowering the grassroots—particularly in enabling downward accountability that could have enabled lasting pro-citizen power shifts. It did not empower local civil society and citizens to become a countervailing force vis-à-vis local politicians in fiscal governance. BuB is a case of a reform that provided a procedural mechanism for civil society input into national agency decisions but was unable to improve government responsiveness. It provided civil society with ‘voice’, but was constrained in enabling ‘teeth’. Jonathan Fox (2014) refers to “voice” as citizen inputs, feedback and action, while “teeth” refer to the capacity of the state to respond to voice.

Finally, the paper echoes the results of other studies which find that PB programs become successful when complemented by other institutional and state democratic capacity-building reforms and when they are part of a broader progressive change agenda. The BuB experience suggests that to bolster citizen oversight, it is essential to invest sufficient support and resources in citizen empowerment and in creating an enabling environment for citizen oversight….(More)”.

Opportunities and Challenges of Emerging Technologies for the Refugee System

Research Paper by Roya Pakzad: “Efforts are being made to use information and communications technologies (ICTs) to improve accountability in providing refugee aid. However, there remains a pressing need for increased accountability and transparency when designing and deploying humanitarian technologies. This paper outlines the challenges and opportunities of emerging technologies, such as machine learning and blockchain, in the refugee system.

The paper concludes by recommending the creation of quantifiable metrics for sharing information across both public and private initiatives; the creation of the equivalent of a “Hippocratic oath” for technologists working in the humanitarian field; the development of predictive early-warning systems for human rights abuses; and greater accountability among funders and technologists to ensure the sustainability and real-world value of humanitarian apps and other digital platforms….(More)”

The State of Open Data

Open Access Book edited by Tim Davies, Stephen B. Walker, Mor Rubinstein and Fernando Perini: “It’s been ten years since open data first broke onto the global stage. Over the past decade, thousands of programmes and projects around the world have worked to open data and use it to address a myriad of social and economic challenges. Meanwhile, issues related to data rights and privacy have moved to the centre of public and political discourse. As the open data movement enters a new phase in its evolution, shifting to target real-world problems and embed open data thinking into other existing or emerging communities of practice, big questions still remain. How will open data initiatives respond to new concerns about privacy, inclusion, and artificial intelligence? And what can we learn from the last decade in order to deliver impact where it is most needed? 

The State of Open Data brings together over 60 authors from around the world to address these questions and to take stock of the real progress made to date across sectors and around the world, uncovering the issues that will shape the future of open data in the years to come….(More)”.

The Third Pillar: How Markets and the State Leave the Community Behind

Book by Raghuram Rajan: “….In The Third Pillar he offers up a magnificent big-picture framework for understanding how these three forces – the state, markets, and our communities – interact, why things begin to break down, and how we can find our way back to a more secure and stable plane. 

The “third pillar” of the title is the community we live in. Economists all too often understand their field as the relationship between markets and the state, and they leave squishy social issues for other people. That’s not just myopic, Rajan argues; it’s dangerous. All economics is actually socioeconomics – all markets are embedded in a web of human relations, values and norms. As he shows, throughout history, technological phase shifts have ripped the market out of those old webs and led to violent backlashes, and to what we now call populism. Eventually, a new equilibrium is reached, but it can be ugly and messy, especially if done wrong. 

Right now, we’re doing it wrong. As markets scale up, the state scales up with them, concentrating economic and political power in flourishing central hubs and leaving the periphery to decompose, figuratively and even literally. Instead, Rajan offers a way to rethink the relationship between the market and civil society and argues for a return to strengthening and empowering local communities as an antidote to growing despair and unrest. Rajan is not a doctrinaire conservative, so his ultimate argument, that decision-making has to be devolved to the grass roots or our democracy will continue to wither, is sure to be provocative. But even setting aside its solutions, The Third Pillar is a masterpiece of explication, a book that will be a classic of its kind for its offering of a wise, authoritative and humane explanation of the forces that have wrought such a sea change in our lives….(More)”.

AI & Global Governance: Robots Will Not Only Wage Future Wars but also Future Peace

Daanish Masood & Martin Waehlisch at the United Nations University: “At the United Nations, we have been exploring completely different scenarios for AI: its potential to be used for the noble purposes of peace and security. This could revolutionize how we prevent and resolve conflicts globally.

Two of the most promising areas are Machine Learning and Natural Language Processing. Machine Learning involves computer algorithms detecting patterns from data to learn how to make predictions and recommendations. Natural Language Processing involves computers learning to understand human languages.

At the UN Secretariat, our chief concern is with how these emerging technologies can be deployed for the good of humanity to de-escalate violence and increase international stability.

This endeavor has admirable precedent. During the Cold War, computer scientists used multilayered simulations to predict the scale and potential outcome of the arms race between the East and the West.

Since then, governments and international agencies have increasingly used computational models and advanced Machine Learning to try to understand recurrent conflict patterns and forecast moments of state fragility.

But two things have transformed the scope for progress in this field.

The first is the sheer volume of data now available from what people say and do online. The second is the game-changing growth in computational capacity that allows us to crunch previously inconceivable quantities of data with relative speed and ease.

So how can this help the United Nations build peace? Three ways come to mind.

Firstly, overcoming cultural and language barriers. By teaching computers to understand human language and the nuances of dialects, not only can we better link up what people write on social media to local contexts of conflict, we can also more methodically follow what people say on radio and TV. As part of the UN’s early warning efforts, this can help us detect hate speech in a place where the potential for conflict is high. This is crucial because the UN often works in countries where internet coverage is low, and where the spoken languages may not be well understood by many of its international staff.

Natural Language Processing algorithms can help to track and improve understanding of local debates, which might well be blind spots for the international community. If we combine such methods with Machine Learning chatbots, the UN could conduct large-scale digital focus groups with thousands of participants in real time, enabling different demographic segments in a country to voice their views on, say, a proposed peace deal – instantly testing public support, and indicating the chances of sustainability.
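As a minimal illustration of the first idea, a lexicon lookup is the simplest possible flagger for monitored media transcripts. The function name and the sample lexicon below are hypothetical; a production early-warning pipeline would rely on trained multilingual classifiers and local-context review rather than keyword matching.

```python
def flag_messages(messages, lexicon):
    """Return (message, matched terms) pairs for messages containing
    any term from a locally curated watch-list. This is a toy baseline:
    real systems must handle dialects, morphology and coded speech."""
    flagged = []
    for msg in messages:
        tokens = set(msg.lower().split())
        hits = tokens & lexicon
        if hits:
            flagged.append((msg, sorted(hits)))
    return flagged
```

Even this crude baseline shows why local language knowledge matters: the watch-list itself has to be built with speakers who understand which terms carry dehumanising weight in a given context.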

Secondly, anticipating the deeper drivers of conflict. We could combine new imaging techniques – whether satellites or drones – with automation. For instance, many parts of the world are experiencing severe groundwater withdrawal and water aquifer depletion. Water scarcity, in turn, drives conflicts and undermines stability in post-conflict environments, where violence around water access becomes more likely, along with large movements of people leaving newly arid areas.

One of the best predictors of water depletion is land subsidence or sinking, which can be measured by satellite and drone imagery. By combining these imaging techniques with Machine Learning, the UN can work in partnership with governments and local communities to anticipate future water conflicts and begin working proactively to reduce their likelihood.
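The second idea can be sketched as a toy model: fit a linear trend to each site’s subsidence measurements and flag sites sinking faster than a threshold. The site names, units and threshold below are illustrative assumptions, not an operational early-warning rule.

```python
def subsidence_rate(series):
    """Least-squares slope of a subsidence time series (e.g. mm of
    sinking per observation period, from satellite or drone imagery)."""
    n = len(series)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(series) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, series))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def at_risk(sites, threshold_mm=10.0):
    """Flag sites whose estimated sinking rate exceeds the threshold,
    used here as a (hypothetical) proxy for severe groundwater
    withdrawal and future water stress."""
    return [name for name, series in sites.items()
            if subsidence_rate(series) > threshold_mm]
```

In practice the Machine Learning step would replace the fixed threshold with models trained on many covariates, but the shape of the pipeline is the same: remote measurements in, ranked risk out, with governments and local communities deciding how to act on the ranking.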

Thirdly, advancing decision making. In the work of peace and security, it is surprising how many consequential decisions are still made solely on the basis of intuition.

Yet complex decisions often need to navigate conflicting goals and undiscovered options, against a landscape of limited information and political preference. This is where we can use Deep Learning – where a network can absorb huge amounts of public data, test it against the real-world examples on which it is trained, and apply probabilistic modeling. This mathematical approach can help us to generate models of our uncertain, dynamic world with limited data.

With better data, we can eventually make better predictions to guide complex decisions. Future senior peace envoys charged with mediating a conflict would benefit from such advances to stress-test elements of a peace agreement. Of course, human decision-making will remain crucial, but it would be informed by more robust, evidence-driven analytical tools….(More)”.