OpenStreetMap in Israel and Palestine – ‘Game changer’ or reproducer of contested cartographies?


Christian Bittner in Political Geography: “In Israel and Palestine, map-making practices were always entangled with contradictory spatial identities and imbalanced power resources. Although an Israeli narrative has largely dominated the ‘cartographic battlefield’, the latest chapter of this story has not been written yet: collaborative forms of web 2.0 cartographies have restructured power relations in mapping practices and challenged traditional monopolies on map and spatial data production. Thus, we can expect web 2.0 cartographies to be a ‘game changer’ for cartography in Palestine and Israel.

In this paper, I review this assumption with the popular example of OpenStreetMap (OSM). Following a mixed methods approach, I comparatively analyze the genesis of OSM in Israel and Palestine. Although nationalist motives do not play a significant role on either side, it turns out that the project is dominated by Israeli and international mappers, whereas Palestinians have hardly contributed to OSM. As a result, social fragmentations and imbalances between Israel and Palestine are largely reproduced through OSM data. Discussing the low involvement of Palestinians, I argue that OSM’s ground truth paradigm might be a watershed for participation. Presumably, the project’s data are less meaningful in some local contexts than in others. Moreover, the seemingly apolitical approach to map only ‘facts on the ground’ reaffirms present spatio-social order and thus the power relations behind it. Within a Palestinian narrative, however, many aspects of the factual material space might appear not as neutral physical objects but as results of suppression, in which case, any ‘accurate’ spatial representation, such as OSM, becomes objectionable….(More)”
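
One way to make the comparison concrete: a minimal Python sketch, using the public Overpass API, that counts raw OSM nodes in two illustrative bounding boxes. The coordinates, and the use of node counts as a proxy for mapping activity, are assumptions for illustration only, not the paper’s actual method.

    import requests

    OVERPASS_URL = "https://overpass-api.de/api/interpreter"

    def count_nodes(bbox):
        """Count OSM nodes in a (south,west,north,east) bounding box via Overpass."""
        query = f"[out:json][timeout:120];node({bbox});out count;"
        resp = requests.post(OVERPASS_URL, data={"data": query})
        resp.raise_for_status()
        # 'out count;' returns a single element whose tags carry the totals as strings
        return int(resp.json()["elements"][0]["tags"]["total"])

    # Hypothetical bounding boxes, chosen only to illustrate the comparison
    for label, bbox in [("Tel Aviv area", "32.03,34.74,32.13,34.82"),
                        ("Ramallah area", "31.87,35.17,31.95,35.25")]:
        print(label, count_nodes(bbox))

Differences in such counts say nothing about why contributions differ; they only quantify the imbalance the paper documents.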

Using open government for climate action


Elizabeth Moses at Eco-Business: “Countries made many national climate commitments as part of the Paris Agreement on climate change, which entered into force earlier this month. Now comes the hard part of implementing those commitments. The public can serve an invaluable watchdog role, holding governments accountable for following through on their targets and making sure climate action happens in a way that’s fair and inclusive.

But first, the climate and open government communities will need to join forces….

Here are four areas where these communities can lean in together to ensure governments follow through on effective climate action:

1) Expand access to climate data and information.

Open government and climate NGOs, together with local communities, can expand the use of traditional transparency tools and processes, such as Freedom of Information (FOI) laws, transparent budgeting, open data policies and public procurement, to enhance open information on climate mitigation, adaptation and finance.

For example, Transparencia Mexicana used Mexico’s Freedom of Information Law to collect data to map climate finance actors and the flow of finance in the country. This allows them to make specific recommendations on how to safeguard climate funds against corruption and ensure the money translates into real action on the ground….

2) Promote inclusive and participatory climate policy development.

Civil society and community groups already play a crucial role in advocating for climate action and improving climate governance at the national and local levels, especially when it comes to safeguarding poor and vulnerable people, who often lack political voice….

3) Take legal action for stronger accountability.

Accountability at a national level can only be achieved if grievance mechanisms are in place to address a lack of transparency or public participation, or to redress the impact of projects and policies on individuals and communities.

Civil society groups and individuals can use legal actions like climate litigation, petitions, administrative policy challenges and court cases at the national, regional or international levels to hold governments and businesses accountable for failing to effectively act on climate change….

4) Create new spaces for advocacy.

Bringing the climate and open government movements together allows civil society to tap new forums for securing momentum around climate policy implementation. For example, many civil society NGOs are highlighting the important connections between a strong Governance Goal 16 under the 2030 Agenda for Sustainable Development, and strong water quality and climate change policies….(More)”

Data can become Nigeria’s new ‘black gold’


Labi Ogunbiyi in the Financial Times: “In the early 2000s I decided to leave my job heading the African project finance team in a global law firm to become an investor. My experience of managing big telecoms, infrastructure and energy transactions — and, regrettably, often disputes — involving governments, project sponsors, investors, big contractors, multilateral and development agencies had left me dissatisfied. Much of the ownership of the assets being fought over remained in the hands of international conglomerates. Africa’s lack of capacity to raise the capital to own them directly — and to develop the technical skills necessary for growth — was a clear weakness…

Yet, nearly 15 years after the domestic oil and gas sector began to evolve, oil is no longer the country’s only “black gold”. If I compare how Nigeria’s energy sector has evolved since the early 2000s with how its ICT and broader technology industry has emerged, and the opportunities that both represent for the future, the contrast is stark. Nigeria, and the rest of the continent, has been enjoying a technology revolution, and the opportunity it represents has the potential to affect every sector of the economy. According to Africa Infotech Consulting, Nigeria’s mobile penetration rate — a measure of the number of devices relative to population — is more than 90 per cent, less than 20 years after the first mobile network appeared on the continent. Recent reports suggest more than 10 per cent of Nigerians have a smartphone. The availability and cost of fast data have improved dramatically….(More)”

New Data Portal to analyze governance in Africa


Africa’s health won’t improve without reliable data and collaboration


From The Conversation: “…Africa has a data problem. This is true in many sectors. When it comes to health there’s both a lack of basic population data about disease and an absence of information about what impact, if any, interventions involving social determinants of health – housing, nutrition and the like – are having.

Simply put, researchers often don’t know who is sick or what people are being exposed to that, if addressed, could prevent disease and improve health. They cannot say if poor sanitation is the biggest culprit, or if substandard housing in a particular region is to blame. They don’t have the data that explains which populations are most vulnerable.

These data are required to inform development of innovative interventions that apply a “Health in All Policies” approach to address social determinants of health and improve health equity.

To address this, health data need to be integrated with social determinant data about areas like food, housing, and physical activity or mobility. Even where population data are available, they are not always reliable. There’s often an issue of compatibility: different sectors collect different kinds of information using varying methodologies.

Different sectors also use different indicators to collect information on the same social determinant of health. This makes data integration challenging.
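
As a hedged illustration of the problem (all names and figures below are invented), a few lines of Python with pandas show the harmonisation step: two sectors report on the same determinant under different indicator names, and a small mapping table aligns them before merging with health data on a shared geographic key.

    import pandas as pd

    # Hypothetical extracts: two sectors describing the same determinant differently
    housing = pd.DataFrame({"district": ["A", "B"],
                            "pct_informal_dwellings": [42.0, 17.0]})
    health = pd.DataFrame({"district": ["A", "B"],
                           "diarrhoea_cases_per_1000": [31.5, 12.2]})

    # Harmonise sector-specific indicator names into a shared schema
    INDICATOR_MAP = {"pct_informal_dwellings": "housing_pct_substandard"}
    housing = housing.rename(columns=INDICATOR_MAP)

    # Integrate health and social-determinant data on a common geographic key
    merged = health.merge(housing, on="district")
    print(merged)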

Without clear, focused, reliable data it’s difficult to understand what a society’s problems are and what specific solutions – which may lie outside the health sector – might be suitable for that unique context.

Scaling up innovations

Some remarkable work is being done to tackle Africa’s health problems. This ranges from technological innovations to harnessing indigenous knowledge for change. Both approaches are vital. But it’s hard for these to be scaled up either in terms of numbers or reach.

This boils down to a lack of funding or a lack of access to funding. Too many potentially excellent projects remain stuck at the pilot phase, which has limited value for ordinary people….

Governments need to develop health equity surveillance systems to overcome the current lack of data. It’s also crucial that governments integrate and monitor health and social determinants of health indicators in one central system. This would provide a better understanding of health inequity in a given context.

For this to happen, governments must work with public and private sector stakeholders and nongovernmental organisations – not just in health, but beyond it so that social determinants of health can be better measured and captured.

The data that already exist at sub-national, national, regional and continental level mustn’t just be brushed aside. They should be archived and digitised so that they aren’t lost.

Researchers have a role to play here. They have to harmonise and be innovative in the methodologies they use for data collection. If researchers can work together across the breadth of sectors and disciplines that influence health, important information won’t slip through the cracks.

When it comes to scaling up innovation, governments need to step up to the plate. It’s crucial that they support successful health innovations, whether these are rooted in indigenous knowledge or are new technologies. And since – as we’ve already shown – health issues aren’t the exclusive preserve of the health sector, governments should look to different sectors and innovative partnerships to generate support and funding….(More)”

Towards a DataPlace: mapping data in a game to encourage participatory design in smart cities


Paper by Barker, Matthew; Wolff, Annika and van der Linden, Janet: “The smart city has been envisioned as a place where citizens can participate in city decision making and in the design of city services. As a key part of this vision, pervasive digital technology and open data legislation are being framed as vehicles for citizens to access rich data about their city. It has become apparent, though, that simply providing access to these resources does not automatically lead to the development of data-driven applications. If we are going to engage more of the citizenry in smart city design and raise productivity, we are going to need to make the data itself more accessible, engaging and intelligible for non-experts. This ongoing study is exploring one method for doing so. As part of the MK:Smart City project team, we are developing a tangible data look-up interface that acts as an alternative to the conventional DataBase. This interface, or DataPlace as we are calling it, takes the form of a map on which the user places sensors to physically capture real-time data, simulating the physical act of capturing data in the real world. We discuss the design of the DataPlace prototype under development and the planned user trials to test our hypothesis: that a DataPlace can make handling data more accessible, intelligible and engaging for non-experts than conventional interface types….(More)”
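
The paper describes an interaction rather than an implementation, but the core mechanic (placing a token on a map position triggers a look-up of simulated real-time data for that location) can be sketched. The fragment below is a hypothetical Python illustration; the grid cells, feed names and values are assumptions, not the MK:Smart design.

    import random

    # Hypothetical simulated feeds keyed by map grid cell (standing in for sensor placement)
    FEEDS = {
        (3, 5): lambda: {"feed": "air_quality_no2_ugm3", "value": round(random.uniform(10, 60), 1)},
        (7, 2): lambda: {"feed": "footfall_per_hour", "value": random.randint(50, 400)},
    }

    def place_sensor(cell):
        """Simulate physically placing a sensor token on a map cell."""
        feed = FEEDS.get(cell)
        if feed is None:
            return {"feed": None, "value": None, "note": "no data captured here"}
        return feed()

    print(place_sensor((3, 5)))  # a live-looking air-quality reading
    print(place_sensor((0, 0)))  # an empty cell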

New UN resolution on the right to privacy in the digital age: crucial and timely


Deborah Brown at the Internet Policy Review: “The rapid pace of technological development enables individuals all over the world to use new information and communications technologies (ICTs) to improve their lives. At the same time, technology is enhancing the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy. In this context, the UN General Assembly’s Third Committee adoption on 21 November of a new resolution on the right to privacy in the digital age comes as timely and crucial for protecting the right to privacy in light of new challenges.

As with previous UN resolutions on this topic, the resolution adopted on 21 November 2016 recognises the importance of respecting international commitments in relation to the right to privacy. It underscores that any legitimate concerns states may have with regard to their security can and should be addressed in a manner consistent with obligations under international human rights law.

Recognising that more and more personal data is being collected, processed, and shared, this year’s resolution expresses concern about the sale or multiple re-sales of personal data, which often happens without the individual’s free, explicit and informed consent. It calls for the strengthening of prevention of and protection against such violations, and calls on states to develop preventative measures, sanctions, and remedies.

This year, the resolution more explicitly acknowledges the role of the private sector. It calls on states to put in place (or maintain) effective sanctions and remedies to prevent the private sector from committing violations and abuses of the right to privacy. This is in line with states’ obligations under the UN Guiding Principles on Business and Human Rights, which require states to protect against abuses by businesses within their territories or jurisdictions. The resolution specifically calls on states to refrain from requiring companies to take steps that interfere with the right to privacy in an arbitrary or unlawful way. With respect to companies, it recalls the responsibility of the private sector to respect human rights, and specifically calls on them to inform users about company policies that may impact their right to privacy….(More)”

New Institute Pushes the Boundaries of Big Data


Press Release: “Each year thousands of genomes are sequenced, millions of neuronal activity traces are recorded, and light from hundreds of millions of galaxies is captured by our newest telescopes, all creating datasets of staggering size. These complex datasets are then stored for analysis.

Ongoing analysis of these information streams has illuminated a problem, however: Scientists’ standard methodologies are inadequate to the task of analyzing massive quantities of data. The development of new methods and software to learn from data and to model — at sufficient resolution — the complex processes they reflect is now a pressing concern in the scientific community.

To address these challenges, the Simons Foundation has launched a substantial new internal research group called the Flatiron Institute (FI). The FI is the first multidisciplinary institute focused entirely on computation. It is also the first center of its kind to be wholly supported by private philanthropy, providing a permanent home for up to 250 scientists and collaborating expert programmers all working together to create, deploy and support new state-of-the-art computational methods. Few existing institutions support the combination of scientists and programmers, instead leaving programming to relatively impermanent graduate students and postdoctoral fellows, and none have done so at the scale of the Flatiron Institute or with such a broad scope, at a single location.

The institute will hold conferences and meetings and serve as a focal point for computational science around the world….(More)”.

Governance and Service Delivery: Practical Applications of Social Accountability Across Sectors


Book edited by Derick W. Brinkerhoff, Jana C. Hertz, and Anna Wetterberg: “…Historically, donors and academics have sought to clarify what makes sectoral projects effective and sustainable contributors to development. Among the key factors identified have been (1) the role and capabilities of the state and (2) the relationships between the state and citizens, phenomena often lumped together under the broad rubric of “governance.” Given the importance of a functioning state and positive interactions with citizens, donors have treated governance as a sector in its own right, with projects ranging from public sector management reform, to civil society strengthening, to democratization (Brinkerhoff, 2008). The link between governance and sectoral service delivery was highlighted in the World Bank’s 2004 World Development Report, which focused on accountability structures and processes (World Bank, 2004).

Since then, sectoral specialists’ awareness that governance interventions can contribute to service delivery improvements has increased substantially, and there is growing recognition that both technical and governance elements are necessary facets of strengthening public services. However, expanded awareness has not reliably translated into effective integration of governance into sectoral programs and projects in, for example, health, education, water, agriculture, or community development. The bureaucratic realities of donor programming offer a partial explanation…. Beyond bureaucratic barriers, though, lie ongoing gaps in practical knowledge of how best to combine attention to governance with sector-specific technical investments. What interventions make sense, and what results can reasonably be expected? What conditions support or limit both improved governance and better service delivery? How can citizens interact with public officials and service providers to express their needs, improve services, and increase responsiveness? Various models and compilations of best practices have been developed, but debates remain, and answers to these questions are far from settled. This volume investigates these questions and contributes to building understanding that will enhance both knowledge and practice. In this book, we examine six recent projects, funded mostly by the United States Agency for International Development and implemented by RTI International, that pursued several different paths to engaging citizens, public officials, and service providers on issues related to accountability and sectoral services…(More)”

What’s wrong with big data?


James Bridle in the New Humanist: “In a 2008 article in Wired magazine entitled “The End of Theory”, Chris Anderson argued that the vast amounts of data now available to researchers made the traditional scientific process obsolete. No longer would they need to build models of the world and test them against sampled data. Instead, the complexities of huge and totalising datasets would be processed by immense computing clusters to produce truth itself: “With enough data, the numbers speak for themselves.” As an example, Anderson cited Google’s translation algorithms which, with no knowledge of the underlying structures of languages, were capable of inferring the relationship between them using extensive corpora of translated texts. He extended this approach to genomics, neurology and physics, where scientists are increasingly turning to massive computation to make sense of the volumes of information they have gathered about complex systems. In the age of big data, he argued, “Correlation is enough. We can stop looking for models.”
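
Anderson’s translation example can be made concrete with a toy sketch: given sentence-aligned text, a purely statistical procedure can guess word correspondences from co-occurrence alone, with no grammar or dictionary. The corpus and the Dice-coefficient scoring below are minimal illustrative assumptions, not Google’s algorithm.

    from collections import Counter

    # Tiny sentence-aligned parallel corpus (English / Spanish), purely illustrative
    corpus = [
        ("the dog runs", "el perro corre"),
        ("the cat runs", "el gato corre"),
        ("the dog sleeps", "el perro duerme"),
    ]

    # Count word frequencies and how often (source, target) pairs share a sentence
    src_counts, tgt_counts, pair_counts = Counter(), Counter(), Counter()
    for en, es in corpus:
        src_words, tgt_words = set(en.split()), set(es.split())
        src_counts.update(src_words)
        tgt_counts.update(tgt_words)
        pair_counts.update((e, s) for e in src_words for s in tgt_words)

    def best_translation(word):
        """Rank target words by Dice association with the source word."""
        scores = {s: 2 * pair_counts[(word, s)] / (src_counts[word] + tgt_counts[s])
                  for s in tgt_counts}
        return max(scores, key=scores.get)

    print(best_translation("dog"))   # -> 'perro', inferred from co-occurrence alone
    print(best_translation("runs"))  # -> 'corre'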

This belief in the power of data, of technology untrammelled by petty human worldviews, is the practical cousin of more metaphysical assertions. A belief in the unquestionability of data leads directly to a belief in the truth of data-derived assertions. And if data contains truth, then it will, without moral intervention, produce better outcomes. Speaking at Google’s private London Zeitgeist conference in 2013, Eric Schmidt, Google Chairman, asserted that “if they had had cellphones in Rwanda in 1994, the genocide would not have happened.” Schmidt’s claim was that technological visibility – the rendering of events and actions legible to everyone – would change the character of those actions. Not only is this statement historically inaccurate (there was plenty of evidence available of what was occurring during the genocide from UN officials, US satellite photographs and other sources), it’s also demonstrably untrue. Analysis of unrest in Kenya in 2007, when over 1,000 people were killed in ethnic conflicts, showed that mobile phones not only spread but accelerated the violence. But you don’t need to look to such extreme examples to see how a belief in technological determinism underlies much of our thinking and reasoning about the world.

“Big data” is not merely a business buzzword, but a way of seeing the world. Driven by technology, markets and politics, it has come to determine much of our thinking, but it is flawed and dangerous. It runs counter to our actual findings when we employ such technologies honestly and with the full understanding of their workings and capabilities. This over-reliance on data, which I call “quantified thinking”, has come to undermine our ability to reason meaningfully about the world, and its effects can be seen across multiple domains.

The assertion is hardly new. Writing in the Dialectic of Enlightenment in 1947, Theodor Adorno and Max Horkheimer decried “the present triumph of the factual mentality” – the predecessor to quantified thinking – and succinctly analysed the big data fallacy, set out by Anderson above. “It does not work by images or concepts, by the fortunate insights, but refers to method, the exploitation of others’ work, and capital … What men want to learn from nature is how to use it in order wholly to dominate it and other men. That is the only aim.” What is different in our own time is that we have built a world-spanning network of communication and computation to test this assertion. While it occasionally engenders entirely new forms of behaviour and interaction, the network most often shows to us with startling clarity the relationships and tendencies which have been latent or occluded until now. In the face of the increased standardisation of knowledge, it becomes harder and harder to argue against quantified thinking, because the advances of technology have been conjoined with the scientific method and social progress. But as I hope to show, technology ultimately reveals its limitations….

“Eroom’s law” – Moore’s law backwards – was recently formulated to describe a problem in pharmacology. Drug discovery has been getting more expensive. Since the 1950s the number of drugs approved for use in human patients per billion US dollars spent on research and development has halved every nine years. This problem has long perplexed researchers. According to the principles of technological growth, the trend should be in the opposite direction. In a 2012 paper in Nature entitled “Diagnosing the decline in pharmaceutical R&D efficiency” the authors propose and investigate several possible causes for this. They begin with social and physical influences, such as increased regulation, increased expectations and the exhaustion of easy targets (the “low hanging fruit” problem). Each of these is – with qualifications – disposed of, leaving open the question of the discovery process itself….(More)”
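
The halving claim implies a simple exponential decay, N(t) = N(1950) * 2^(-(t-1950)/9). Only the nine-year half-life comes from the text; the baseline in this short Python illustration is invented.

    # Eroom's law: drugs approved per $1bn of R&D halve every nine years.
    # The 1950 baseline is hypothetical; only the halving rate comes from the text.
    N_1950 = 100.0  # assumed approvals per $1bn in 1950 (illustrative)

    def approvals_per_billion(year, baseline=N_1950, half_life_years=9):
        return baseline * 2 ** (-(year - 1950) / half_life_years)

    for year in (1950, 1980, 2010):
        print(year, round(approvals_per_billion(year), 2))

On those assumptions, efficiency falls roughly a hundredfold between 1950 and 2010, which is the scale of decline the paper set out to explain.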