An Introduction to System Mapping


Joelle Cook at FSG: “Much has been written recently about the importance of using a system lens, or focusing on system change, to make real progress against some of society’s toughest challenges. But what does that really mean? The following definition of system change resonated with us, from NPC’s 2015 guide:

“Systems change is an intentional process designed to alter the status quo by shifting the function or structure of an identified system with purposeful interventions. It is a journey which can require a radical change in people’s attitudes as well as in the ways people work. Systems change aims to bring about lasting change by altering underlying structures and supporting mechanisms which make the system operate in a particular way. These can include policies, routines, relationships, resources, power structures and values.”

However, to change the system, you need to first understand the system, and mapping is a great way to do that. A “system,” as described by Julia Coffman in her 2007 framework for evaluating system change, is “a group of interacting, interrelated, and interdependent components that form a complex and unified whole.” A system’s overall purpose or goal is achieved through the actions and interactions of its components.

As you can imagine, there are a number of different ways you might approach mapping the system to represent system elements and connections. For example, you might create:

  • Actor maps, which show which individuals and/or organizations are key players in the space and how they are connected
  • Mind maps, which highlight various trends in the external environment that influence the issue at hand
  • Issue maps, which lay out the political, social, or economic issues affecting a given geography or constituency (often used by advocacy groups)
  • Causal-loop diagrams, which focus on explicating the feedback loops (positive and negative) that lead to system behavior or functioning (one way to represent such maps in code is sketched below)
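For readers who want to prototype, a system map is at bottom a graph: nodes for elements, edges for connections. Below is a minimal sketch of that idea using the networkx Python library; the actors, relationships, and feedback loop are invented purely for illustration and are not drawn from the articles above.

```python
# Minimal sketch: an actor map and a causal loop as one directed graph.
# All nodes and edge attributes are invented examples.
import networkx as nx

system = nx.DiGraph()

# Actor map: nodes are key players, edges describe their connections.
system.add_edge("Funder", "NGO", relation="grants")
system.add_edge("NGO", "Community", relation="delivers services")

# Causal loop: signed edges mark reinforcing (+) or balancing (-) links.
system.add_edge("Awareness", "Participation", sign="+")
system.add_edge("Participation", "Awareness", sign="+")  # a reinforcing loop

for u, v, data in system.edges(data=True):
    print(f"{u} -> {v} {data}")
```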

For more information, see a blog from Innovation Network on systems mapping, Jeff Wasbes’ blog on causal loop diagrams, and an example from the Hewlett Foundation’s Madison Initiative….(More)”

Open Data as Open Educational Resources: Case studies of emerging practice


Book edited by Javiera Atenas and Leo Havemann: “…is the outcome of a collective effort that has its origins in the 5th Open Knowledge Open Education Working Group call, in which the idea of using Open Data in schools was mentioned. It occurred to us that Open Data and open educational resources seemed almost to exist in separate open worlds.

We decided to seek out evidence in the use of open data as OER, initially by conducting a bibliographical search. As we could not find published evidence, we decided to ask educators if they were, in fact, using open data in this way, and wrote a post for this blog (with Ernesto Priego) explaining our perspective, called The 21st Century’s Raw Material: Using Open Data as Open Educational Resources. We ended the post with a link to an exploratory survey, the results of which indicated a need for more awareness of the existence and potential value of Open Data amongst educators…..

…the case studies themselves. They have been provided by scholars and practitioners from different disciplines and countries, and they reflect different approaches to the use of open data. The first case study presents an approach to educating both teachers and students in the use of open data for civil monitoring via Scuola di OpenCoesione in Italy, and has been written by Chiara Ciociola and Luigi Reggi. The second case, by Tim Coughlan from the Open University, UK, showcases practical applications in the use of local and contextualised open data for the development of apps. The third case, written by Katie Shamash, Juan Pablo Alperin & Alessandra Bordini from Simon Fraser University, Canada, demonstrates how publishing students can engage, through data analysis, in very current debates around scholarly communications and be encouraged to publish their own findings. The fourth case, by Alan Dix from Talis and University of Birmingham, UK, and Geoffrey Ellis from University of Konstanz, Germany, is unique because the data discussed in this case is self-produced, indeed ‘quantified self’ data, which was used with students as material for class discussion and, separately, as source data for another student’s dissertation project. Finally, the fifth case, presented by Virginia Power from University of the West of England, UK, examines strategies to develop data and statistical literacies in future librarians and knowledge managers, aiming to support and extend their theoretical understanding of the concept of the ‘knowledge society’ through the use of Open Data….(More)

The book can be downloaded here: Open Data as Open Educational Resources.

Role of Citizens in India’s Smart Cities Challenge


Florence Engasser and Tom Saunders at the World Policy Blog: “India faces a wide range of urban challenges — from serious air pollution and poor local governance, to badly planned cities and a lack of decent housing. India’s Smart Cities Challenge, which has now selected 98 of the 100 cities that will receive funding, could go a long way in addressing these issues.

According to Prime Minister Narendra Modi, there are five key instruments that make a “smart” city: the use of clean technologies, the use of information and communications technology (ICT), private sector involvement, citizen participation and smart governance. There are good examples of new practices for each of these pillars.

For example, New Delhi recently launched a program to replace streetlights with energy efficient LEDs. The Digital India program is designed to upgrade the country’s IT infrastructure and includes plans to build “broadband highways” across the country. As for private sector participation, the Indian government is trying to encourage it by listing sectors and opportunities for public-private partnerships.

Citizen participation is one of Modi’s five key instruments, but it is an area in which smart city pilots around the world have tended to perform least well. While people are the implied beneficiaries of programs that aim to improve efficiency and reduce waste, they are rarely given a chance to participate in the design or delivery of smart city projects, which are usually implemented and managed by experts who have only a vague idea of the challenges that local communities face.

Citizen Participation

Engaging citizens is especially important in an Indian context because there have already been several striking examples of failed urban redevelopments that have blatantly lacked any type of community consultation or participation….

In practice, how can Indian cities engage residents in their smart city projects?

There are many tools available to policymakers — from traditional community engagement activities such as community meetings, to websites like Mygov.in that ask for feedback on policies. Now, there are a number of reasons to think smartphones could be an important tool to help improve collaboration between residents and city governments in Indian cities.

First, while only around 10 percent of Indians currently own a smartphone, this is predicted to rise to around half by 2020, and will be much higher in urban areas. Key drivers of this are local manufacturing giants like Micromax, which have revolutionized low-cost technology in India, with smartphones costing as little as $30 (compared to around $800 for the newest iPhone).

Second, smartphone apps give city governments the potential to interact directly with citizens to make the most of what they know and feel about their communities. This can happen passively, as with the Waze Connected Citizens program, which shares user location data with city governments to help improve transport planning. It can also be more active, as with FixMyStreet, which allows people to report maintenance issues like potholes to their city government.

Third, smartphones are one of the main ways for people to access social media, and researchers are now developing a range of new and innovative solutions to address urban challenges using these platforms. This includes Petajakarta, which creates crowd-sourced maps of flooding in Jakarta by aggregating tweets that mention the word ‘flood.’

Made in India

Considering some of the above trends, it is interesting to think about the role smartphones could play in the governance of Indian cities and in better engaging communities. India is far from being behind in the field, and there are already a few really good examples of innovative smartphone applications made in India.

Swachh Bharat Abhiyan (translated as Clean India Initiative) is a campaign launched by Modi in October 2014, covering over 4,000 towns all over the country, with the aim of cleaning India’s streets. The Clean India mobile application, launched at the end of 2014 to coincide with Modi’s initiative, was developed by Mahek Shah and allows users to take, geo-locate, and timestamp pictures reporting streets that need cleaning or problems that need fixing by the local authorities.

Similar to FixMyStreet, users are able to tag their reports with keywords to categorize problems. Today, Clean India has been downloaded over 12,000 times and has 5,000 active users. Although still at a very early stage, Clean India has great potential to facilitate the complaint and reporting process, empowering people to become the eyes and ears on the ground for municipalities, which are often completely unaware of the issues that matter to residents.

In Bangalore, an initiative by the MOD Institute, a local nongovernmental organization, enabled residents to come together, online and offline, to create a community vision for the redevelopment of Shanthinagar, a neighborhood of the city. The project, Next Bengaluru, used new technologies to engage local residents in urban planning and tap into their knowledge of the area to promote a vision matching their real needs.

The initiative was very successful. In just three months, between December 2014 and March 2015, over 1,200 neighbors and residents visited the on-site community space, and the team crowd-sourced more than 600 ideas for redevelopment and planning both on-site and through the Next Bangalore website.

The MOD Institute now intends to work with local urban planners to try to get these ideas adopted by the city government. The project has also developed a pilot app that will, in the future, enable people to map abandoned urban spaces via smartphone and messaging service.

Finally, Safecity India is a nonprofit organization providing a platform for anyone to share, anonymously or not, personal stories of sexual harassment and abuse in public spaces. Men and women can report different types of abuses — from ogling, whistles and comments, to stalking, groping and sexual assault. The aggregated data is then mapped, allowing citizens and governments to better understand crime trends at hyper-local levels.

Since its launch in 2012, SafeCity has received more than 4,000 reports of sexual crime and harassment in over 50 cities across India and Nepal. SafeCity helps generate greater awareness, breaks the cultural stigma associated with reporting sexual abuse and gives voice to grassroots movements and campaigns such as Sayfty, Protsahan, or Stop Street Harassment, forcing authorities to take action….(More)

Cleaning Up Lead Poisoning One Tweet at a Time


World Policy Blog: “At first, no one knew why the children of Bagega in Zamfara state were dying. In the spring of 2010, hundreds of kids in and around the northern Nigerian village were falling ill, having seizures and going blind, many of them never to recover. A Médecins Sans Frontières team soon discovered the causes: gold and lead.

With the global recession causing the price of precious metals to soar, impoverished villagers had turned to mining the area’s gold deposits. But the gold veins were mingled with lead, and as a result the villagers’ low-tech mining methods were sending clouds of lead-laced dust into the air. The miners, unknowingly carrying the powerful toxin on their clothes and skin, brought it into their homes where their children breathed it in.

The result was perhaps the worst outbreak of lead poisoning in history, killing over 400 children in Bagega and neighboring villages. In response, the Nigerian government pledged to clean up the lead-contaminated topsoil and provide medical care to the stricken children. But by mid-2012, there was no sign of the promised funds. Digitally savvy activists with the organization Connected Development (CODE) stepped in to make sure that the money was disbursed.

A group of young Nigerians founded CODE in 2010 in the capital Abuja, with the mission of empowering local communities to hold the government to account by improving their access to information and helping their voices to be heard. “In 2010, we were working to connect communities with data for advocacy programs,” says CODE co-founder Oludotun Babayemi, a former country director of a World Wildlife Fund project in Nigeria. “When we heard about Bagega, we thought this was an opportunity for us.”

In 2012, CODE launched a campaign dubbed ‘Follow the Money Nigeria’ aimed at applying pressure on the government to release the promised funds. “Eighty percent of the less developed parts of Nigeria have zero access to Twitter, let alone Facebook, so it’s difficult for them to convey their stories,” says Babayemi. “We collect all the videos and testimonies and take it global.”

CODE members travelled to the lead-afflicted area to gather information. They then posted their findings online and publicized them with a #SaveBagega hashtag, which they tweeted to members of the government, local and international organizations and the general public. CODE hosted a 48-hour ‘tweet-a-thon’, joined by a senator, to support the campaign….

By July 2014, CODE reported that the clean-up was complete and that over 1,000 children had been screened and enrolled in lead treatment programs. Bagega’s health center has also been refurbished and the village’s roads improved. “There are thousands of communities like Bagega,” says Babayemi. “They just need someone to amplify their voice.”….

Key lessons

  • Revealing information is not enough; change requires a real-world campaign driven by that information and civil society champions who can leverage their status and networks to draw international attention to the issues and maintain pressure.
  • Building relationships with sympathetic members of government is key.
  • Targeted online campaigns can help amplify the message of marginalized communities offline to achieve impact (More)”

Git for Law Revisited


Ari Hershowitz at Linked Legislation: “Laws change. Each time a new U.S. law is enacted, it enters a backdrop of approximately 22 million words of existing law. The new law may strike some text, add some text, and make other adjustments that trickle through the legal corpus. Seeing these changes in context would help lawmakers and the public better understand their impact.

To software engineers, this problem sounds like a perfect application for automated change management. Input an amendment, output tracked changes (see sample below). In the ideal system such changes could be seen as soon as the law is enacted — or even while a bill is being debated. We are now much closer to this ideal.

Changes to 16 U.S.C. 3835 by law 113-79

On Quora, on this blog, and elsewhere, I’ve discussed some of the challenges of using git, an automated change management system, to track laws. The biggest technical challenge has been that most laws, and most amendments to those laws, have not been structured in a computer-friendly way. But that is changing.

The Law Revision Counsel (LRC) compiles the U.S. Code through careful analysis of new laws, identifying the parts of existing law that will be changed (in a process called Classification) and making those changes by hand. The drafting and revision process takes great skill and legal expertise.

So, for example, the LRC makes changes to the current U.S. Code, following the language of a law such as this one:

Sample provision, 113-79 section 2006(a)

LRC attorneys identify the affected provisions of the U.S. Code and then carry out each of these instructions (strike “The Secretary”, insert “During fiscal year”, …). Since 2011, the LRC has been using and publishing the final result of this analysis in XML format. One of the consequences of this format change is that it becomes feasible to automatically match the “before” to the “after” text, and produce a redlined version as seen above, showing the changes in context.
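In code terms, Classification executes each amendatory instruction against the current text. Here is a minimal sketch of that mechanical idea, assuming a far simpler instruction format than real bill language; the provision text and the apply_amendment helper are hypothetical:

```python
# Minimal sketch: carry out a "strike X, insert Y" instruction.
# Real amendatory language is far more varied; all text here is invented.
def apply_amendment(text: str, strike: str, insert: str) -> str:
    """Strike the first occurrence of `strike`, inserting `insert` in its place."""
    if strike not in text:
        raise ValueError(f"provision does not contain {strike!r}")
    return text.replace(strike, insert, 1)

provision = "The Secretary shall make annual payments to the owner."
print(apply_amendment(provision, "The Secretary",
                      "During fiscal year 2014, the Secretary"))
# -> During fiscal year 2014, the Secretary shall make annual payments to the owner.
```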

To produce this redlined version, I ran xml_diff, an open-source program written by Joshua Tauberer of govtrack.us, who also works with my company, Xcential, on modernization projects for the U.S. House. The results can be remarkably accurate. As a prerequisite, it is necessary to have “before” and “after” versions in XML format and a small enough stretch of text to make the comparison manageable….(More)”
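The underlying before/after comparison can be approximated with Python’s standard difflib. The sketch below is not the xml_diff implementation, which operates on the LRC’s XML and handles much larger documents; the provision text is invented:

```python
# Minimal sketch: word-level redlining with <del>/<ins> markup,
# approximating the tracked-changes output described above.
import difflib

def redline(old: str, new: str) -> str:
    a, b = old.split(), new.split()
    out = []
    for op, a1, a2, b1, b2 in difflib.SequenceMatcher(a=a, b=b).get_opcodes():
        if op == "equal":
            out.extend(a[a1:a2])
        if op in ("replace", "delete"):
            out.append("<del>" + " ".join(a[a1:a2]) + "</del>")
        if op in ("replace", "insert"):
            out.append("<ins>" + " ".join(b[b1:b2]) + "</ins>")
    return " ".join(out)

before = "The Secretary shall make annual payments to the owner."
after = "During fiscal year 2014, the Secretary shall make semiannual payments to the owner."
print(redline(before, after))
```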

Room for a View: Democracy as a Deliberative System


Involve: “Democratic reform comes in waves, propelled by technological, economic, political and social developments. There are periods of rapid change, followed by relative quiet.

We are currently in a period of significant political pressure for change to our institutions of democracy and government. With so many changes under discussion it is critically important that those proposing and carrying out reforms understand the impact that different reforms might have.

Most discussions of democratic reform focus on electoral democracy. However, for all their importance in the democratic system, elections rarely reveal what voters think clearly enough for elected representatives to act on them. Changing the electoral system will not alone significantly increase the level of democratic control held by citizens.

Room for a View, by Involve’s director Simon Burall, looks at democratic reform from a broader perspective than that of elections. Drawing on the work of democratic theorists, it uses a deliberative systems approach to examine the state of UK democracy. Rather than focusing exclusively on the extent to which individuals and communities are represented within institutions, it is equally concerned with the range of views present and how they interact.

Adapting the work of the democratic theorist John Dryzek, the report identifies seven components of the UK’s democratic system, describing and analysing the condition of each in turn. Assessing the UK’s democracy through this lens reveals it to be in fragile health. The representation of alternative views and narratives in all seven components of the UK system is poor; the components are weakly connected and, despite some positive signs, deliberative capacity is decreasing.

Room for a View suggests that a focus on the key institutions isn’t enough. If the health of UK democracy is to be improved, we need to move away from thinking about the representation of individual voters to thinking about the representation of views, perspectives and narratives. Doing this will fundamentally change the way we approach democratic reform.

Big data problems we face today can be traced to the social ordering practices of the 19th century.


Hamish Robertson and Joanne Travaglia in LSE’s The Impact Blog: “This is not the first ‘big data’ era but the second. The first was the explosion in data collection that occurred from the early 19th century – Hacking’s ‘avalanche of numbers’, precisely situated between 1820 and 1840. This was an analogue big data era, different to our current digital one but characterized by some very similar problems and concerns. Contemporary problems of data analysis and control include a variety of accepted factors that make them ‘big’, generally including issues of size, complexity and technology. We also suggest that digitisation is a central process in this second big data era, one that seems obvious but which also appears to have reached a new threshold. Until a decade or so ago, ‘big data’ looked just like a digital version of conventional analogue records and systems, ones whose management had become normalised through statistical and mathematical analysis. Now, however, we see a level of concern and anxiety similar to that faced in the first big data era.

This situation brings with it a socio-political dimension of interest to us, one in which our understanding of people and our actions on individuals, groups and populations are deeply implicated. The collection of social data had a purpose – understanding and controlling the population in a time of significant social change. To achieve this, new kinds of information and new methods for generating knowledge were required. Many ideas, concepts and categories developed during that first data revolution remain intact today, some uncritically accepted more now than when they were first developed. In this piece we draw out some connections between these two data ‘revolutions’ and the implications for the politics of information in contemporary society. It is clear that many of the problems in this first big data age and, more specifically, their solutions persist down to the present big data era….Our question then is how do we go about re-writing the ideological inheritance of that first data revolution? Can we or will we unpack the ideological sequelae of that past revolution during this present one? The initial indicators are not good in that there is a pervasive assumption in this broad interdisciplinary field that reductive categories are both necessary and natural. Our social ordering practices have influenced our social epistemology. We run the risk in the social sciences of perpetuating the ideological victories of the first data revolution as we progress through the second. The need for critical analysis grows apace not just with the production of each new technique or technology but with the uncritical acceptance of the concepts, categories and assumptions that emerged from that first data revolution. That first data revolution proved to be a successful anti-revolutionary response to the numerous threats to social order posed by the incredible changes of the nineteenth century, rather than the Enlightenment emancipation that was promised. (More)”

This is part of a wider series on the Politics of Data. For more on this topic, also see Mark Carrigan’s Philosophy of Data Science interview series and the Discover Society special issue on the Politics of Data (Science).

In post-earthquake Nepal, open data accountability


Deepa Rai at the World Bank blog: “….Following the earthquake, there was an overwhelming response from technocrats and data crunchers to use data visualizations for disaster risk assessment. The Government of Nepal made datasets available through its Disaster Data Portal, and many organizations and individuals also pitched in and produced visual data platforms.

However, the use of open data has not been limited to disaster response. It was, and still is, instrumental in tracking how much funding has been received and how it is being allocated. Through the use of open data, people can make their own analyses based on the information provided online.

Direct Relief, a not-for-profit company, has collected such information and helped gather data from the Prime Minister’s relief fund, then created infographics which have been useful for media and for immediate distribution on social platforms. MapJournal’s visual maps became vital during the Post Disaster Needs Assessment (PDNA) for assessing and mapping areas where relief and reconstruction efforts were urgently needed.

Direct Relief medical relief partner locations in context of population affected and injuries by district
Photo Credit: Data Relief Services

Open data and accountability

However, the work of open data doesn’t end with relief distribution and disaster risk assessment. It is also hugely impactful in keeping track of how relief money is pledged, allocated, and spent. One such web application, openenet.net, is making this possible by aggregating post-disaster funding data from international and national sources into infographics. “The objective of the system,” reads the website, “is to ensure transparency and accountability of relief funds and resources to ensure that it reaches to targeted beneficiaries. We believe that transparency of funds in an open and accessible manner within a central platform is perhaps the first step to ensure effective mobilization of available resources.”

Four months after the earthquake, Nepali media have already started to report on aid spending — or the lack of it. This has been made possible by the use of open data from the Ministry of Home Affairs (MoHA) and illustrates how critical data is for the effective use of aid money.

Open data platforms emerging after the quakes have been crucial in questioning the accountability of aid provisions, ultimately resulting in more successful development outcomes….(More)”

How the USGS uses Twitter data to track earthquakes


Twitter Blog: “After the disastrous Sichuan earthquake in 2008, people turned to Twitter to share firsthand information about the earthquake. What amazed many was the impression that Twitter was faster at reporting the earthquake than the U.S. Geological Survey (USGS), the official government organization in charge of tracking such events.

This Twitter activity wasn’t a big surprise to the USGS. The USGS National Earthquake Information Center (NEIC) processes data from about 2,000 real-time earthquake sensors, with the majority based in the United States. That leaves a lot of empty space in the world with no sensors. On the other hand, there are hundreds of millions of people using Twitter who can report earthquakes. At first, the USGS staff was a bit skeptical that Twitter could be used as a detection system for earthquakes – but when they looked into it, they were surprised at the effectiveness of Twitter data for detection.

USGS staffers Paul Earle, a seismologist, and Michelle Guy, a software developer, teamed up to look at how Twitter data could be used for earthquake detection and verification. Using Twitter’s Public API, they applied the same time-series event detection method they use when detecting earthquakes. This gave them a baseline for earthquake-related chatter, but they decided to dig in even further. They found that people Tweeting about actual earthquakes kept their Tweets really short, even just to ask, “earthquake?” Concluding that people who are experiencing earthquakes aren’t very chatty, they started filtering out Tweets with more than seven words. They also recognized that people sharing links or the size of the earthquake were significantly less likely to be offering firsthand reports, so they filtered out any Tweets sharing a link or a number. Ultimately, this filtered stream proved highly effective at determining when earthquakes occurred globally.
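Expressed as code, those filtering heuristics come to only a few lines. The sketch below is an illustration, not the USGS system: the word-count, link, and number rules come from the post, while the function name and sample Tweets are invented.

```python
# Minimal sketch of the described Tweet filters: keep short Tweets
# (seven words or fewer) with no links and no numbers.
import re

def is_likely_firsthand(tweet: str) -> bool:
    if len(tweet.split()) > 7:     # chatty Tweets are rarely firsthand
        return False
    if "http" in tweet.lower():    # links suggest secondhand reports
        return False
    if re.search(r"\d", tweet):    # magnitudes suggest news reposts
        return False
    return True

stream = ["earthquake?", "M6.2 quake hits Chile http://example.com", "temblor!!"]
print([t for t in stream if is_likely_firsthand(t)])
# -> ['earthquake?', 'temblor!!']
```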

USGS Modeling Twitter Data to Detect Earthquakes

While I was at the USGS office in Golden, Colo., interviewing Michelle and Paul, three earthquakes happened in a relatively short time. Using Twitter data, their system was able to pick up on an aftershock in Chile within one minute and 20 seconds – and it only took 14 Tweets from the filtered stream to trigger an email alert. The other two earthquakes, off Easter Island and Indonesia, weren’t picked up because they were not widely felt…..
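A detection trigger along these lines can be modeled as a sliding time window over the filtered stream. The 14-Tweet count above is specific to that Chilean aftershock, so the threshold and window in this sketch are illustrative assumptions, not the USGS’s actual parameters:

```python
# Minimal sketch: alert when `threshold` filtered Tweets arrive within
# `window_s` seconds. Both parameters are illustrative assumptions.
from collections import deque

def make_trigger(threshold: int = 14, window_s: float = 80.0):
    times = deque()  # arrival times (seconds) of filtered Tweets
    def on_tweet(t: float) -> bool:
        times.append(t)
        while times and t - times[0] > window_s:
            times.popleft()          # drop Tweets outside the window
        return len(times) >= threshold  # True -> send the email alert
    return on_tweet

trigger = make_trigger(threshold=3, window_s=10.0)
print([trigger(t) for t in (0.0, 2.0, 4.0, 30.0)])
# -> [False, False, True, False]
```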

The USGS monitors for earthquakes in many languages, and the words used can be a clue as to the magnitude and location of the earthquake. Chile has two words for earthquakes: terremoto and temblor; terremoto indicates a bigger quake. This one in Chile started with people asking whether it was a terremoto, while others realized it was only a temblor.

As the USGS team notes, Twitter data augments their own detection work on felt earthquakes. If they’re getting reports of an earthquake in a populated area but no Tweets from there, that’s a good indicator to them that it’s a false alarm. It’s also very cost effective for the USGS, because they use Twitter’s Public API and open-source software such as Kibana and ElasticSearch to help determine when earthquakes occur….(More)”

Accelerating Citizen Science and Crowdsourcing to Address Societal and Scientific Challenges


Tom Kalil et al. at the White House Blog: “Citizen science encourages members of the public to voluntarily participate in the scientific process. Whether by asking questions, making observations, conducting experiments, collecting data, or developing low-cost technologies and open-source code, members of the public can help advance scientific knowledge and benefit society.

Through crowdsourcing – an open call for voluntary assistance from a large group of individuals – Americans can study and tackle complex challenges by conducting research at large geographic scales and over long periods of time in ways that professional scientists working alone cannot easily duplicate. These challenges include understanding the structure of proteins related to viruses in order to support the development of new medications, or preparing for, responding to, and recovering from disasters.

…OSTP is today announcing two new actions that the Administration is taking to encourage and support the appropriate use of citizen science and crowdsourcing at Federal agencies:

  1. OSTP Director John Holdren is issuing a memorandum entitled Addressing Societal and Scientific Challenges through Citizen Science and Crowdsourcing. This memo articulates principles that Federal agencies should embrace to derive the greatest value and impact from citizen science and crowdsourcing projects. The memo also directs agencies to take specific actions to advance citizen science and crowdsourcing, including designating an agency-specific coordinator for citizen science and crowdsourcing projects and cataloguing citizen science and crowdsourcing projects that are open for public participation on a new, centralized website to be created by the General Services Administration, making it easy for people to find out about and join in these projects.
  2. Fulfilling a commitment made in the 2013 Open Government National Action Plan, the U.S. government is releasing the first-ever Federal Crowdsourcing and Citizen Science Toolkit to help Federal agencies design, carry out, and manage citizen science and crowdsourcing projects. The toolkit, which was developed by OSTP in partnership with the Federal Community of Practice for Crowdsourcing and Citizen Science and GSA’s Open Opportunities Program, reflects the input of more than 125 Federal employees from over 25 agencies on ideas, case studies, best management practices, and other lessons to facilitate the successful use of citizen science and crowdsourcing in a Federal context….(More)”