Another Tale of Two Cities: Understanding Human Activity Space Using Actively Tracked Cellphone Location Data


Paper by Yang Xu et al: “Activity space is an important concept in geography. Recent advancements of location-aware technologies have generated many useful spatiotemporal data sets for studying human activity space for large populations. In this article, we use two actively tracked cellphone location data sets that cover a weekday to characterize people’s use of space in Shanghai and Shenzhen, China. We introduce three mobility indicators (daily activity range, number of activity anchor points, and frequency of movements) to represent the major determinants of individual activity space. By applying association rules in data mining, we analyze how these indicators of an individual’s activity space can be combined with each other to gain insights into mobility patterns in these two cities. We further examine spatiotemporal variations of aggregate mobility patterns in these two cities. Our results reveal some distinctive characteristics of human activity space in these two cities: (1) A high percentage of people in Shenzhen have a relatively short daily activity range, whereas people in Shanghai exhibit a variety of daily activity ranges; (2) people with more than one activity anchor point tend to travel further but less frequently in Shanghai than in Shenzhen; (3) Shenzhen shows a significant north–south contrast of activity space that reflects its urban structure; and (4) travel distance in both cities is shorter around noon than in regular work hours, and a large percentage of movements around noon are associated with individual home locations. This study indicates the benefits of analyzing actively tracked cellphone location data for gaining insights into human activity space in different cities….(More)”
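For readers unfamiliar with the technique the abstract mentions, association-rule mining scores co-occurrences of attributes by support (how often a combination appears) and confidence (how often the consequent appears given the antecedent). A minimal sketch in Python — the indicator labels, thresholds, and toy records below are illustrative, not the paper's actual coding scheme:

```python
from itertools import combinations

# Toy records: one person-day each, described by discretized mobility
# indicators (labels are invented for illustration).
records = [
    {"range=short", "anchors=1", "moves=few"},
    {"range=short", "anchors=1", "moves=few"},
    {"range=long", "anchors=2+", "moves=few"},
    {"range=long", "anchors=2+", "moves=many"},
    {"range=short", "anchors=1", "moves=many"},
]

def support(itemset, records):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

def rules(records, min_support=0.4, min_confidence=0.8):
    """Enumerate two-item rules {a} -> {b} that meet both thresholds."""
    items = set().union(*records)
    out = []
    for a, b in combinations(sorted(items), 2):
        for lhs, rhs in ((a, b), (b, a)):
            s = support({lhs, rhs}, records)
            if s >= min_support:
                conf = s / support({lhs}, records)
                if conf >= min_confidence:
                    out.append((lhs, rhs, round(s, 2), round(conf, 2)))
    return out

for lhs, rhs, s, c in rules(records):
    print(f"{{{lhs}}} -> {{{rhs}}}  support={s} confidence={c}")
```

A high-confidence rule such as {anchors=1} → {range=short} is the kind of indicator combination the authors analyze to compare mobility patterns across the two cities.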

Donating Your Selfies to Science


Linda Poon at CityLab: “It’s not only your friends and family who follow your online selfies and group photos. Scientists are starting to look at them, too, though they’re more interested in what’s around you. In bulk, photos can reveal weather patterns across multiple locations, air quality of a place over time, the dynamics of a neighborhood—all sorts of information that helps researchers study cities.

At the Nanyang Technological University in Singapore, a research group is using crowdsourced photos to create a low-cost alternative to air-pollution sensors. Called AirTick, the smartphone app they’ve designed will collect photos from users and analyze how hazy the environment looks. It’ll then check each image against official air quality data, and through machine-learning the app will eventually be able to predict pollution levels based on an image alone.
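AirTick's actual model is not described in the article, so the following is only a rough sketch of the underlying idea, under stated assumptions: treat low brightness contrast as a crude haze proxy, then fit a least-squares line against official air-quality readings. The pixel data and index values below are made up.

```python
import statistics

def haze_score(pixels):
    """Crude haziness proxy: haze washes out contrast, so lower brightness
    variance across pixels suggests a hazier scene.  pixels: [(r, g, b), ...]"""
    brightness = [(r + g + b) / (3 * 255) for r, g, b in pixels]
    return 1.0 - statistics.pstdev(brightness)

def fit_line(scores, readings):
    """Least-squares fit: official_index ~= a * haze_score + b."""
    n = len(scores)
    mx, my = sum(scores) / n, sum(readings) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(scores, readings))
         / sum((x - mx) ** 2 for x in scores))
    return a, my - a * mx

# Made-up "photos" (flattened pixel lists) paired with the official index
# at the time each was taken; clear scenes have high contrast, hazy ones little.
photo_clear = [(10, 10, 10), (240, 240, 240)] * 2
photo_mid = [(80, 80, 80), (170, 170, 170)] * 2
photo_hazy = [(120, 120, 120), (140, 140, 140)] * 2
photos, official = [photo_clear, photo_mid, photo_hazy], [30, 90, 150]

a, b = fit_line([haze_score(p) for p in photos], official)
estimate = a * haze_score(photo_hazy) + b  # predicted index for a new photo
print(round(estimate, 1))
```

A production system would use a learned image model rather than a single contrast statistic, but the calibration loop — crowdsourced photo in, official reading as ground truth, predictor out — is the same shape.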

AirTick creator Pan Zhengziang said in a promotional video last month that the growing concern among the public over air quality can make programs like this a success—especially in Southeast Asia, where smog has gotten so bad that governments have had to shut down schools and suspend outdoor activities.  “In Singapore’s recent haze episode, around 250,000 people [have] shared their concerns via Twitter,” he said. “This has made crowdsourcing-based air quality monitoring a possibility.”…(More)”

How Citizen Science Changed the Way Fukushima Radiation is Reported


Ari Beser at National Geographic: “It appears the world-changing event didn’t change anything, and it’s disappointing,” said Pieter Franken, a researcher at Keio University in Japan (Wide Project), the MIT Media Lab (Civic Media Centre), and co-founder of Safecast, a citizen-science network dedicated to the measurement and distribution of accurate levels of radiation around the world, especially in Fukushima. “There was a chance after the disaster for humanity to innovate our thinking about energy, and that doesn’t seem like it’s happened. But what we can change is the way we measure the environment around us.”

Franken and his founding partners found a way to turn their email chain, spurred by the tsunami, into Safecast, an open-source network that allows everyday people to contribute to radiation monitoring.

“We literally started the day after the earthquake happened,” revealed Pieter. “A friend of mine, Joi Ito, the director of MIT Media Lab, and I were basically talking about what Geiger counter to get. He was in Boston at the time and I was here in Tokyo, and like the rest of the world, we were worried, but we couldn’t get our hands on anything. There’s something happening here, we thought. Very quickly as the disaster developed, we wondered how to get the information out. People were looking for information, so we saw that there was a need. Our plan became: get information, put it together and disseminate it.”

An e-mail thread between Franken, Ito, and Sean Bonner, (co-founder of CRASH Space, a group that bills itself as Los Angeles’ first hackerspace), evolved into a network of minds, including members of Tokyo Hackerspace, Dan Sythe, who produced high-quality Geiger counters, and Ray Ozzie, Microsoft’s former Chief Technical Officer. On April 15, the group that was to become Safecast sat down together for the first time. Ozzie conceived the plan to strap a Geiger counter to a car and somehow log measurements in motion. This would become the bGeigie, Safecast’s future model of the do-it-yourself Geiger counter kit.
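The article doesn't give bGeigie internals, but Ozzie's "log measurements in motion" idea reduces to pairing each Geiger-tube count rate with a timestamp and GPS fix. A minimal sketch of such a log record follows; the CPM-to-µSv/h factor of 334 is a commonly cited value for one pancake-tube model and is device-specific — an assumption here, not a published Safecast specification.

```python
import csv
import io
from dataclasses import dataclass

CPM_PER_USV_H = 334  # tube-specific conversion factor -- an assumed value

@dataclass
class Reading:
    timestamp: str  # ISO 8601, UTC
    lat: float
    lon: float
    cpm: int        # raw counts per minute from the Geiger tube

    @property
    def usv_h(self) -> float:
        """Approximate dose rate implied by the count rate."""
        return self.cpm / CPM_PER_USV_H

def to_log(readings):
    """Serialize a drive into CSV, one GPS-tagged measurement per row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["timestamp", "lat", "lon", "cpm", "usv_h"])
    for r in readings:
        writer.writerow([r.timestamp, r.lat, r.lon, r.cpm, round(r.usv_h, 3)])
    return buf.getvalue()

# Two invented readings from a hypothetical drive near Koriyama.
drive = [
    Reading("2011-04-24T09:00:00Z", 37.400, 140.380, 52),
    Reading("2011-04-24T09:00:05Z", 37.401, 140.381, 310),
]
print(to_log(drive))
```

Logging raw CPM alongside the derived dose rate keeps the data reusable even if the conversion factor for a given tube is later revised.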

Armed with a few Geiger counters donated by Sythe, the newly formed team retrofitted their radiation-measuring devices to the outside of a car.  Safecast’s first volunteers drove up to the city of Koriyama in Fukushima Prefecture, and took their own readings around all of the schools. Franken explained, “If we measured all of the schools, we covered all the communities; because communities surround schools. It was very granular, the readings changed a lot, and the levels were far from academic, but it was our start. This was April 24, 6 weeks after the disaster. Our thinking changed quite a bit through this process.”

With the DIY kit available online, all anyone needs to make their own Geiger counter is a soldering iron and the suggested directions.

Since their first tour of Koriyama, with the help of a successful Kickstarter campaign, Safecast’s team of volunteers has developed the bGeigie handheld radiation monitor, which anyone can buy on Amazon.com and construct with suggested instructions available online. So far over 350 users have contributed 41 million readings, using around a thousand fixed, mobile, and crowd-sourced devices….(More)

Open data and (15 million!) new measures of democracy


Joshua Tucker in the Washington Post: “Last month the University of Gothenburg’s V-Dem Institute released a new “Varieties of Democracy” dataset. It provides about 15 million data points on democracy, including 39 democracy-related indices. It can be accessed at v-dem.net along with supporting documentation. I asked Staffan I. Lindberg, Director of the V-Dem Institute and one of the directors of the project, a few questions about the new data. What follows is a lightly edited version of his answers.


Women’s Political Empowerment Index for Southeast Asia (Data: V-Dem data version 5; Figure V-Dem Institute, University of Gothenburg, Sweden)

Joshua Tucker: What is democracy, and is it even really possible to have quantitative measures of democracy?

Staffan Lindberg: There is no consensus on the definition of democracy and how to measure it. The understanding of what a democracy really is varies across countries and regions. This motivates the V-Dem approach not to offer one standard definition of the concept but instead to distinguish among five different principles of democracy: Electoral, Liberal, Participatory, Deliberative, and Egalitarian democracy. All of these principles have played prominent roles in current and historical discussions about democracy. Our measurement of these principles is based on two types of data, factual data collected by assisting researchers and survey responses by country experts, which are combined using a rather complex measurement model (a “custom-designed Bayesian ordinal item response theory model”; for details see the V-Dem Methodology document)….(More)
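For readers curious what a "Bayesian ordinal item response theory model" looks like in outline, here is a generic sketch — the notation is mine, not V-Dem's exact specification, which lives in their methodology document. Expert $r$'s ordinal rating $y_{ctr}$ of country-year $(c,t)$ is linked to a latent democracy component $z_{ct}$ through expert-specific thresholds:

```latex
% Generic ordinal IRT sketch (notation illustrative, not V-Dem's own):
% y_{ctr} = expert r's ordinal rating of country-year (c,t)
% z_{ct}  = latent level of the democracy component
\begin{aligned}
  \Pr(y_{ctr} = k) &= \Phi\!\left(\gamma_{r,k} - \beta_r\, z_{ct}\right)
                    - \Phi\!\left(\gamma_{r,k-1} - \beta_r\, z_{ct}\right), \\
  z_{ct} &\sim \mathcal{N}(0, 1).
\end{aligned}
```

Here the $\gamma_{r,k}$ are expert-specific cut points (different experts may apply the ordinal scale differently) and $\beta_r$ is a reliability parameter; estimating both alongside $z_{ct}$ is what lets the model combine disagreeing expert ratings into a single index with uncertainty bounds.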

Women Also Know Stuff


WomenAlsoKnowStuff: “So often while planning a conference, brainstorming a list of speakers, or searching for experts to cite or interview, it can be difficult to think of any scholars who aren’t male. We’ve all been there… you just know that a woman has got to be studying that topic… but who?

This site is intended to provide an easily accessible database of female experts in a variety of areas.

This site was created and is maintained by political scientists and, as such, focuses on politics, policy, and government, but also on methods in the social sciences. (We’re certain that women know stuff in other fields too, though!)

Please submit your information to WomenAlsoKnowStuff using the Google Form linked below: http://bit.do/womenalsoknow…”

Research Consortium on the Impact of Open Government Processes



“Mounting anecdotal evidence supports the case for open government. Sixty-nine national governments and counting have signed on as participants in the Open Government Partnership, committing to rethinking the way they engage with citizens, while civil society organizations (CSOs) are increasingly demanding and building mechanisms for this shift. Yet even as the open government agenda gains steam, relatively little systematic research has been done to examine the ways different types and sequences of reforms have played out in various contexts, and with what impact. This is due in part to the newness of the field, but also to the challenges in attributing specific outcomes to any governance initiative. While acknowledging that the search for cookie-cutter “best practices” is of limited value, there is no doubt that reform-minded actors could benefit from a robust analytical framework and more thorough understanding of experiences in different contexts to date.

To address these knowledge gaps, and to sharpen our ways of thinking about the difference that open government processes can make, a range of public, academic, and advocacy organizations established a research consortium to convene actors, leverage support, and catalyze research. Its founding members are Global Integrity, The Governance Lab @ NYU (The GovLab), the World Bank’s Open Government Global Solutions Group, Open Government Partnership Support Unit, and Results for Development Institute. The Consortium aims to build on existing research – including but not limited to the work of existing research networks such as the MacArthur Foundation Research Network on Opening Governance – to improve our understanding of the effectiveness and impact of open government reforms. That is, to what extent and through which channels do such reforms actually improve transparency, accessibility, and accountability; how does this play out differently in different contexts; and can we trace tangible improvements in the lives of citizens to these measures…..

Countries participating in the Open Government Partnership have signed on to the view that open government is intrinsically good in terms of strengthening civic participation and democratic processes. Governments are also increasingly looking at such initiatives through a return-on-investment (ROI) lens: do such reforms lead to cost savings that allow them to allocate and spend resources more efficiently on public services? Does the availability and accessibility of open government data create economic opportunities, including jobs and new businesses? The Consortium is excited to support innovative research aimed at understanding the extent to which reforms deliver, not only in terms of open governance itself, but also in terms of improved public sector performance and service delivery gains. This focus will also help the Consortium identify research-driven stories of the impact that open governance reforms are having….(More)”

Big data’s big role in humanitarian aid


Mary K. Pratt at Computerworld: “Hundreds of thousands of refugees streamed into Europe in 2015 from Syria and other Middle Eastern countries. Some estimates put the number at nearly a million.

The sheer volume of people overwhelmed European officials, who not only had to handle the volatile politics stemming from the crisis, but also had to find food, shelter and other necessities for the migrants.

Sweden, like many of its European Union counterparts, was taking in refugees. The Swedish Migration Board, which usually sees 2,500 asylum seekers in an average month, was accepting 10,000 per week.

“As you can imagine, with that number, it requires a lot of buses, food, registration capabilities to start processing all the cases and to accommodate all of those people,” says Andres Delgado, head of operational control, coordination and analysis at the Swedish Migration Board.

Despite the dramatic spike in refugees coming into the country, the migration agency managed the intake — hiring extra staff, starting the process of procuring housing early, getting supplies ready. Delgado credits a good part of that success to his agency’s use of big data and analytics that let him predict, with a high degree of accuracy, what was heading his way.

“Without having that capability, or looking at the tool every day, to assess every need, this would have crushed us. We wouldn’t have survived this,” Delgado says. “It would have been chaos, actually — nothing short of that.”

The Swedish Migration Board has been using big data and analytics for several years, as it seeks to gain visibility into immigration trends and what those trends will mean for the country…./…

“Can big data give us peace? I think the short answer is we’re starting to explore that. We’re at the very early stages, where there are shining examples of little things here and there. But we’re on that road,” says Kalev H. Leetaru, creator of the GDELT Project, or the Global Database of Events, Language and Tone, which describes itself as a comprehensive “database of human society.”

The topic is gaining traction. A 2013 report, “New Technology and the Prevention of Violence and Conflict,” from the International Peace Institute, highlights uses of telecommunications technology, including data, in several crisis situations around the world. The report emphasizes the potential these technologies hold in helping to ease tensions and address problems.

The report’s conclusion offers this idea: “Big data can be used to identify patterns and signatures associated with conflict — and those associated with peace — presenting huge opportunities for better-informed efforts to prevent violence and conflict.”

That’s welcome news to Noel Dickover. He’s the director of PeaceTech Data Networks at the PeaceTech Lab, which was created by the U.S. Institute of Peace (USIP) to advance USIP’s work on how technology, media and data help reduce violent conflict around the world.

Such work is still in the nascent stages, Dickover says, but people are excited about its potential. “We have unprecedented amounts of data on human sentiment, and we know there’s value there,” he says. “The question is how to connect it.”

Dickover is working on ways to do just that. One example is the Open Situation Room Exchange (OSRx) project, which aims to “empower greater collective impact in preventing or mitigating serious violent conflicts in particular arenas through collaboration and data-sharing.”…(More)

Smarter State Case Studies


“Just as individuals use only part of their brainpower to solve most problems, governing institutions make far too little use of the skills and experience of those inside and outside of government with scientific credentials, practical skills, and ground-level street smarts. New data-rich tools—what The GovLab calls technologies of expertise—are making it possible to match the supply of citizen and civil servant talent to the demand for it in government to solve problems.

The Smarter State Case Studies examine how public institutions are using technologies of expertise, including:

Talent Bank – Professional, social and knowledge networks
Collaboration – Platforms for group work across silos
Project Platforms – Places for inviting new participants to work on projects
Toolkits – Repositories for shared content

Explore the design and key features of these novel platforms; how they are being implemented; the challenges encountered by both creators and users; and the anticipated impact of these new ways of working.
The case studies can be found at http://www.thegovlab.org/smarterstate.html
To share a case study, please contact: maria@thegovlab.org

Improving government effectiveness: lessons from Germany


Tom Gash at Global Government Forum: “All countries face their own unique challenges but advanced democracies also have much in common: the global economic downturn, aging populations, increasingly expensive health and pension spending, and citizens who remain as hard to please as ever.

At an event last week in Bavaria, attended by representatives of Bavaria’s governing party, the Christian Social Union (CSU) and their guests, it also became clear that there is a growing consensus that governments face another common problem. They have relied for too long on traditional legislation and regulation to drive change. The consensus was that simply prescribing in law what citizens and companies can and can’t do will not solve the complex problems governments are facing, that governments cannot legislate their way to improved citizen health, wealth and wellbeing….

…a number of developments …from which both UK and international policymakers and practitioners can learn to improve government effectiveness.

  1. Behavioural economics: The Behavioural Insights Team (BIT), which spun out of government in 2013 and is the subject of a new book by one of its founders and former IfG Director of Research, David Halpern, is being watched carefully by many countries abroad. Some are using its services, while others – including the New South Wales Government in Australia – are building their own skills in this area. BIT and others using similar principles have shown that using insights from social psychology – alongside an experimental approach – can help save money and improve outcomes. Well-known successes include increasing the tax take through changing the wording of reminder letters (work led by another IfG alumnus, Mike Hallsworth) and increasing pension take-up through auto-enrolment.
  2. Market design: There is an emerging field of study which is examining how algorithms can be used to match people better with services they need – particularly in cases where it is unfair or morally repugnant to allow a free market to operate. Alvin Roth, the Harvard professor and Nobel Prize winner, writes about these ‘matching markets’ in his book Who Gets What and Why – in which he also explains how the approach can ensure that more kidneys reach compatible recipients, and children find the right education.
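Roth's kidney-exchange algorithms are considerably more elaborate, but the classic deferred-acceptance algorithm of Gale and Shapley captures the core matching-market idea: allocate without prices by letting one side propose down its preference list while the other side tentatively holds its best offer so far. A toy sketch with invented student and school names:

```python
def deferred_acceptance(proposer_prefs, reviewer_prefs):
    """Gale-Shapley: proposers apply down their preference lists; each
    reviewer tentatively holds the best proposer seen so far.  Assumes a
    one-to-one market where every list ranks every counterparty.
    Returns {proposer: reviewer}."""
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}  # next list index to try
    held = {}                                     # reviewer -> held proposer
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in held:
            held[r] = p                 # first offer: hold it
        elif rank[r][p] < rank[r][held[r]]:
            free.append(held[r])        # upgrade: bump the held proposer
            held[r] = p
        else:
            free.append(p)              # rejected: p tries the next choice
    return {p: r for r, p in held.items()}

students = {"ana": ["north", "south"], "bo": ["north", "south"]}
schools = {"north": ["bo", "ana"], "south": ["ana", "bo"]}
print(deferred_acceptance(students, schools))
```

The resulting match is stable: no student and school both prefer each other to their assigned partners, which is the property that makes the mechanism attractive for school choice and medical residency matching.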
  3. Big data: Large datasets can now be mined far more effectively, whether it is to analyse crime patterns to spot where police patrols might be useful or to understand crowd flows on public transport. The use of real-time information allows far more sophisticated deployment of public sector resources, better targeted at demand and need, and better tailored to individual preferences.
  4. Transparency: Transparency has the potential to enhance both the accountability and effectiveness of governments across the world – as shown in our latest Whitehall Monitor Annual Report. The UK government is considered a world-leader for its transparency – but there are still areas where progress has stalled, including in transparency over the costs and performance of privately provided public services.
  5. New management models: There is a growing realisation that new methods are best harnessed when supported by effective management. The Institute’s work on civil service reform highlights a range of success factors from past reforms in the UK – and the benefits of clear mechanisms for setting priorities and sticking to them, as is being attempted by the government’s new(ish) Implementation Taskforces and the Departmental Implementation Units currently cropping up across Whitehall. I looked overseas for a different model that clearly aligns government activities behind citizens’ concerns – in this case the example of the single non-emergency number system operating in New York City and elsewhere. This system supports a powerful, highly responsive, data-driven performance management regime. But like many performance management regimes it can risk a narrow and excessively short-term focus – so such tools must be combined with the mind-set of system stewardship that the Institute has long championed in its policymaking work.
  6. Investment in new capability: It is striking that all of these developments are supported by technological change and research insights developed outside government. But to embed new approaches in government, there appear to be benefits to incubating new capacity, either in specialist departmental teams or at the centre of government….(More)”

The Point of Collection


Essay by Mimi Onuoha: “The conceptual, practical, and ethical issues surrounding “big data” and data in general begin at the very moment of data collection. Particularly when the data concern people, not enough attention is paid to the realities entangled within that significant moment and spreading out from it.

I try to do some disentangling here, through five theses around data collection — points that are worth remembering, communicating, thinking about, dwelling on, and keeping in mind, if you have anything to do with data on a daily basis (read: all of us) and want to do data responsibly.

1. Data sets are the results of their means of collection.

It’s easy to forget that the people collecting a data set, and how they choose to do it, directly determines the data set….

2. As we collect more data, we prioritize things that fit patterns of collection.

Or as Rob Kitchin and Martin Dodge say in Code/Space,“The effect of abstracting the world is that the world starts to structure itself in the image of the capta and the code.” Data emerges from a world that is increasingly software-mediated, and software thrives on abstraction. It flattens out individual variations in favor of types and models….

3. Data sets outlive the rationale for their collection.

Spotify can come up with a list of reasons why having access to users’ photos, locations, microphones, and contact lists can improve the music streaming experience. But the reasons they decide these forms of data might be useful can be less important than the fact that they have the data at all. This is because the needs or desires influencing the decisions to collect some type of data often eventually disappear, while the data produced as a result of those decisions have the potential to live for much longer. The data are capable of shifting and changing according to specific cultural contexts and of playing different roles than those they might initially have been intended for….

4. Corollary: Especially combined, data sets reveal far more than intended.

We sometimes fail to realize that data sets, both on their own and combined with others, can be used to do far more than what they were originally intended for. You can make inferences from one data set that result in conclusions in completely different realms. Facebook, by having huge amounts of data on people and their networks, could make reasonable hypotheses regarding people’s sexual orientations….

5. Data collection is a transaction that is the result of an invisible relationship.

This is a frame — connected to my first point — useful for understanding how to think about data collection on the whole:

Every data set involving people implies subjects and objects, those who collect and those who make up the collected. It is imperative to remember that on both sides we have human beings….(More)”