Syrians discover new use for mobile phones – finding water


Magdalena Mis at Reuters: “Struggling with frequent water cuts, residents of Syria’s battered city of Aleppo have a new way to find the water needed for their daily lives – an interactive map on mobile phones.

The online map, created by the Red Cross and accessible on mobile phones with 3G connectivity, helps residents locate the closest of more than 80 water points across the divided city of 2 million and guides them to it using the Global Positioning System (GPS).
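The article does not describe how the map is implemented, but its core operation – picking the nearest of roughly 80 known water points from a phone’s GPS fix – can be sketched with a standard haversine great-circle distance. All coordinates, names and function signatures below are illustrative assumptions, not details from the ICRC’s system:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearest_water_point(user, water_points):
    """Return the water point closest to the user's GPS fix (lat, lon)."""
    return min(
        water_points,
        key=lambda p: haversine_km(user[0], user[1], p["lat"], p["lon"]),
    )

# Hypothetical coordinates, not real Aleppo water-point locations
points = [
    {"name": "A", "lat": 36.20, "lon": 37.13},
    {"name": "B", "lat": 36.22, "lon": 37.16},
]
print(nearest_water_point((36.21, 37.14), points)["name"])
```

A real deployment would then hand the chosen point’s coordinates to the phone’s navigation app for turn-by-turn guidance.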

“The map is very simple and works on every phone, and everybody now has access to a mobile phone with 3G,” International Committee of the Red Cross (ICRC) spokesman Pawel Krzysiek told the Thomson Reuters Foundation in a phone interview from Damascus on Wednesday.

“The important thing is that it’s not just a map – which many people may not know how to read – it’s the GPS that’s making a difference because people can actually be guided to the water point closest to them,” he said.

Aleppo was Syria’s most populated city and commercial hub before the civil war erupted in 2011, but many areas have been reduced to rubble and the city has been carved up between government forces and various insurgent groups.

Water cuts are a regular occurrence, amounting to about two weeks each month, and the infrastructure is on the brink of collapse, Krzysiek said.

The water supply was restored on Wednesday after a four-day cut caused by damage to the main power line providing electricity to some 80 percent of households, Krzysiek said.

More cuts are likely because fighting is preventing engineers from repairing the power line, and diesel, used for standby generators, may run out, he added….

Krzysiek said the ICRC started working on the map after a simple version created for engineers was posted on its Facebook page in the summer, sparking a wave of comments and requests.

“Suddenly people started to share this map and were sending comments on how to improve it and asking for a new, more detailed one.”

Krzysiek said that about 140,000 people were using the old version of the map and 20,000 had already used the new version, launched on Monday…(More)”

In post-earthquake Nepal, open data accountability


Deepa Rai at the World Bank blog: “….Following the earthquake, there was an overwhelming response from technocrats and data crunchers, who used data visualizations for disaster risk assessment. The Government of Nepal made datasets available through its Disaster Data Portal, and many organizations and individuals also pitched in and produced visual data platforms.
However, the use of open data has not been limited to disaster response. It was, and still is, instrumental in tracking how much funding has been received and how it’s being allocated. Through the use of open data, people can make their own analysis based on the information provided online.

Direct Relief, a not-for-profit organization, has collected such information, helped gather data from the Prime Minister’s relief fund, and created infographics that have been useful for the media and for immediate distribution on social platforms. MapJournal’s visual maps became vital during the Post Disaster Needs Assessment (PDNA) to assess and map areas where relief and reconstruction efforts were urgently needed.

Direct Relief medical relief partner locations in context of population affected and injuries by district
Photo Credit: Data Relief Services

Open data and accountability
However, the work of open data doesn’t end with relief distribution and disaster risk assessment. It is also hugely impactful in keeping track of how relief money is pledged, allocated, and spent. One such web application, openenet.net, is making this possible by aggregating post-disaster funding data from international and national sources into infographics. “The objective of the system,” reads the website, “is to ensure transparency and accountability of relief funds and resources to ensure that it reaches to targeted beneficiaries. We believe that transparency of funds in an open and accessible manner within a central platform is perhaps the first step to ensure effective mobilization of available resources.”
Four months after the earthquake, Nepali media have already started to report on aid spending — or the lack of it. This has been made possible by the use of open data from the Ministry of Home Affairs (MoHA) and illustrates how critical data is for the effective use of aid money.
Open data platforms emerging after the quakes have been crucial in holding aid provision to account, ultimately contributing to more successful development outcomes….(More)”

Data Science of the People, for the People, by the People: A Viewpoint on an Emerging Dichotomy


Paper by Kush R. Varshney: “This paper presents a viewpoint on an emerging dichotomy in data science: applications in which predictions of data-driven algorithms are used to support people in making consequential decisions that can have a profound effect on other people’s lives, and applications in which data-driven algorithms act autonomously in settings of low consequence and large scale. An example of the first type of application is prison sentencing; of the second, selecting news stories to appear on a person’s web portal home page. It is argued that the two types of applications require data, algorithms and models with vastly different properties along several dimensions, including privacy, equitability, robustness, interpretability, causality, and openness. Furthermore, it is argued that the second type of application cannot always be used as a surrogate to develop methods for the first type of application. To contribute to the development of methods for the first type of application, one must really be working on the first type of application….(More)”

Crowdsourced research: Many hands make tight work



Raphael Silberzahn & Eric L. Uhlmann in Nature: “…For many research problems, crowdsourcing analyses will not be the optimal solution. It demands a huge amount of resources for just one research question. Some questions will not benefit from a crowd of analysts: researchers’ approaches will be much more similar for simple data sets and research designs than for large and complex ones. Importantly, crowdsourcing does not eliminate all bias. Decisions must still be made about what hypotheses to test, from where to get suitable data, and importantly, which variables can or cannot be collected. (For instance, we did not consider whether a particular player’s skin tone was lighter or darker than that of most of the other players on his team.) Finally, researchers may continue to disagree about findings, which makes it challenging to present a manuscript with a clear conclusion. It can also be puzzling: the investment of more resources can lead to less-clear outcomes.

“Under the current system, strong storylines win out over messy results.”

Still, the effort can be well worth it. Crowdsourcing research can reveal how conclusions are contingent on analytical choices. Furthermore, the crowdsourcing framework also provides researchers with a safe space in which they can vet analytical approaches, explore doubts and get a second, third or fourth opinion. Discussions about analytical approaches happen before committing to a particular strategy. In our project, the teams were essentially peer reviewing each other’s work before even settling on their own analyses. And we found that researchers did change their minds through the course of analysis.

Crowdsourcing also reduces the incentive for flashy results. A single-team project may be published only if it finds significant effects; participants in crowdsourced projects can contribute even with null findings. A range of scientific possibilities are revealed, the results are more credible and analytical choices that seem to sway conclusions can point research in fruitful directions. What is more, analysts learn from each other, and the creativity required to construct analytical methodologies can be better appreciated by the research community and the public.

Of course, researchers who painstakingly collect a data set may not want to share it with others. But greater certainty comes from having an independent check. A coordinated effort boosts incentives for multiple analyses and perspectives in a way that simply making data available post-publication does not.

The transparency resulting from a crowdsourced approach should be particularly beneficial when important policy issues are at stake. The uncertainty of scientific conclusions about, for example, the effects of the minimum wage on unemployment, and the consequences of economic austerity policies should be investigated by crowds of researchers rather than left to single teams of analysts.

Under the current system, strong storylines win out over messy results. Worse, once a finding has been published in a journal, it becomes difficult to challenge. Ideas become entrenched too quickly, and uprooting them is more disruptive than it ought to be. The crowdsourcing approach gives space to dissenting opinions.

Scientists around the world are hungry for more-reliable ways to discover knowledge and eager to forge new kinds of collaborations to do so. Our first project had a budget of zero, and we attracted scores of fellow scientists with two tweets and a Facebook post.

Researchers who are interested in starting or participating in collaborative crowdsourcing projects can access resources available online. We have publicly shared all our materials and survey templates, and the Center for Open Science has just launched ManyLab, a web space where researchers can join crowdsourced projects….(More).

See also Nature special collection: Reproducibility


Big Data and Mass Shootings


Holman W. Jenkins in the Wall Street Journal: “As always, the dots are connected after the fact, when the connecting is easy. …The day may be coming, sooner than we think, when such incidents can be stopped before they get started. A software program alerts police to a social-media posting by an individual of interest in their jurisdiction. An algorithm reminds them why the individual had become a person of interest—a history of mental illness, an episode involving a neighbor. Months earlier, discreet inquiries by police had revealed an unhealthy obsession with weapons—key word, unhealthy. There’s no reason why gun owners, range operators and firearms dealers shouldn’t be a source of information for local police seeking information about who might merit special attention.

Sound scary? Big data exists to find the signal among the noise. Your data is the noise. It’s what computerized systems seek to disregard in their quest for information that actually would be useful to act on. Big data is interested in needles, not hay.

Still don’t trust the government? You’re barking up an outdated tree. Consider the absurdly ancillary debate last year on whether the government should be allowed to hold telephone “metadata” when the government already holds vastly more sensitive data on all of us in the form of tax, medical, legal and census records.

All this seems doubly silly given the copious information about each of us contained in private databases, freely bought and sold by marketers. Bizarre is the idea that Facebook should be able to use our voluntary Facebook postings to decide what we might like to buy, but police shouldn’t use the same information to prevent crime.

Hitachi, the big Japanese company, began testing its crime-prediction software in several unnamed American cities this month. The project, called Hitachi Visualization Predictive Crime Analytics, culls crime records, map and transit data, weather reports, social media and other sources for patterns that might otherwise go unnoticed by police.

Colorado-based Intrado, working with LexisNexis and Motorola Solutions, already sells police a service that instantly scans legal, business and social-media records for information about persons and circumstances that officers may encounter when responding to a 911 call at a specific address. Hundreds of public safety agencies find the system invaluable, though that didn’t stop the city of Bellingham, Wash., from rejecting it last year on the odd grounds that such software must be guilty of racial profiling.

Big data is changing how police allocate resources and go about fighting crime. …It once was freely asserted that police weren’t supposed to prevent crime, only solve it. But recent research shows investment in policing actually does reduce crime rates—and produces a large positive return measured in dollars and cents. A day will come when failing to connect the dots in advance of a mass shooting won’t be a matter for upturned hands. It will be a matter for serious recrimination…(More)

Viscous Open Data: The Roles of Intermediaries in an Open Data Ecosystem


François van Schalkwyk, Michelle Willmers & Maurice McNaughton in the journal Information Technology for Development: “Open data have the potential to improve the governance of universities as public institutions. In addition, open data are likely to increase the quality, efficacy and efficiency of the research and analysis of higher education systems by providing a shared empirical base for critical interrogation and reinterpretation. Drawing on research conducted by the Emerging Impacts of Open Data in Developing Countries project, and using an ecosystems approach, this research paper considers the supply, demand and use of open data as well as the roles of intermediaries in the governance of South African public higher education. It shows that government’s higher education database is a closed and isolated data source in the data ecosystem; and that the open data that are made available by government are inaccessible and rarely used. In contrast, government data made available by data intermediaries in the ecosystem are being used by key stakeholders. Intermediaries are found to play several important roles in the ecosystem: (i) they increase the accessibility and utility of data; (ii) they may assume the role of a “keystone species” in a data ecosystem; and (iii) they have the potential to democratize the impacts and use of open data. The article concludes that despite poor data provision by government, the public university governance open data ecosystem has evolved because intermediaries in the ecosystem have reduced the viscosity of government data. Further increasing the fluidity of government open data will improve access and ensure the sustainability of open data supply in the ecosystem….(More)”

We Need Both Networks and Communities


Henry Mintzberg at HBR: “If you want to understand the difference between a network and a community, ask your Facebook friends to help paint your house.

Social media certainly connects us to whoever is on the other end of the line, and so extends our social networks in amazing ways. But this can come at the expense of deeper personal relationships. When it feels like we’re up-to-date on our friends’ lives through Facebook or Instagram, we may become less likely to call them, much less meet up. Networks connect; communities care.

….A century or two ago, the word community “seemed to connote a specific group of people, from a particular patch of earth, who knew and judged and kept an eye on one another, who shared habits and history and memories, and could at times be persuaded to act as a whole on behalf of a part.” In contrast, the word has now become fashionable to describe what are really networks, as in the “business community”—“people with common interests [but] not common values, history, or memory.”

Does this matter for managing in the digital age, even for dealing with our global problems? It sure does. In a 2012 New York Times column, Thomas Friedman reported asking an Egyptian friend about the protest movements in that country: “Facebook really helped people to communicate, but not to collaborate,” he replied. Friedman added that “at their worst, [social media sites] can become addictive substitutes for real action.” That is why, while the larger social movements, as in Cairo’s Tahrir Square or on Wall Street, may raise consciousness about the need for renewal in society, it is the smaller social initiatives, usually developed by small groups in communities, that do much of the renewing….

We tend to make a great fuss about leadership these days, but communityship is more important. The great leaders create, enhance, and support a sense of community in their organizations, and that requires hands-on management. Hence managers have to get beyond their individual leadership to recognize the collective nature of effective enterprise.

Especially for operating around the globe, electronic communication has become essential. But the heart of enterprise remains rooted in personal collaborative relationships, albeit networked by the new information technologies. Thus, in localities and organizations, across societies and around the globe, beware of “networked individualism,” where people communicate readily while struggling to collaborate.

The new digital technologies, wonderful as they are in enhancing communication, can have a negative effect on collaboration unless they are carefully managed. An electronic device puts us in touch with a keyboard, that’s all….(More)”

A new model to explore non-profit social media use for advocacy and civic engagement


David Chapman, Katrina Miller-Stevens, John C Morris, and Brendan O’Hallarn in First Monday: “In an age when electronic communication is ubiquitous, non-profit organizations are actively using social media platforms as a way to deliver information to end users. In spite of the broad use of these platforms, little scholarship has focused on the internal processes these organizations employ to implement these tools. A limited number of studies offer models to help explain an organization’s use of social media from initiation to outcomes, yet few studies address a non-profit organization’s mission as the driver to employ social media strategies and tactics. Furthermore, the effectiveness of social media use is difficult for non-profit organizations to measure. Studies that attempt to address this question have done so by viewing social media platform analytics (e.g., Facebook analytics) or analyzing written content by users of social media (Nah and Saxton, 2013; Auger, 2013; Uzunoğlu and Misci Kip, 2014; Guo and Saxton, 2014). The added value of this study is to present a model for practice (Weil, 1997) that explores social media use and its challenges from a non-profit organization’s mission through to its desired outcome, in this case advocacy and civic engagement.

We focus on one non-profit organization, Blue Star Families, that actively engages in advocacy and civic engagement. Blue Star Families was formed in 2009 to “raise the awareness of the challenges of military family life with our civilian communities and leaders” (Blue Star Families, 2010). Blue Star Families is a virtual organization with no physical office location. Thus, the organization relies on its Web presence and social media tools to advocate for military families and engage service members and their families, communities, and citizens in civic engagement activities (Blue Star Families, 2010).

The study aims to provide organizational-level insights of the successes and challenges of working in the social media environment. Specifically, the study asks: What are the processes non-profit organizations follow to link organizational mission to outcomes when using social media platforms? What are the successes and challenges of using social media platforms for advocacy and civic engagement purposes? In our effort to answer these questions, we present a new model to explore non-profit organizations’ use of social media platforms by building on previous models and frameworks developed to explore the use of social media in the public, private, and non-profit sectors.

This research is important for three reasons. First, most previous studies of social media tend to employ models that focus on the satisfaction of the social media tools for organizational members, rather than the utility of social media as a tool to meet organizational goals. Our research offers a means to explore the utility of social media from an organization perspective. Second, the exemplar case for our research, Blue Star Families, Inc., is a non-profit organization whose mission is to create and nurture a virtual community spread over a large geographical — if not global — area. Because Blue Star Families was founded as an online organization that could not exist without social media, it provides a case for which social media is a critical component of the organization’s activity. Finally, we offer some “lessons learned” from our case to identify issues for other organizations seeking to create a significant social media presence.

This paper is organized as follows: first, the growth of social media is briefly addressed to provide background context. Second, previous models and frameworks exploring social media are discussed. This is followed by a presentation of a new model exploring the use of social media from an organizational perspective, starting with the driver of a non-profit organization’s mission, to its desired outcomes of advocacy and civic engagement. Third, the case study methodology is explained. Next, we present an analysis and discussion applying the new model to Blue Star Families’ use of social media platforms. We conclude by discussing the challenges of social media revealed in the case study analysis, and we offer recommendations to address these challenges….(More)”

How the USGS uses Twitter data to track earthquakes


Twitter Blog: “After the disastrous Sichuan earthquake in 2008, people turned to Twitter to share firsthand information about the earthquake. What amazed many was the impression that Twitter was faster at reporting the earthquake than the U.S. Geological Survey (USGS), the official government organization in charge of tracking such events.

This Twitter activity wasn’t a big surprise to the USGS. The USGS National Earthquake Information Center (NEIC) processes data from about 2,000 real-time earthquake sensors, with the majority based in the United States. That leaves a lot of empty space in the world with no sensors. On the other hand, there are hundreds of millions of people using Twitter who can report earthquakes. At first, the USGS staff was a bit skeptical that Twitter could be used as a detection system for earthquakes – but when they looked into it, they were surprised at the effectiveness of Twitter data for detection.

USGS staffers Paul Earle, a seismologist, and Michelle Guy, a software developer, teamed up to look at how Twitter data could be used for earthquake detection and verification. Using Twitter’s public API, they applied the same time-series event-detection method they use to detect earthquakes. This gave them a baseline for earthquake-related chatter, but they decided to dig in even further. They found that people Tweeting about actual earthquakes kept their Tweets really short, even just to ask, “earthquake?” Concluding that people who are experiencing earthquakes aren’t very chatty, they started filtering out Tweets with more than seven words. They also recognized that people sharing links or the size of the earthquake were significantly less likely to be offering firsthand reports, so they filtered out any Tweets containing a link or a number. Ultimately, this filtered stream proved very effective at determining when earthquakes occurred globally.
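The filtering heuristics described above – keep only very short Tweets, and drop anything containing a link or a number – are simple enough to sketch. This is a rough illustration of the stated rules, not the USGS’s actual code:

```python
def is_candidate_report(tweet_text):
    """Heuristic filter for likely firsthand earthquake reports,
    following the rules described in the article: short, no links, no numbers."""
    words = tweet_text.split()
    if len(words) > 7:
        return False  # chatty tweets are rarely firsthand reports
    if any(w.startswith(("http://", "https://")) for w in words):
        return False  # links suggest secondhand news, not direct experience
    if any(ch.isdigit() for ch in tweet_text):
        return False  # magnitudes come from official reports, not experience
    return True

print(is_candidate_report("earthquake?"))                          # likely firsthand
print(is_candidate_report("M6.1 quake hits Chile http://t.co/x"))  # filtered out
```

Counting how many tweets survive this filter per unit time then gives the clean signal the event-detection step runs on.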

USGS Modeling Twitter Data to Detect Earthquakes

While I was at the USGS office in Golden, Colo. interviewing Michelle and Paul, three earthquakes happened in a relatively short time. Using Twitter data, their system was able to pick up on an aftershock in Chile within one minute and 20 seconds – and it only took 14 Tweets from the filtered stream to trigger an email alert. The other two earthquakes, off Easter Island and Indonesia, weren’t picked up because they were not widely felt…..
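In the anecdote above, an alert fired after 14 filtered Tweets arrived within about 80 seconds. A minimal sliding-window burst detector over tweet timestamps might look like the following; the threshold and window values are assumptions for illustration, not the USGS’s actual parameters:

```python
from collections import deque

class TweetBurstDetector:
    """Flag a burst when `threshold` filtered tweets arrive within `window_s` seconds."""

    def __init__(self, threshold=14, window_s=80):
        self.threshold = threshold
        self.window_s = window_s
        self.times = deque()  # timestamps of recent filtered tweets

    def observe(self, t):
        """Record a filtered tweet at time t (seconds); return True on a burst."""
        self.times.append(t)
        # Evict tweets that have fallen out of the sliding window
        while self.times and t - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) >= self.threshold

detector = TweetBurstDetector()
alerts = [detector.observe(t) for t in range(0, 28, 2)]  # 14 tweets, 2 s apart
print(alerts[-1])
```

In a production system, a True return would presumably trigger the e-mail alert described in the article rather than a print.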

The USGS monitors for earthquakes in many languages, and the words used can be a clue as to the magnitude and location of the earthquake. Chile has two words for earthquakes: terremoto and temblor; terremoto is used to indicate a bigger quake. This one in Chile started with people asking whether it was a terremoto, and others concluding that it was a temblor.

As the USGS team notes, Twitter data augments their own detection work on felt earthquakes. If they’re getting reports of an earthquake in a populated area but no Tweets from there, that’s a good indicator to them that it’s a false alarm. It’s also very cost-effective for the USGS, because they use Twitter’s public API and open-source software such as Kibana and Elasticsearch to help determine when earthquakes occur….(More)”

Web design plays a role in how much we reveal online


European Commission: “A JRC study, “Nudges to Privacy Behaviour: Exploring an Alternative Approach to Privacy Notices”, used behavioural sciences to look at how individuals react to different types of privacy notices. Specifically, the authors analysed users’ reactions to modified choice architecture (i.e. the environment in which decisions take place) of web interfaces.

Two types of privacy behaviour were measured: passive disclosure, when people unwittingly disclose personal information, and direct disclosure, when people make an active choice to reveal personal information. After testing different designs with over 3,000 users from the UK, Italy, Germany and Poland, the results show that the web interface affects decisions on disclosing personal information. The study also explored differences related to country of origin, gender, education level and age.

A depiction of a person’s face on the website led people to reveal more personal information. Also, this design choice and the visualisation of the user’s IP or browsing history had an impact on people’s awareness of a privacy notice. If confirmed, these features are particularly relevant for habitual and instinctive online behaviour.

With regard to education, users who had attended (though not necessarily graduated from) college felt significantly less observed or monitored and more comfortable answering questions than those who never went to college. This result challenges the assumption that the better educated are more aware of information tracking practices. Further investigation, perhaps of a qualitative nature, could help dig deeper into this issue. On the other hand, people with a lower level of education were more likely to reveal personal information unwittingly. This behaviour appeared to be due to the fact that non-college attendees were simply less aware that some online behaviour revealed personal information about themselves.

Strong differences between countries were noticed, indicating a relation between cultures and information disclosure. Even though participants in Italy revealed the most personal information in passive disclosure, in direct disclosure they revealed less than in other countries. Approximately 75% of participants in Italy chose to answer positively to at least one stigmatised question, compared to 81% in Poland, 83% in Germany and 92% in the UK.

Approximately 73% of women answered ‘never’ to the questions asking whether they had ever engaged in socially stigmatised behaviour, compared to 27% of males. This large difference could be due to the nature of the questions (e.g. about alcohol consumption, which might be more acceptable for males). It could also suggest women feel under greater social scrutiny or are simply more cautious when disclosing personal information.

These results could offer valuable insights to inform European policy decisions, despite the fact that the study has targeted a sample of users in four countries in an experimental setting. Major web service providers are likely to have extensive amounts of data on how slight changes to their services’ privacy controls affect users’ privacy behaviour. The authors of the study suggest that collaboration between web providers and policy-makers can lead to recommendations for web interface design that allow for conscientious disclosure of privacy information….(More)”