Who Are You Calling Irrational?


New paper by Aneil Kovvali: “Cass Sunstein is the leading advocate of ‘nudges’ – small policy interventions that yield major impacts because of behavioral quirks in the way that people process information. Such interventions form the core of Sunstein’s philosophy of ‘libertarian paternalism,’ which seeks to improve on individuals’ decisions while preserving their freedom to choose. In ‘Why Nudge?’, Sunstein forcefully defends libertarian paternalism against John Stuart Mill’s famous Harm Principle, which holds that government should coerce a person only when it is acting to prevent harm to others. Sunstein urges that, unlike more coercive measures, nudges respect subjects’ goals even as they reshape their choices. Using an analogy to voting paradoxes, this review shows that reconciling multiple, inconsistent goals is a fundamentally challenging problem; the challenge leaves even deliberative individuals vulnerable to manipulation through nudges. The fact of inconsistent goals means that government regulators who deploy nudges select and impose their own objectives, instead of merely advancing the goals of the regulated. The analogy also highlights that multimember legislative bodies are subject to many of the same quirks as individuals, raising questions about the government’s ability to improve on individuals’ choices….(More)”
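The voting-paradox analogy the review draws on can be made concrete with a toy Condorcet cycle. The sketch below is purely illustrative and not from the paper: three internal “criteria” each rank three goals, and the pairwise majorities among those goals form a cycle, so no single ordering of goals is stable — which is the kind of inconsistency a nudge can exploit.

```python
# Three "criteria" inside one decision-maker, each ranking three
# hypothetical goals from best to worst. This is the classic
# Condorcet configuration.
rankings = [
    ["health", "savings", "leisure"],
    ["savings", "leisure", "health"],
    ["leisure", "health", "savings"],
]

def majority_prefers(a, b):
    """True if a majority of criteria rank goal a above goal b."""
    wins = sum(r.index(a) < r.index(b) for r in rankings)
    return wins > len(rankings) / 2

# The pairwise majorities form a cycle: health beats savings,
# savings beats leisure, yet leisure beats health.
assert majority_prefers("health", "savings")
assert majority_prefers("savings", "leisure")
assert majority_prefers("leisure", "health")
```

Because every goal loses to some other goal, whichever option is made the default (the nudge) can be framed as “what the person really wants” — the point the review presses against libertarian paternalism.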

 

Quantifying Crowd Size with Mobile Phone and Twitter Data


“Being able to infer the number of people in a specific area is of extreme importance for the avoidance of crowd disasters and to facilitate emergency evacuations. Here, using a football stadium and an airport as case studies, we present evidence of a strong relationship between the number of people in restricted areas and activity recorded by mobile phone providers and the online service Twitter. Our findings suggest that data generated through our interactions with mobile phone networks and the Internet may allow us to gain valuable measurements of the current state of society….(More)”
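The “strong relationship” the authors report is essentially a calibration problem: fit a simple model mapping observed activity to counted people, then use it to estimate crowd size. A minimal sketch, using made-up numbers rather than the paper’s data:

```python
import numpy as np

# Hypothetical hourly observations: geotagged tweets from inside a
# stadium vs. people counted at the turnstiles. The numbers are
# invented; the paper finds a comparably strong linear relationship
# in real mobile phone and Twitter data.
tweets = np.array([12, 40, 85, 150, 210, 260], dtype=float)
people = np.array([400, 1500, 3200, 5600, 7900, 9800], dtype=float)

# Calibrate a linear model on hours where ground truth is available.
slope, intercept = np.polyfit(tweets, people, 1)

def estimate_crowd(tweet_count):
    """Predict the number of people from observed Twitter activity."""
    return slope * tweet_count + intercept

# For this toy data the correlation is very high.
r = np.corrcoef(tweets, people)[0, 1]
assert r > 0.99
```

Once calibrated, `estimate_crowd` can be applied to hours (or venues) without turnstile counts — the practical payoff for crowd-safety monitoring that the abstract describes.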

How can we ensure that cities create opportunities for healthy urbanization?


Blog by Roy Ahn, Thomas F. Burke & Anita M. McGahan on their new book: “By the year 2100, 8 out of 10 people in the world will reside in cities – a major change in demographics compared to 100 years ago.

Urbanization has sweeping consequences for population health. Most analysts evaluate the “specter of urbanization” by focusing on problems and challenges, which can include slum development, insecurity, and inequality.

As the World Health Organization and UN Habitat note in their seminal report, Hidden Cities, “Cities concentrate opportunities, jobs and services, but they also concentrate risks and hazards for health.” The urban poor are especially vulnerable because their housing conditions and access to clean water, sanitation, and health care are often severely compromised.

Additionally, the jobs available to the urban poor are often informal, dangerous, and temporary. Yet the same gaps in integrated governance and infrastructure that cause urbanization problems can also create remarkable and often untapped opportunities for improving health. How can we ensure that cities create opportunities for healthy urbanization?

In our new book, Innovating for Healthy Urbanization, we argue that using the “innovations” lens can provide a unique platform through which solutions for urbanization and health can emerge.

Sometimes “innovations” can be decidedly high tech, such as holograms on medication packaging that protect against drug counterfeiters, or tiny filter paper tests costing pennies that exponentially increase access to medical diagnostic testing for poor people living in cities.

Other innovations are less tech-focused, but equally impactful, such as advocating for motorcycle helmet laws in cities or a low-cost, condom catheter-balloon kit that can save mothers from dying from postpartum hemorrhage.

What makes both high- and low-tech solutions effective? Pushing the envelope on what works and then integrating solutions to meet a community’s priority needs…..(More)”

Datafication and empowerment: How the open data movement re-articulates notions of democracy, participation, and journalism


Paper by Stefan Baack at Big Data and Society: “This article shows how activists in the open data movement re-articulate notions of democracy, participation, and journalism by applying practices and values from open source culture to the creation and use of data. Focusing on the Open Knowledge Foundation Germany and drawing from a combination of interviews and content analysis, it argues that this process leads activists to develop new rationalities around datafication that can support the agency of datafied publics. Three modulations of open source are identified: First, by regarding data as a prerequisite for generating knowledge, activists transform the sharing of source code to include the sharing of raw data. Sharing raw data should break the interpretative monopoly of governments and would allow people to make their own interpretation of data about public issues. Second, activists connect this idea to an open and flexible form of representative democracy by applying the open source model of participation to political participation. Third, activists acknowledge that intermediaries are necessary to make raw data accessible to the public. This leads them to an interest in transforming journalism to become an intermediary in this sense. At the same time, they try to act as intermediaries themselves and develop civic technologies to put their ideas into practice. The article concludes by suggesting that the practices and ideas of open data activists are relevant because they illustrate the connection between datafication and open source culture and help us understand how datafication might support the agency of publics and actors outside big government and big business….(More)”

Crowdsourcing: a survey of applications


Paper by Jayshri Namdeorao Ganthade, Sunil R. Gupta: “Crowdsourcing, itself a multidisciplinary field, can be well-served by incorporating theories and methods from affective computing. We present various applications based on crowdsourcing. Research on its principles and methods can enable general problems to be solved via human computation systems. Crowdsourcing is the act of outsourcing tasks to a large group of people through an open request via the Internet. It has become popular among social scientists as a way to recruit research participants from the general public for studies. Crowdsourcing is often introduced as a new online distributed problem-solving model in which networked people collaborate to complete a task and produce a result. However, the idea of crowdsourcing is not new, and can be traced back to Charles Darwin. Darwin was interested in studying the universality of facial expressions in conveying emotions. This required a large body of observations, so he gathered reports from a global population of correspondents in order to reach more general conclusions.
This paper provides an introduction to crowdsourcing, guidelines for using it, and a survey of its applications in various fields. Finally, the article draws conclusions based on those applications….(More)”.

 

Digital government evolution: From transformation to contextualization


Paper by Tomasz Janowski in the Government Information Quarterly: “The Digital Government landscape is continuously changing to reflect how governments are trying to find innovative digital solutions to social, economic, political and other pressures, and how they transform themselves in the process. Understanding and predicting such changes is important for policymakers, government executives, researchers and all those who prepare, make, implement or evaluate Digital Government decisions. This article argues that the concept of Digital Government evolves toward more complexity and greater contextualization and specialization, similar to evolution-like processes that lead to changes in cultures and societies. To this end, the article presents a four-stage Digital Government Evolution Model comprising Digitization (Technology in Government), Transformation (Electronic Government), Engagement (Electronic Governance) and Contextualization (Policy-Driven Electronic Governance) stages; provides some evidence in support of this model drawing upon a study of the Digital Government literature published in Government Information Quarterly between 1992 and 2014; and presents a Digital Government Stage Analysis Framework to explain the evolution. As the article consolidates a representative body of the Digital Government literature, it could also be used for defining and integrating future research in the area….(More)”

Innovation Experiments: Researching Technical Advance, Knowledge Production and the Design of Supporting Institutions


Paper by Kevin J. Boudreau and Karim Lakhani: “This paper discusses several challenges in designing field experiments to better understand how organizational and institutional design shapes innovation outcomes and the production of knowledge. We proceed to describe the field experimental research program carried out by our Crowd Innovation Laboratory at Harvard University to clarify how we have attempted to address these research design challenges. This program has simultaneously solved important practical innovation problems for partner organizations, like NASA and Harvard Medical School, while contributing research advances, particularly in relation to innovation contests and tournaments….(More)

Citizen Sensor Data Mining, Social Media Analytics and Applications


Paper by Amit P. Sheth: “With the rapid rise in the popularity of social media (1B+ Facebook users, 200M+ Twitter users), and near ubiquitous mobile access (4+ billion actively-used mobile phones), the sharing of observations and opinions has become common-place (500M+ tweets a day). This has given us unprecedented access to the pulse of a populace and the ability to perform analytics on social data to support a variety of socially intelligent applications — be it for brand tracking and management, crisis coordination, organizing revolutions or promoting social development in underdeveloped and developing countries. I will review: 1) understanding and analysis of informal text, esp. microblogs (e.g., issues of cultural entity extraction and the role of semantic/background knowledge enhanced techniques), and 2) how we built Twitris, a comprehensive social media analytics (social intelligence) platform. I will describe the analysis capabilities along three dimensions: spatio-temporal-thematic, people-content-network, and sentiment-emotion-intent. I will couple technical insights with identification of computational techniques and real-world examples using live demos of Twitris….(More)”
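The first of those dimensions — spatio-temporal-thematic analysis — amounts to bucketing messages by where, when, and what. The toy sketch below is not Twitris code; it only illustrates the aggregation idea on three invented tweets, using hashtags as a crude stand-in for themes.

```python
from collections import Counter
from datetime import datetime

# Invented tweets: (timestamp, region, text). A real platform like
# Twitris ingests streams at vastly larger scale and extracts themes
# with entity extraction, not just hashtags.
tweets = [
    ("2015-04-25 12:05", "Kathmandu", "earthquake help needed #NepalQuake"),
    ("2015-04-25 12:40", "Kathmandu", "roads blocked #NepalQuake"),
    ("2015-04-25 13:10", "Delhi",     "donating to relief funds #NepalQuake"),
]

def spatio_temporal_thematic(tweets):
    """Count tweets per (region, hour, hashtag) cell."""
    cells = Counter()
    for ts, region, text in tweets:
        # Truncate the timestamp to the hour to get the temporal bucket.
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%Y-%m-%d %H:00")
        for token in text.split():
            if token.startswith("#"):
                cells[(region, hour, token)] += 1
    return cells

cells = spatio_temporal_thematic(tweets)
assert cells[("Kathmandu", "2015-04-25 12:00", "#NepalQuake")] == 2
```

Slicing the resulting cells along any one axis recovers a per-region, per-hour, or per-theme view — the kind of pivoting a crisis coordinator would use.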

The data or the hunch


Ian Leslie at Intelligent Life: “THE GIFT FOR talent-spotting is mysterious, highly prized and celebrated. We love to hear stories about the baseball coach who can spot the raw ability of an erratic young pitcher, the boss who sees potential in the guy in the post room, the director who picks a soloist out of the chorus line. Talent shows are a staple of the TV schedules. We like to believe that certain people—sometimes ourselves—can just sense when a person has something special. But there is another method of spotting talent which doesn’t rely on hunches. In place of intuition, it offers data and analysis. Rather than relying on the gut, it invites us to use our heads. It tends not to make for such romantic stories, but it is effective—which is why, despite our affection, the hunch is everywhere in retreat.

Strike one against the hunch was the publication of “Moneyball” by Michael Lewis (2003), which has attained the status of a management manual for many in sport and beyond. Lewis reported on a cash-strapped major-league baseball team, the Oakland A’s, who enjoyed unlikely success against bigger and better-funded competitors. Their secret sauce was data. Their general manager, Billy Beane, had realised that when it came to evaluating players, the gut instincts of experienced baseball scouts were unreliable, and he employed statisticians to identify talent overlooked by the big clubs…..

These days, when a football club is interested in a player, it considers the average distance he runs in a game, the number of passes and tackles or blocks he makes, his shots on goal, the ratio of goals to shots, and many other details nobody thought to measure a generation ago. Sport is far from the only industry in which talent-spotting is becoming a matter of measurement. Prithwijit Mukerji, a postgraduate at the University of Westminster in London, recently published a paper on the way the music industry is being transformed by “the Moneyball approach”. By harvesting data from Facebook and Twitter and music services like Spotify and Shazam, executives can track what we are listening to in far more detail than ever before, and use it as a guide to what we will listen to next….
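One of the metrics listed above, the ratio of goals to shots, shows why measurement can overturn a hunch: the player who scores most is not necessarily the most efficient finisher. A minimal sketch with invented players and numbers:

```python
# Toy per-player match statistics of the kind clubs now routinely
# collect. Names and figures are invented for illustration.
players = {
    "Player A": {"shots": 40, "goals": 12, "passes": 310, "km_run": 10.4},
    "Player B": {"shots": 55, "goals": 11, "passes": 480, "km_run": 11.9},
}

def conversion_rate(stats):
    """Goals divided by shots -- the efficiency metric, not raw volume."""
    return stats["goals"] / stats["shots"] if stats["shots"] else 0.0

# Player A takes fewer shots but converts a larger share of them
# (0.30 vs 0.20), which a goals-only comparison would hide.
best = max(players, key=lambda p: conversion_rate(players[p]))
assert best == "Player A"
```

The same ratio logic generalizes to any of the other counting stats: divide output by opportunity before comparing players.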

This is the day of the analyst. In education, academics are working their way towards a reliable method of evaluating teachers, by running data on test scores of pupils, controlled for factors such as prior achievement and raw ability. The methodology is imperfect, but research suggests that it’s not as bad as just watching someone teach. A 2011 study led by Michael Strong at the University of California identified a group of teachers who had raised student achievement and a group who had not. They showed videos of the teachers’ lessons to observers and asked them to guess which were in which group. The judges tended to agree on who was effective and ineffective, but, 60% of the time, they were wrong. They would have been better off flipping a coin. This applies even to experts: the Gates Foundation funded a vast study of lesson observations, and found that the judgments of trained inspectors were highly inconsistent.
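The “controlled for prior achievement” step described above is the core of a value-added model: regress current test scores on prior scores across all pupils, then credit each teacher with the average residual of their pupils. A minimal sketch on invented records, not the methodology of any particular study:

```python
import numpy as np

# Invented pupil records: (teacher, prior_score, current_score).
records = [
    ("T1", 50, 62), ("T1", 70, 80), ("T1", 60, 71),
    ("T2", 50, 55), ("T2", 70, 72), ("T2", 60, 63),
]

prior = np.array([r[1] for r in records], dtype=float)
current = np.array([r[2] for r in records], dtype=float)

# Fit the pooled relationship between prior and current scores.
slope, intercept = np.polyfit(prior, current, 1)

# A pupil's residual is how far they land above or below the score
# their prior achievement predicts.
residuals = current - (slope * prior + intercept)

# A teacher's value-added is the mean residual of their pupils.
value_added = {}
for (teacher, _, _), resid in zip(records, residuals):
    value_added.setdefault(teacher, []).append(resid)
value_added = {t: float(np.mean(rs)) for t, rs in value_added.items()}

# T1's pupils beat their predicted scores; T2's fall short.
assert value_added["T1"] > value_added["T2"]
```

Real value-added models add more controls (demographics, classroom composition) and shrinkage to handle small samples, but the residual-averaging logic is the same.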

THE LAST STRONGHOLD of the hunch is the interview. Most employers and some universities use interviews when deciding whom to hire or admit. In a conventional, unstructured interview, the candidate spends half an hour or so in a conversation directed at the whim of the interviewer. If you’re the one deciding, this is a reassuring practice: you feel as if you get a richer impression of the person than from the bare facts on their résumé, and that this enables you to make a better decision. The first theory may be true; the second is not.

Decades of scientific evidence suggest that the interview is close to useless as a tool for predicting how someone will do a job. Study after study has found that organisations make better decisions when they go by objective data, like the candidate’s qualifications, track record and performance in tests. “The assumption is, ‘if I meet them, I’ll know’,” says Jason Dana, of Yale School of Management, one of many scholars who have looked into the interview’s effectiveness. “People are wildly over-confident in their ability to do this, from a short meeting.” When employers adopt a holistic approach, combining the data with hunches formed in interviews, they make worse decisions than they do going on facts alone….” (More)

Crowdsourcing Solutions for Disaster Response: Examples and Lessons for the US Government


Paper by David Becker and Samuel Bendett in Procedia Engineering: “Crowdsourcing has become a quick and efficient way to solve a wide variety of problems – technical solutions, social and economic actions, fundraising and troubleshooting of numerous issues that affect both the private and the public sectors. The US government is now actively using crowdsourcing to solve complex problems that previously had to be handled by a limited circle of professionals. This paper outlines several examples of how a Department of Defense project headquartered at the National Defense University is using crowdsourcing to find solutions to disaster response problems….(More)”