Innovation Experiments: Researching Technical Advance, Knowledge Production and the Design of Supporting Institutions


Paper by Kevin J. Boudreau and Karim Lakhani: “This paper discusses several challenges in designing field experiments to better understand how organizational and institutional design shapes innovation outcomes and the production of knowledge. We proceed to describe the field experimental research program carried out by our Crowd Innovation Laboratory at Harvard University to clarify how we have attempted to address these research design challenges. This program has simultaneously solved important practical innovation problems for partner organizations, like NASA and Harvard Medical School, while contributing research advances, particularly in relation to innovation contests and tournaments….(More)”

Citizen Sensor Data Mining, Social Media Analytics and Applications


Paper by Amit P. Sheth: “With the rapid rise in the popularity of social media (1B+ Facebook users, 200M+ Twitter users), and near ubiquitous mobile access (4+ billion actively-used mobile phones), the sharing of observations and opinions has become commonplace (500M+ tweets a day). This has given us unprecedented access to the pulse of a populace and the ability to perform analytics on social data to support a variety of socially intelligent applications — be it for brand tracking and management, crisis coordination, organizing revolutions or promoting social development in underdeveloped and developing countries. I will review: 1) understanding and analysis of informal text, esp. microblogs (e.g., issues of cultural entity extraction and role of semantic/background knowledge enhanced techniques), and 2) how we built Twitris, a comprehensive social media analytics (social intelligence) platform. I will describe the analysis capabilities along three dimensions: spatio-temporal-thematic, people-content-network, and sentiment-emotion-intent. I will couple technical insights with identification of computational techniques and real-world examples using live demos of Twitris….(More)”

The data or the hunch


Ian Leslie at Intelligent Life: “THE GIFT FOR talent-spotting is mysterious, highly prized and celebrated. We love to hear stories about the baseball coach who can spot the raw ability of an erratic young pitcher, the boss who sees potential in the guy in the post room, the director who picks a soloist out of the chorus line. Talent shows are a staple of the TV schedules. We like to believe that certain people—sometimes ourselves—can just sense when a person has something special. But there is another method of spotting talent which doesn’t rely on hunches. In place of intuition, it offers data and analysis. Rather than relying on the gut, it invites us to use our heads. It tends not to make for such romantic stories, but it is effective—which is why, despite our affection, the hunch is everywhere in retreat.

Strike one against the hunch was the publication of “Moneyball” by Michael Lewis (2003), which has attained the status of a management manual for many in sport and beyond. Lewis reported on a cash-strapped major-league baseball team, the Oakland A’s, who enjoyed unlikely success against bigger and better-funded competitors. Their secret sauce was data. Their general manager, Billy Beane, had realised that when it came to evaluating players, the gut instincts of experienced baseball scouts were unreliable, and he employed statisticians to identify talent overlooked by the big clubs…..

These days, when a football club is interested in a player, it considers the average distance he runs in a game, the number of passes and tackles or blocks he makes, his shots on goal, the ratio of goals to shots, and many other details nobody thought to measure a generation ago. Sport is far from the only industry in which talent-spotting is becoming a matter of measurement. Prithwijit Mukerji, a postgraduate at the University of Westminster in London, recently published a paper on the way the music industry is being transformed by “the Moneyball approach”. By harvesting data from Facebook and Twitter and music services like Spotify and Shazam, executives can track what we are listening to in far more detail than ever before, and use it as a guide to what we will listen to next….

This is the day of the analyst. In education, academics are working their way towards a reliable method of evaluating teachers, by running data on test scores of pupils, controlled for factors such as prior achievement and raw ability. The methodology is imperfect, but research suggests that it’s not as bad as just watching someone teach. A 2011 study led by Michael Strong at the University of California identified a group of teachers who had raised student achievement and a group who had not. They showed videos of the teachers’ lessons to observers and asked them to guess which were in which group. The judges tended to agree on who was effective and ineffective, but, 60% of the time, they were wrong. They would have been better off flipping a coin. This applies even to experts: the Gates Foundation funded a vast study of lesson observations, and found that the judgments of trained inspectors were highly inconsistent.

THE LAST STRONGHOLD of the hunch is the interview. Most employers and some universities use interviews when deciding whom to hire or admit. In a conventional, unstructured interview, the candidate spends half an hour or so in a conversation directed at the whim of the interviewer. If you’re the one deciding, this is a reassuring practice: you feel as if you get a richer impression of the person than from the bare facts on their résumé, and that this enables you to make a better decision. The first theory may be true; the second is not.

Decades of scientific evidence suggest that the interview is close to useless as a tool for predicting how someone will do a job. Study after study has found that organisations make better decisions when they go by objective data, like the candidate’s qualifications, track record and performance in tests. “The assumption is, ‘if I meet them, I’ll know’,” says Jason Dana, of Yale School of Management, one of many scholars who have looked into the interview’s effectiveness. “People are wildly over-confident in their ability to do this, from a short meeting.” When employers adopt a holistic approach, combining the data with hunches formed in interviews, they make worse decisions than they do going on facts alone….” (More)

Crowdsourcing Solutions for Disaster Response: Examples and Lessons for the US Government


Paper by David Becker and Samuel Bendett in Procedia Engineering: “Crowdsourcing has become a quick and efficient way to solve a wide variety of problems – technical solutions, social and economic actions, fundraising and troubleshooting of numerous issues that affect both the private and the public sectors. The US government is now actively using crowdsourcing to solve complex problems that previously had to be handled by a limited circle of professionals. This paper outlines several examples of how a Department of Defense project headquartered at the National Defense University is using crowdsourcing for solutions to disaster response problems….(More)”

 

Democratising the Data Revolution


Jonathan Gray at Open Knowledge: “What will the “data revolution” do? What will it be about? What will it count? What kinds of risks and harms might it bring? Whom and what will it serve? And who will get to decide?

Today we are launching a new discussion paper on “Democratising the Data Revolution”, which is intended to advance thinking and action around civil society engagement with the data revolution. It looks beyond the disclosure of existing information, towards more ambitious and substantive forms of democratic engagement with data infrastructures.1

It concludes with a series of questions about what practical steps institutions and civil society organisations might take to change what is measured and how, and how these measurements are put to work.

You can download the full PDF report here, or continue to read on in this blog post.

What Counts?

How might civil society actors shape the data revolution? In particular, how might they go beyond the question of what data is disclosed towards looking at what is measured in the first place? To kickstart discussion around this topic, we will look at three kinds of intervention: changing existing forms of measurement, advocating new forms of measurement and undertaking new forms of measurement.

Changing Existing Forms of Measurement

Rather than just focusing on the transparency, disclosure and openness of public information, civil society groups can argue for changing what is measured with existing data infrastructures. One example of this is recent campaigning around company ownership in the UK. Advocacy groups wanted to unpick networks of corporate ownership and control in order to support their campaigning and investigations around tax avoidance, tax evasion and illicit financial flows.

While the UK company register recorded information about “nominal ownership”, it did not include information about so-called “beneficial ownership”, or who ultimately benefits from the ownership and control of companies. Campaigners undertook an extensive programme of activities to advocate for changes and extensions to existing data infrastructures – including via legislation, software systems, and administrative protocols.2

Advocating New Forms of Measurement

As well as changing or recalibrating existing forms of measurement, campaigners and civil society organisations can make the case for the measurement of things which were not previously measured. For example, over the past several decades social and political campaigning has resulted in new indicators about many different issues – such as gender inequality, health, work, disability, pollution or education.3 In such cases activists aimed to establish a given indicator as important and relevant for public institutions, decision makers, and broader publics – in order to, for example, inform policy development or resource allocation.

Undertaking New Forms of Measurement

Historically, many civil society organisations and advocacy groups have collected their own data to make the case for action on issues that they work on – from human rights abuses to endangered species….(More)”

Mining citizen emotions to estimate the urgency of urban issues


Christian Masdeval and Adriano Veloso in Information Systems: “Crowdsourcing technology offers exciting possibilities for local governments. Specifically, citizens are increasingly taking part in reporting and discussing issues related to their neighborhood and problems they encounter on a daily basis, such as overflowing trash-bins, broken footpaths and lifts, illegal graffiti, and potholes. Pervasive citizen participation enables local governments to respond more efficiently to these urban issues. This interaction between citizens and municipalities is largely promoted by civic engagement platforms, such as SeeClickFix, FixMyStreet, CitySourced, and OpenIDEO, which allow citizens to report urban issues by entering free text describing what needs to be done, fixed or changed. In order to develop appropriate action plans and priorities, government officials need to figure out how urgent the reported issues are. In this paper we propose to estimate the urgency of urban issues by mining different emotions that are implicit in the text describing the issue. More specifically, a reported issue is first categorized according to the emotions expressed in it, and then the corresponding emotion scores are combined in order to produce a final urgency level for the reported issue. Our experiments use the SeeClickFix hackathon data and diverse emotion classification algorithms. They indicate that (i) emotions can be categorized efficiently with supervised learning algorithms, and (ii) the use of citizen emotions leads to accurate urgency estimates. Further, using additional features such as the type of issue or its author leads to no further accuracy gains….(More)”
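The two-stage scheme the abstract describes — classify the emotions in a report, then combine the per-emotion scores into a single urgency level — can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the emotion labels, weights, and thresholds are invented for the example, and the upstream supervised classifier is assumed to already emit per-emotion scores in [0, 1].

```python
# Hypothetical second stage: combine emotion-classifier scores into an
# urgency level. Weights and thresholds are illustrative assumptions.

EMOTION_WEIGHTS = {
    "anger": 1.0,
    "fear": 0.9,
    "disgust": 0.6,
    "sadness": 0.4,
    "joy": -0.5,  # positive emotion lowers urgency
}

def urgency_score(emotion_scores):
    """Weighted sum of per-emotion scores, clamped to [0, 1]."""
    raw = sum(EMOTION_WEIGHTS.get(e, 0.0) * s for e, s in emotion_scores.items())
    return max(0.0, min(1.0, raw))

def urgency_level(score):
    """Map a continuous score onto a coarse urgency label."""
    if score >= 0.7:
        return "high"
    if score >= 0.3:
        return "medium"
    return "low"

# Example: hypothetical classifier output for
# "Overflowing trash bin, rats everywhere!"
scores = {"anger": 0.6, "disgust": 0.5, "joy": 0.0}
print(urgency_level(urgency_score(scores)))  # → high
```

In practice the weights and cut-offs would themselves be fit against labeled urgency data rather than set by hand, which is closer to the supervised setup the authors report.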

Using social media in hotel crisis management: the case of bed bugs


Social media has helped to bridge the communication gap between customers and hotels. Bed bug infestations are a growing health crisis and have obtained increasing attention on social media sites. Without managing this crisis effectively, bed bug infestation can cause economic loss and reputational damage to hotel properties, ranging from negative comments and complaints, to possible lawsuits. Thus, it is essential for hoteliers to understand the importance of social media in crisis communication, and to incorporate social media in hotels’ crisis management plans.

This study serves as one of the first attempts in the hospitality field to offer discussions and recommendations on how hotels can manage the bed bug crisis and other crises of this kind by incorporating social media into their crisis management practices….(More)”

Defining Public Engagement: A four-level approach.


Della Rucker’s Chapter 2 for an Online Public Engagement Book: “….public engagement typically means presenting information on a project or draft plan and addressing questions or comments. For planners working on long-range issues, such as a comprehensive plan, typical public engagement actions may include feedback questions, such as “what should this area look like?” or “what is your vision for the future of the neighborhood?” Such questions, while inviting participants to take a more active role in community decision-making than the largely passive viewer/commenter in the first example, still place the resident in a peripheral role: that of an information source, functionally similar to the demographic data and GIS map layers that the professionals use to develop plans.

In a relatively small number of cases, planners and community advocates have found more robust and more direct means of engaging residents in decision-making around the future of their communities. Public engagement specialists, often originating from a community development or academic background, have developed a variety of methods, such as World Cafe and the Fishbowl, that are designed to facilitate more meaningful sharing of information among community residents, often as much with the intent of building connectivity and mutual understanding among residents of different backgrounds as for the purpose of making policy decisions.

Finally, a small but growing number of strategies have begun to emerge that place the work of making community decisions directly in the hands of private residents. Participatory budgeting allocates the decision about how to use a portion of a community’s budget to a citizen-based process, and participants work collaboratively through a process that determines what projects or initiatives will be funded in the coming budget cycle. And in the collection of tactics generally known as tactical urbanism or [other names], residents directly intervene in the physical appearance or function of the community by building and placing street furniture, changing parking spaces or driving lanes to pedestrian use, creating and installing new signs, or making other kinds of physical, typically temporary, changes — sometimes with, and sometimes without, the approval of the local government. The purposes of tactical urbanist interventions are twofold: they physically demonstrate the potential impact that more permanent features would have on the community’s transportation and quality of life, and they give residents a concrete and immediate opportunity to impact their environs.

The direct impacts of either participatory budgeting or tactical urbanism initiatives tend to be limited — the amount of budget available for a participatory budgeting initiative is usually a fraction of the total budget, and the physical area impacted by a tactical urbanism event is generally limited to a few blocks. Anecdotal evidence from both types of activity, however, seems to indicate an increased understanding of community needs and an increased sense of agency, of having the power to influence one’s community’s future, among participants.

Online public engagement methods have the potential to facilitate a wide variety of public engagement, from making detailed project information more readily available to enabling crowdsourced decision-making around budget and policy choices. However, any discussion of online public engagement methods will soon run up against the same basic challenge: when we use that term, what kind of engagement — what kind of participant experience — are we talking about?

We could divide public participation tasks according to one of several existing organization systems, or taxonomies. The two most commonly used in public engagement theory and practice derive from Sherry R. Arnstein’s 1969 academic paper, “A Ladder of Citizen Participation,” and the International Association of Public Participation’s Public Participation Spectrum.

Although these two taxonomies reflect the same basic idea — that one’s options in selecting public engagement activities range along a spectrum from generally less to more active engagement on the part of the public — they divide and label the classifications differently. …From my perspective, both of these frameworks capture the central issue of recognizing more to less intensive public engagement options, but the number of divisions and the sometimes abstract wording appears to have made it difficult for these insights to find widespread use outside of an academic context. Practitioners who need to think through these options seem to have some tendency to become tangled in the fine-grained differentiations, and the terminology can both make these distinctions harder to think about and lead to the mistaken assumption that one is doing higher-level engagement than is actually the case. Among commercial online public engagement platform providers, blog posts claiming that their tool addresses the whole Spectrum appear on a relatively regular basis, even when the tool in question is designed for feedback, not decision-making.

For these reasons, this book will use the following framework of engagement types, which is detailed enough to demarcate what I think are the most crucial differentiations while at the same time keeping the framework simple enough to use in routine process planning.

The four engagement types we will talk about are: Telling; Asking; Discussing; Deciding…(More)”

Data, Human Rights & Human Security


Paper by Mark Latonero and Zachary Gold: “In today’s global digital ecosystem, mobile phone cameras can document and distribute images of physical violence. Drones and satellites can assess disasters from afar. Big data collected from social media can provide real-time awareness about political protests. Yet practitioners, researchers, and policymakers face unique challenges and opportunities when assessing technological benefit, risk, and harm. How can these technologies be used responsibly to assist those in need, prevent abuse, and protect people from harm?”

Mark Latonero and Zachary Gold address the issues in this primer for technologists, academics, business, governments, NGOs, intergovernmental organizations — anyone interested in the future of human rights and human security in a data-saturated world….(Download PDF)”

From Mechanism to Virtue: Evaluating Nudge-Theory


Paper by van der Heijden, Jeroen and Kosters, Mark: “Ever since Thaler and Sunstein published their influential Nudge, the book and the theory it presents have received great praise and opposition. Nudge-theory, and more particularly, nudging may be considered an additional strategy providing some novel instruments to the already rich governance toolbox. But what is its value? The current debates on Nudge-theory are often highly normative or ideologically driven and pay limited attention to more practical aspects of the theory: Whether and how is nudging evaluable as a theory and a practice? Is there solid evidence available of nudge success over other governance interventions? What is to be considered a nudge success at all? What data and evaluative techniques may assist in evaluating nudging beyond individual cases? The current article seeks to explore these questions… (More)”