How Much Development Data Is Enough?


Keith D. Shepherd at Project Syndicate: “Rapid advances in technology have dramatically lowered the cost of gathering data. Sensors in space, the sky, the lab, and the field, along with newfound opportunities for crowdsourcing and widespread adoption of the Internet and mobile telephones, are making large amounts of information available to those for whom it was previously out of reach. A small-scale farmer in rural Africa, for example, can now access weather forecasts and market prices at the tap of a screen.

This data revolution offers enormous potential for improving decision-making at every level – from the local farmer to world-spanning development organizations. But gathering data is not enough. The information must also be managed and evaluated – and doing this properly can be far more complicated and expensive than the effort to collect it. If the decisions to be improved are not first properly identified and analyzed, there is a high risk that much of the collection effort could be wasted or misdirected.

This conclusion is itself based on empirical analysis. The evidence is weak, for example, that monitoring initiatives in agriculture or environmental management have had a positive impact. Quantitative analysis of decisions across many domains, including environmental policy, business investments, and cyber security, has shown that people tend to overestimate the amount of data needed to make a good decision or misunderstand what type of data are needed.

Furthermore, grave errors can occur when large data sets are mined using machine algorithms without first having properly examined the decision that needs to be made. There are many examples of cases in which data mining has led to the wrong conclusion – including in medical diagnoses or legal cases – because experts in the field were not consulted and critical information was left out of the analysis.

Decision science, which combines understanding of behavior with universal principles of coherent decision-making, limits these risks by pairing empirical data with expert knowledge. If the data revolution is to be harnessed in the service of sustainable development, the best practices of this field must be incorporated into the effort.

The first step is to identify and frame frequently recurring decisions. In the field of development, these include large-scale decisions such as spending priorities – and thus budget allocations – by governments and international organizations. But they also include choices made on a much smaller scale: farmers pondering which crops to plant, how much fertilizer to apply, and when and where to sell their produce.

The second step is to build a quantitative model of the uncertainties in such decisions, including the various triggers, consequences, controls, and mitigants, as well as the different costs, benefits, and risks involved. Incorporating – rather than ignoring – difficult-to-measure, highly uncertain factors leads to the best decisions…..

The third step is to compute the value of obtaining additional information – something that is possible only if the uncertainties in all of the variables have been quantified. The value of information is the amount a rational decision-maker would be willing to pay for it. So we need to know where additional data will have value for improving a decision and how much we should spend to get it. In some cases, no further information may be needed to make a sound decision; in others, acquiring further data could be worth millions of dollars….(More)”
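
The logic of this third step can be made concrete with a small worked example. The sketch below computes the expected value of perfect information (EVPI) for a hypothetical two-crop planting decision; all scenario names, probabilities and payoffs are invented for illustration and are not taken from the article.

```python
# Minimal expected-value-of-perfect-information (EVPI) sketch.
# All scenarios, probabilities and payoffs below are hypothetical.

scenarios = {"good_rains": 0.6, "drought": 0.4}    # states of the world and their probabilities

payoffs = {                                        # net payoff of each action in each scenario
    "plant_maize":   {"good_rains": 1000, "drought": 200},
    "plant_sorghum": {"good_rains": 600,  "drought": 500},
}

# Expected payoff of each action under current uncertainty.
expected = {
    action: sum(p * payoffs[action][s] for s, p in scenarios.items())
    for action in payoffs
}
best_without_info = max(expected.values())          # 680, from plant_maize

# With perfect information we could pick the best action in each scenario.
best_with_info = sum(
    p * max(payoffs[a][s] for a in payoffs) for s, p in scenarios.items()
)                                                    # 800

evpi = best_with_info - best_without_info            # 120
print(f"EVPI (upper bound on what any further data is worth): {evpi:.0f}")
```

In this toy case, no data-gathering effort costing more than 120 would be worth undertaking – exactly the kind of threshold the third step is meant to produce.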

Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues


Press Release: “A new report from the Federal Trade Commission outlines a number of questions for businesses to consider to help ensure that their use of big data analytics, while producing many benefits for consumers, avoids outcomes that may be exclusionary or discriminatory.

“Big data’s role is growing in nearly every area of business, affecting millions of consumers in concrete ways,” said FTC Chairwoman Edith Ramirez. “The potential benefits to consumers are significant, but businesses must ensure that their big data use does not lead to harmful exclusion or discrimination.”

The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at big data at the end of its lifecycle – how it is used after being collected and analyzed – and draws on information from the FTC’s 2014 workshop, “Big Data: A Tool for Inclusion or Exclusion?,” as well as the Commission’s seminar on Alternative Scoring Products. The Commission also considered extensive public comments and additional public research in compiling the report.

The report highlights a number of innovative uses of big data that are providing benefits to underserved populations, including increased educational attainment, access to credit through non-traditional methods, specialized health care for underserved communities, and better access to employment.

In addition, the report looks at possible risks that could result from biases or inaccuracies about certain groups, including individuals being mistakenly denied opportunities based on the actions of others, the exposure of sensitive information, the creation or reinforcement of existing disparities, the targeting of vulnerable consumers for fraud, higher prices for goods and services in lower-income communities, and the weakening of the effectiveness of consumer choice.

The report outlines some of the various laws that apply to the use of big data, especially in regards to possible issues of discrimination or exclusion, including the Fair Credit Reporting Act, FTC Act and equal opportunity laws. It also provides a range of questions for businesses to consider when they examine whether their big data programs comply with these laws.

The report also proposes four key policy questions that are drawn from research into the ways big data can both present and prevent harms. The policy questions are designed to help companies determine how best to maximize the benefit of their use of big data while limiting possible harms, by examining both practical questions of accuracy and built-in bias as well as whether the company’s use of big data raises ethical or fairness concerns….(More)”

Social Media for Government Services


Book edited by Surya Nepal, Cécile Paris and Dimitrios Georgakopoulos: “This book highlights state-of-the-art research, development and implementation efforts concerning social media in government services, bringing together researchers and practitioners in a number of case studies. It elucidates a number of significant challenges associated with social media specific to government services, such as: the benefits of social media and methods of assessing them; the usability and suitability of tools, technologies and platforms; governance policies and frameworks; opportunities for new services; integrating social media with organisational business processes; and specific case studies. The book also highlights the range of uses and applications of social media in the government domain, at both local and federal levels. As such, it offers a valuable resource for a broad readership including academic researchers, practitioners in the IT industry, developers, and government policy- and decision-makers….(More)”

Living Labs: Concepts, Tools and Cases


Introduction: “This special issue on “Living labs: concepts, tools and cases” comes 10 years after the first scientific publications that defined the notion of living labs, but more than 15 years after the appearance of the first living lab projects (Ballon et al., 2005; Eriksson et al., 2005). This five-year gap demonstrates the extent to which living labs have been a practice-driven phenomenon. Right up to this day, they represent a pragmatic approach to innovation (of information and communication technologies [ICTs] and other artefacts), characterised by, among other things, experimentation in real life and the active involvement of users.

While there is now a certain body of literature that attempts to clarify and analyse the concept (Følstad, 2008; Almirall et al., 2012; Leminen et al., 2012), living lab practices are still under-researched, and a theoretical and methodological gap continues to exist in terms of the restricted amount and visibility of living lab literature vis-à-vis the rather large community of practice (Schuurman, 2015). The present special issue aims to assist in filling that gap.

This does not mean that the development of living labs has not been informed by scholarly literature previously (Ballon, 2015). Cornerstones include von Hippel’s (1988) work on user-driven innovation because of its emphasis on the ability of so-called lead users, rather than manufacturers, to create (mainly ICT) innovations. Another cornerstone is Silverstone’s (1993) theory on the domestication of ICTs that frames technology adoption as an ongoing struggle between users and technology where the user attempts to take control of the technological artefact and the technology comes to be fitted to users’ daily routines. It has been said that, in living labs, von Hippel’s concept of user-driven design and Silverstone’s insights into the appropriation of technologies are coupled dynamically through experimentation (Frissen and Van Lieshout, 2006).

The concept of stigmergy, which refers to addressing complex problems by collective, yet uncoordinated, actions and interactions of communities of individuals, has gradually become the third foundational element, as social media have provided online platforms for stigmergic behaviour, which has subsequently been linked to the “spontaneous” emergence of innovations (Pallot et al., 2010; Kiemen and Ballon, 2012). A fourth cornerstone is the literature on open and business model innovation, which argues that today’s fast-paced innovation landscape requires collaboration between multiple business and institutional stakeholders, and that businesses should use these joint innovation endeavours to find the right “business architecture” (Chesbrough, 2003; Mitchell and Coles, 2003)….(More)

Big Data Analysis: New Algorithms for a New Society


Book edited by Nathalie Japkowicz and Jerzy Stefanowski: “This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area.

It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued concerning the potential dangers of Big Data Analysis along with its pitfalls and challenges….(More)”

Danish city uses sensor system to understand Christmas shoppers


Springwise: “The success of a Christmas market or winter fete doesn’t always translate into money spent; it may simply increase foot traffic or visitor dwell time. Now, the Danish city of Aalborg is measuring exactly those quantities during its busy Christmas shopping period, using Bliptrack, a sensor system that detects devices using wifi.


We have already seen Bliptrack used at JFK airport to let passengers know their wait times. Now, Aalborg City Business Association has installed the system in the city centre to track visitor behavior.

The system consists of a number of sensors placed around the city, which detect nearby wifi devices such as smartphones and tablets. As pedestrians or cars move from point to point, they are detected by each sensor. Each device has a unique MAC address, meaning the system is able to track the user’s journey and how long they took to get from one sensor to the next. Aalborg can then use the data collected to understand the impact of events, as well as visitors’ shopping activities. The insights can help them improve business operations such as opening at optimum hours and providing the right number of staff….(More)”
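
To make the mechanism concrete, here is a minimal sketch of how journey times between two sensors could be derived from such detection events. The event format, sensor names and hashing step are assumptions for illustration; they are not taken from Bliptrack’s actual data model.

```python
import hashlib
from datetime import datetime

# Hypothetical detection events: (pseudonymised MAC, sensor id, timestamp).
# Hashing the MAC address is one common way to avoid storing raw identifiers.
def pseudonymise(mac: str) -> str:
    return hashlib.sha256(mac.encode()).hexdigest()[:12]

events = [
    (pseudonymise("AA:BB:CC:DD:EE:01"), "sensor_square", datetime(2015, 12, 12, 14, 0)),
    (pseudonymise("AA:BB:CC:DD:EE:01"), "sensor_market", datetime(2015, 12, 12, 14, 9)),
    (pseudonymise("AA:BB:CC:DD:EE:02"), "sensor_square", datetime(2015, 12, 12, 14, 2)),
    (pseudonymise("AA:BB:CC:DD:EE:02"), "sensor_market", datetime(2015, 12, 12, 14, 20)),
]

# Record the first time each device was seen at each sensor,
# then compute how long it took to move from one sensor to the next.
first_seen = {}
for device, sensor, ts in events:
    first_seen.setdefault(device, {}).setdefault(sensor, ts)

journey_minutes = []
for device, sightings in first_seen.items():
    if {"sensor_square", "sensor_market"} <= sightings.keys():
        delta = sightings["sensor_market"] - sightings["sensor_square"]
        journey_minutes.append(delta.total_seconds() / 60)

if journey_minutes:
    avg = sum(journey_minutes) / len(journey_minutes)
    print(f"Average square-to-market journey time: {avg:.1f} minutes")
```

Aggregating such journey times across thousands of devices is what lets the city estimate dwell time and foot traffic without tracking any named individual.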

Guffipedia: a dictionary of business jargon


Lucy Kellaway in the Financial Times: “At this time of year, my mind naturally turns to guff. Every December I open the cupboard in which I store the worst examples of the year’s jargon and begin the search for winners of my annual Golden Flannel awards.

This year, as ever, the cupboard is stuffed with ugly words and phrases that people have written or spoken in 2015. To pick a few at random, there is “passionpreneur”. There is delta (to mean gap). There is to solutionize, to mindshare and even to role-model. All are new. All reach new linguistic lows….

To this end we have created Guffipedia, a repository for the terms that I’ve railed at over the years. You will find previous years’ Golden Flannel winners with chapter-and-verse from me on why they are so ghastly (in case you are too steeped in the stuff to be able to work it out for yourself)….

The point of Guffipedia is not just for you to admire the extent of my guff collection, but to help me curate it going forward, as they say in Guffish.

I am urging you to submit horrible new words or phrases, to have a stab at translating them into serviceable English, and to state where you found them. You don’t need to name the perpetrator (though it would be nice if you did). “Heard in a lift” is fine — so long as it actually was. And if you get your entries in before the end of the year, they may end up winning a prize in my 2015 Golden Flannel awards, announced the first week in January….See Guffipedia

Join Campaigns, Shop Ethically, Hit the Man Where It Hurts—All Within an App


PSFK: “Buycott is an app that wants to make shopping ethically a little easier. Join campaigns to support causes you care about, scan barcodes on products you’re considering to view company practices, and voice your concerns directly through the free app, available for iOS and Android.


Ethical campaigns are crowdsourced, and range from environmental to political and social concerns. Current campaigns include demanding GMO labeling, supporting fair trade, ending animal testing, and more. You can read a description of the issue in question and see a list of companies to avoid and support under each campaign.

Scan barcodes of potential purchases to see if the parent companies behind them hold up to your ethical standards. If a company doesn’t, the app will suggest more ethically aligned alternatives.
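
As a rough illustration of this lookup flow, the sketch below maps a scanned barcode to a product, walks up to its parent company, and checks that company against a campaign’s avoid list, suggesting an alternative if it fails. The data structures, product names and helper functions are hypothetical, not Buycott’s actual data or API.

```python
# Hypothetical sketch of the scan -> parent company -> campaign check flow.
# All barcodes, brands and campaign data below are invented for illustration.

PRODUCTS = {
    "0123456789012": {"name": "ChocoBar", "brand": "SweetCo"},
    "0987654321098": {"name": "FairChoc", "brand": "EthiCo"},
}

PARENT_OF = {           # brand -> parent/umbrella company
    "SweetCo": "MegaFoods Inc.",
    "EthiCo": "EthiCo",
}

CAMPAIGN = {
    "name": "Demand GMO labeling",
    "avoid": {"MegaFoods Inc."},
    "support": {"EthiCo"},
}

def top_parent(brand: str) -> str:
    """Follow the brand up the ownership chain until it stops changing."""
    company = brand
    while PARENT_OF.get(company, company) != company:
        company = PARENT_OF[company]
    return company

def check_barcode(barcode: str) -> str:
    product = PRODUCTS.get(barcode)
    if product is None:
        return "Unknown product."
    parent = top_parent(product["brand"])
    if parent in CAMPAIGN["avoid"]:
        alternatives = ", ".join(sorted(CAMPAIGN["support"]))
        return (f"{product['name']} is owned by {parent}, which is on the "
                f"avoid list for '{CAMPAIGN['name']}'. Try instead: {alternatives}.")
    return f"{product['name']} ({parent}) is consistent with '{CAMPAIGN['name']}'."

print(check_barcode("0123456789012"))
```

The interesting design choice is the walk up the ownership chain: because most consumer brands roll up to a handful of conglomerates, the same parent company tends to dominate many avoid lists, as the article notes.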


According to Ivan Pardo, founder and CEO of Buycott, the app is designed to help consumers make informed shopping decisions “they can feel good about.”

“As consumers become increasingly conscientious about the impact of their purchases and shop to reflect these principles, Buycott provides users with transparency into the business practices of the companies marketing to them.”

Users can contact problematic companies through the app, using email, Facebook or Twitter. The app traces each product all the way back to its umbrella or parent company (which means the same few corporate giants are likely to show up on a few do-not-buy lists)….(More)

New frontiers in social innovation research


Geoff Mulgan: “Nesta has published a new book with Palgrave which contains an introduction by me and many important chapters from leading academics around the world. I hope that many people will read it, and think about it, because it challenges, in a highly constructive way, many of the rather tired assumptions of the London media/political elite of both left and right.

The essay is by Roberto Mangabeira Unger, perhaps the world’s most creative and important contemporary intellectual. He is Professor of Law at Harvard (where he taught Obama); a philosopher and political theorist; author of one of the most interesting recent books on religion; co-author of an equally ground-breaking recent book on theoretical physics; and strategy minister in the Brazilian government.

His argument is that a radically different way of thinking about politics, government and social change is emerging, which has either not been noticed by many political leaders, or misinterpreted. The essence of the argument is that practice is moving faster than theory; that systematic experimentation is a faster way to solve problems than clever authorship of pamphlets, white papers and plans; and that societies have the potential to be far more active agents of their own future than we assume.

The argument has implications for many fields. One is think-tanks. Twenty years ago I set up a think-tank, Demos. At that time the dominant model for policy making was to bring together some clever people in a capital city to write pamphlets, white papers and then laws. In the 1950s to 1970s a primary role was played by professors in universities, or royal commissions. Then it shifted to think-tanks. Sometimes teams within governments played a similar role – and I oversaw several of these, including the Strategy Unit in government. All saw policy as an essentially paper-based process, involving a linear transmission from abstract theories and analyses to practical implementation.

There’s still an important role to be played by think-tanks. But an opposite approach has now become common, and is promoted by Unger. In this approach, practice precedes theory. Experiment in the real world drives the development of new ideas – in business, civil society, and on the edges of the public sector. Learning by doing complements, and often leads, analysis. The role of academics and think-tanks shifts from inventing ideas to making sense of what’s emerging, and generalising it. Policies don’t try to specify every detail but rather set out broad directions and then enable a process of experiment and discovery.

As Unger shows, this approach has profound philosophical roots (reaching back to the 19th century pragmatists and beyond), and profound political implications (it’s almost opposite to the classic Marxist view, later adopted by the neoliberal right, in which intellectuals define solutions in theory which are then translated into practice). It also has profound implications for civil society – which he argues should adopt a maximalist rather than a minimalist view of social innovation.

The Unger approach doesn’t work for everything – for example, constitutional reform. But it is a superior method for improving most of the fields where governments have power – from welfare and health, to education and economic policy, and it has worked well for Nesta – evolving new models of healthcare, working with dozens of governments to redesign business policy, testing out new approaches to education.

The several hundred public sector labs and innovation teams around the world – from Chile to China, South Africa to Denmark – share this ethos too, as do many political leaders. Michael Bloomberg has been an exemplar, confident enough to innovate and experiment constantly in his time as New York Mayor. Won Soon Park in Korea is another…..

Unger’s chapter should be required reading for anyone aspiring to play a role in 21st century politics. You don’t have to agree with what he says. But you do need to work out where you disagree and why….(New Frontiers in Social Innovation Research)

The Problem-Solving Process That Prevents Groupthink


Art Markman at Harvard Business Review: “There are two reasons most of us aren’t very good at creative problem solving. First, few people get training in how to be creative in their education. Second, few people understand group dynamics well enough to harness their power to help groups maximize their creativity.

Resolving the first issue requires getting your employees to learn more about the way they think… a tall order for managers. The second issue, though, is well within your ability to change.

A key element of creativity is bringing existing knowledge to bear on a new problem or goal. The more people who can engage with that problem or goal, the more knowledge is available to work on it. Unfortunately, quite a bit of research demonstrates that the traditional brainstorming methods first described by Alex Osborn in the 1950s fail. When groups simply get together and start throwing out ideas, they actually come up with fewer ideas overall and fewer novel, actionable ideas than the individuals in that group would have come up with had they worked alone.

To fix this problem, it is important to think about the two phases of group problem-solving: divergence and convergence.

Divergence happens when the group considers as many different potential solutions as possible. For example, a common test of creativity is the “alternative uses” test. People are asked questions like, “How many different uses can you find for a brick?” This test requires strategies for considering as many distinct solutions as possible.

Convergence happens when the variety of proposed solutions is evaluated. In this phase, a large number of ideas are whittled down to a smaller set of candidate solutions to the current problem.

The core principle of group creativity is that individuals working alone diverge, while group members working together converge. In group settings, as soon as one person states a potential solution to everyone else, that influences the memory of every person in the group in ways that make everyone think about the problem more similarly. That is why groups working together diverge less than individuals working alone.

To fix group idea generation, then, be aware of when you are trying to diverge and when you are trying to converge. For example, early in the process of problem-solving, think carefully about the problem itself. Have your group members work alone to craft statements describing the problem. Then, get them back together to discuss their descriptions. The individuals are likely to come up with a variety of distinct problem statements. The group discussion will lead everyone to accept one or a small number of variants of these statements to work on – this is healthy convergence.

When you start to generate solutions, you again want divergence. Again, have people work alone to start. Then collect people’s initial ideas and send them around to other group members and allow the divergence to continue as group members individually build on the ideas of their colleagues. Because people are still working alone, the way they build on other people’s ideas is still going to be different from how other group members are building on those ideas.

After this process, you can give the resulting ideas to everyone and then let the group get together to discuss them. This discussion will gradually lead the group to converge on a small number of candidate solutions….(More)”