Predictive Analytics


Revised and updated book by Eric Siegel: “Prediction is powered by the world’s most potent, flourishing unnatural resource: data. Accumulated in large part as the by-product of routine tasks, data is the unsalted, flavorless residue deposited en masse as organizations churn away. Surprise! This heap of refuse is a gold mine. Big data embodies an extraordinary wealth of experience from which to learn.

Predictive analytics unleashes the power of data. With this technology, the computer literally learns from data how to predict the future behavior of individuals. Perfect prediction is not possible, but putting odds on the future drives millions of decisions more effectively, determining whom to call, mail, investigate, incarcerate, set up on a date, or medicate.

In this lucid, captivating introduction — now in its Revised and Updated edition — former Columbia University professor and Predictive Analytics World founder Eric Siegel reveals the power and perils of prediction:

    • What type of mortgage risk Chase Bank predicted before the recession.
    • Predicting which people will drop out of school, cancel a subscription, or get divorced before they even know it themselves.
    • Why early retirement predicts a shorter life expectancy and vegetarians miss fewer flights.
    • Five reasons why organizations predict death — including one health insurance company.
    • How U.S. Bank and Obama for America calculated — and Hillary for America 2016 plans to calculate — the way to most strongly persuade each individual.
    • Why the NSA wants all your data: machine learning supercomputers to fight terrorism.
    • How IBM’s Watson computer used predictive modeling to answer questions and beat the human champs on TV’s Jeopardy!
    • How companies ascertain untold, private truths — how Target figures out you’re pregnant and Hewlett-Packard deduces you’re about to quit your job.
    • How judges and parole boards rely on crime-predicting computers to decide how long convicts remain in prison.
    • 183 examples from Airbnb, the BBC, Citibank, ConEd, Facebook, Ford, Google, the IRS, LinkedIn, Match.com, MTV, Netflix, PayPal, Pfizer, Spotify, Uber, UPS, Wikipedia, and more….(More)”


‘Design thinking’ is changing the way we approach problems


Tim Johnson in University Affairs on “Why researchers in various disciplines are using the principles of design to solve problems big and small” : “A product of the same trends and philosophies that gave us smartphones, laptop computers and internet search engines, design thinking is changing the way some academics approach teaching and research, the way architects design classrooms and how leaders seek to solve the world’s most persistent problems.

Cameron Norman is a long-time supporter of design thinking (or DT) and an adjunct lecturer at the University of Toronto’s Dalla Lana School of Public Health. He notes that designers, especially product designers, are typically experts in conceptualizing problems and solving them – ideal skills for tackling a wide range of issues, from building a better kitchen table to mapping out the plans for a large building. “The field of design is the discipline of innovation,” he says. “[Design thinking] is about taking these methods, tools and ideas, and applying them in other areas.”

Design thinking centres on the free flow of ideas – far-out concepts aren’t summarily dismissed – and an unusually enthusiastic embrace of failure. “Design thinkers try to figure out what the key problem is – they look around and try to understand what’s going on, and come up with some wild ideas, thinking big and bold, on how to solve it,” Dr. Norman says. “They assume they’re not going to get it right the first time.”

If you were looking to build a better mousetrap, you’d prototype a model, test it for weaknesses, then either trash it and start again, or identify the problems and seek to correct them. DT does the same thing, but in an increasingly broad array of areas, from social policy to healthcare to business.

Deborah Shackleton, dean of design and dynamic media at Emily Carr University of Art + Design in Vancouver, was an early adopter of DT. “Design thinking is a mindset. You can use it as a tool or a technique. It’s very adaptable,” she says.

In 2005, ECUAD revamped much of its curriculum in design and dynamic media, looking to shift the focus from more traditional methods of research, like literature reviews, to something called “generative research.” “It’s the idea that you would invite the participants – for whom the design is intended – to be part of the creation process,” Dr. Shackleton says. She adds that various tools, like “co-creation kits” (which include a range of activities to engage people on a variety of cognitive and emotional levels) and ethnographic and cultural probes (activities which help participants demonstrate details about their private lives to their design partners), prove very useful in this area.

Collaboration among various fields is an important part of the design thinking process. At the University of Alberta, Aidan Rowe, an associate professor in design studies, is using design thinking to help the City of Edmonton improve services for people who are homeless. “Design is a truly interdisciplinary discipline,” says Dr. Rowe. “We always need someone to design with and for. We don’t design for ourselves.”….

Design thinkers often speak of “human-centered design” and “social innovation,” concepts that flow from DT’s assertion that no single person has the answer to a complex problem. Instead, it focuses on collective goals and places a premium on sustainability, community, culture and the empowerment of people, says Greg Van Alstyne, director of research and co-founder of the Strategic Innovation Lab, or sLab, at OCAD University. “It means you go about your problem-solving in a more holistic way. We can say ‘human-centered,’ but it’s actually ‘life-centered,’” Mr. Van Alstyne explains. “Our brand of design thinking is amenable to working within social systems and improving the lot of communities.”


Design thinking is also transforming university campuses in a tangible way. One example is the University of Calgary’s Taylor Institute for Teaching and Learning, which is undergoing a $40-million renovation. “The whole space is designed to help students connect, communicate, collaborate and create knowledge,” says Lynn Taylor, vice-provost, teaching and learning. “Traditional learning was focused on the facts and concepts and procedures of a discipline, and we’re moving toward the goal of having students think far more deeply about their learning.”

To create this new space within a two-floor, 4,000-square-metre building that formerly served as an art museum, the university turned to Diamond Schmitt Architects, who have designed similar spaces at a number of other Canadian campuses. The new space, scheduled to open in February, prioritizes flexibility, with movable walls and collapsible furniture, and the seamless integration of technology.

Lead architect Don Schmitt observes that in a traditional campus building, which usually contains a long corridor and individual classrooms, conversation tends to gravitate to the only true public space: the hallway. “There’s a sense that more learning probably happens outside the classroom or between the classrooms, than happens inside the classroom,” he says.

Gone is the old-model lecture hall, with fixed podium and chairs. They’ve been replaced by a much more malleable space, which in a single day can act as a dance studio, movie theatre, lecture space, or just a big area for students to get together. “It’s about individual learning happening informally, quiet study, gregarious social activity, group study, group projects, flexible studio environments, changeable, ‘hack-able’ spaces and lots of flexibility to use different places in different ways,” Mr. Schmitt explains….(More)”

Humanity 360: World Humanitarian Data and Trends 2015


OCHA: “WORLD HUMANITARIAN DATA AND TRENDS

Highlights major trends, challenges and opportunities in the nature of humanitarian crises, showing how the humanitarian landscape is evolving in a rapidly changing world.


LEAVING NO ONE BEHIND: HUMANITARIAN EFFECTIVENESS IN THE AGE OF THE SUSTAINABLE DEVELOPMENT GOALS

Exploring what humanitarian effectiveness means in today’s world ‐ better meeting the needs of people in crisis, better moving people out of crisis.


TOOLS FOR DATA COORDINATION AND COLLECTION


How Much Development Data Is Enough?


Keith D. Shepherd at Project Syndicate: “Rapid advances in technology have dramatically lowered the cost of gathering data. Sensors in space, the sky, the lab, and the field, along with newfound opportunities for crowdsourcing and widespread adoption of the Internet and mobile telephones, are making large amounts of information available to those for whom it was previously out of reach. A small-scale farmer in rural Africa, for example, can now access weather forecasts and market prices at the tap of a screen.

This data revolution offers enormous potential for improving decision-making at every level – from the local farmer to world-spanning development organizations. But gathering data is not enough. The information must also be managed and evaluated – and doing this properly can be far more complicated and expensive than the effort to collect it. If the decisions to be improved are not first properly identified and analyzed, there is a high risk that much of the collection effort could be wasted or misdirected.

This conclusion is itself based on empirical analysis. The evidence is weak, for example, that monitoring initiatives in agriculture or environmental management have had a positive impact. Quantitative analysis of decisions across many domains, including environmental policy, business investments, and cyber security, has shown that people tend to overestimate the amount of data needed to make a good decision or misunderstand what type of data are needed.

Furthermore, grave errors can occur when large data sets are mined using machine algorithms without first having properly examined the decision that needs to be made. There are many examples of cases in which data mining has led to the wrong conclusion – including in medical diagnoses or legal cases – because experts in the field were not consulted and critical information was left out of the analysis.

Decision science, which combines understanding of behavior with universal principles of coherent decision-making, limits these risks by pairing empirical data with expert knowledge. If the data revolution is to be harnessed in the service of sustainable development, the best practices of this field must be incorporated into the effort.

The first step is to identify and frame frequently recurring decisions. In the field of development, these include large-scale decisions such as spending priorities – and thus budget allocations – by governments and international organizations. But it also includes choices made on a much smaller scale: farmers pondering which crops to plant, how much fertilizer to apply, and when and where to sell their produce.

The second step is to build a quantitative model of the uncertainties in such decisions, including the various triggers, consequences, controls, and mitigants, as well as the different costs, benefits, and risks involved. Incorporating – rather than ignoring – difficult-to-measure, highly uncertain factors leads to the best decisions…..

The third step is to compute the value of obtaining additional information – something that is possible only if the uncertainties in all of the variables have been quantified. The value of information is the amount a rational decision-maker would be willing to pay for it. So we need to know where additional data will have value for improving a decision and how much we should spend to get it. In some cases, no further information may be needed to make a sound decision; in others, acquiring further data could be worth millions of dollars….(More)”
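The third step can be made concrete with a small simulation. Below is a hedged sketch of the standard expected-value-of-perfect-information (EVPI) calculation that this kind of analysis rests on; the crops, payoff formulas, and rainfall distribution are invented purely for illustration:

```python
import random

random.seed(0)

# Hypothetical decision: a farmer chooses between two crops whose payoff
# depends on uncertain seasonal rainfall. All numbers are illustrative.
rainfall = [random.gauss(500, 150) for _ in range(100_000)]  # mm per season

def payoff(crop, rain):
    # Crop A is drought-tolerant; crop B pays more when rain is plentiful.
    return 300 + 0.1 * rain if crop == "A" else 0.9 * rain

# Without further information: commit now to the crop with the best
# average payoff across the quantified uncertainty.
ev_no_info = max(
    sum(payoff(c, r) for r in rainfall) / len(rainfall) for c in ("A", "B")
)

# With perfect information: pick the best crop for each rainfall outcome.
ev_perfect = sum(
    max(payoff("A", r), payoff("B", r)) for r in rainfall
) / len(rainfall)

# EVPI: the most a rational decision-maker should pay for the extra data.
evpi = ev_perfect - ev_no_info
print(f"EV without info: {ev_no_info:.1f}")
print(f"EV with perfect info: {ev_perfect:.1f}")
print(f"Value of information (EVPI): {evpi:.1f}")
```

The gap between the two expectations is the ceiling on what further rainfall data is worth to this decision; if it came out near zero, no further information would be needed to decide soundly.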

The Innovation the Grantmaking Process Needs


Beth Simone Noveck and Andrew Young (TheGovLab) at Governing: “Although traditional grants provide greater flexibility than a contract for the recipient to decide how, precisely, to use the funds to advance a particular goal, prize-backed challenges like those on Challenge.gov have the potential to reach more diverse experts. Challenges are just one example of innovations in the grantmaking process being tested in government, philanthropy and the private sector. These innovations in “open grantmaking” have the potential to yield more legitimate and more accountable processes than their closed-door antecedents. They also have the potential to produce more creative strategies for solving problems and, ultimately, more effective outcomes.

Certainly the time has come for innovation in grantmaking. Despite its importance, we have a decidedly 20th-century system in place for deciding how we make these billions of dollars of crucial public investments. To make the most of limited funding — and help build confidence in the ability of public investments to make a positive difference — it is essential for our government agencies to try more innovative approaches to designing, awarding and measuring their grantmaking activities.

In most instances, grantmaking follows a familiar lifecycle: An agency describes and publicizes the grant in a public call for proposals, qualifying individuals or entities send in applications, and the agencies select the winners through internal deliberations. Members of the public — including outside experts, past grantees and service recipients — often have few opportunities to provide meaningful input before, during or after the granting process. And after awarding grants, the agencies themselves usually have limited continuing interactions with those they fund.

The current closed-door system, to be sure, developed to safeguard the legitimacy and fairness of the process. From application to judging, most government grantmaking has been confidential and at arm’s length. For statutory, regulatory or even cultural reasons, the grantmaking process in many agencies is characterized by caution rather than by creativity.

But it doesn’t always have to be this way, and new, more open grantmaking innovations might prove to be more effective in many contexts. Here are 10 recommendations for innovating the grantmaking process drawn from examples of how government agencies, foundations and philanthropists are changing how they give out money:…(More)”

HereHere


HereHere NYC generates weekly cartoons for NYC neighborhoods based on public data. We sum up how your neighborhood, or other NYC neighborhoods you care about, are doing via a weekly email digest, neighborhood-specific Twitter & Instagram feeds, and deeper data and context.

HereHere is a research project from FUSE Labs, Microsoft Research that explores:

  • Creating compelling stories with data to engage larger communities
  • Inventing new habits for connecting to the hyperlocal
  • Using cartoons as a tool to drive data engagement

HereHere does not use sentiment analysis; instead, it uses a research platform intended to surface the most pertinent information from a human perspective. …

How It Works

Several times a day we grab the freshest NYC 311 data. The data come in as a long list of categorized concerns submitted by people in NYC (via phone, email, or text message), ranging from heating complaints to compliments to concerns about harboring bees and everything in between.

We separate the data by neighborhood for each of the 42 neighborhoods throughout the 5 boroughs of NYC, and count the total of each concern per neighborhood.

Next, we process the data through the Sentient Data Server. SDS gives each neighborhood a personality (like a character in a movie or video game), and we calculate the character’s response to the latest data based on pace, position and trend. For example, a neighborhood might be delighted if, after several days of more than 30 heating complaints, the count drops to zero; or a neighborhood might be ashamed to see a sudden rise in homeless-person assistance requests.
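The grouping and character-response steps described here can be sketched in a few lines. This is a minimal illustration only; the record fields, thresholds, and response rules are assumptions, not HereHere’s actual implementation:

```python
from collections import Counter, defaultdict

def tally_by_neighborhood(records):
    """Count 311 concerns per neighborhood and per category."""
    counts = defaultdict(Counter)
    for rec in records:
        counts[rec["neighborhood"]][rec["category"]] += 1
    return counts

def character_response(history, today):
    """A neighborhood character's reaction to today's count,
    given its recent daily counts (pace and trend)."""
    recent_avg = sum(history) / len(history)
    if today == 0 and recent_avg > 30:
        return "delighted"   # e.g. heating complaints finally drop to zero
    if today > 2 * recent_avg:
        return "ashamed"     # e.g. a sudden spike in assistance requests
    return "neutral"

# Toy records standing in for a 311 data pull (fields are assumed).
records = [
    {"neighborhood": "Astoria", "category": "heating"},
    {"neighborhood": "Astoria", "category": "heating"},
    {"neighborhood": "Harlem", "category": "noise"},
]
counts = tally_by_neighborhood(records)
print(counts["Astoria"]["heating"])         # 2
print(character_response([35, 32, 31], 0))  # delighted
print(character_response([5, 6, 4], 20))    # ashamed
```

A per-neighborhood tally plus a simple trend rule like this is enough to pick each week’s most critical issue and drive the cartoon generation described below.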


HereHere determines the most critical 311 issues for each neighborhood each week and uses that to procedurally generate a weekly cartoon for each neighborhood.

HereHere summarizes the 311 concerns into categories for a quick sense of what’s happening in each neighborhood…(More)

Collective Intelligence in Law Reforms: When the Logic of the Crowds and the Logic of Policymaking Collide


Paper by Tanja Aitamurto: “…shows how the two virtues of collective intelligence – cognitive diversity and large crowds – turn into perils in crowdsourced policymaking. That is because of a conflict between the logic of the crowds and the logic of policymaking. The crowd’s logic differs from that of traditional policymaking in several respects. To mention a few: in traditional policymaking, a small group of experts makes policy proposals, whereas in crowdsourced policymaking it is a large, anonymous crowd with a mixed level of expertise. The crowd proposes atomic ideas, whereas traditional policymaking is used to dealing with holistic and synthesized proposals. By drawing on data from a crowdsourced law-making process in Finland, the paper shows how the logics of the crowds and of policymaking collide in practice. The conflict prevents policymaking from fully benefiting from the crowd’s input, and it also hinders governments from adopting crowdsourcing more widely as a practice for deploying open policymaking practices….(More)”

How Facebook Makes Us Dumber


In BloombergView: “Why does misinformation spread so quickly on social media? Why doesn’t it get corrected? When the truth is so easy to find, why do people accept falsehoods?

A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: people’s tendency to seek out information that confirms their beliefs, and to ignore contrary information.

Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.

The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behavior of Facebook users from 2010 to 2014. One of the study’s goals was to test a question that continues to be sharply disputed: When people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?

Del Vicario and her coauthors explored how Facebook users spread conspiracy theories (using 32 public web pages); science news (using 35 such pages); and “trolls,” which intentionally spread false information (using two web pages). Their data set is massive: It covers all Facebook posts during the five-year period. They explored which Facebook users linked to one or more of the 69 web pages, and whether they learned about those links from their Facebook friends.

In sum, the researchers find a lot of communities of like-minded people. Even if they are baseless, conspiracy theories spread rapidly within such communities.

More generally, Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject. If a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it….(More)”

Initial Conditions Matter: Social Capital and Participatory Development


Paper by Lisa A. Cameron et al: “Billions of dollars have been spent on participatory development programs in the developing world. These programs give community members an active decision-making role. Given the emphasis on community involvement, one might expect that the effectiveness of this approach would depend on communities’ pre-existing social capital stocks. Using data from a large randomised field experiment of Community-Led Total Sanitation in Indonesia, we find that villages with high initial social capital built toilets and reduced open defecation, resulting in substantial health benefits. In villages with low initial stocks of social capital, the approach was counterproductive – fewer toilets were built than in control communities and social capital suffered….(More)”

Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues


Press Release: “A new report from the Federal Trade Commission outlines a number of questions for businesses to consider to help ensure that their use of big data analytics, while producing many benefits for consumers, avoids outcomes that may be exclusionary or discriminatory.

“Big data’s role is growing in nearly every area of business, affecting millions of consumers in concrete ways,” said FTC Chairwoman Edith Ramirez. “The potential benefits to consumers are significant, but businesses must ensure that their big data use does not lead to harmful exclusion or discrimination.”

The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at big data at the end of its lifecycle – how it is used after being collected and analyzed, and draws on information from the FTC’s 2014 workshop, “Big Data: A Tool for Inclusion or Exclusion?,” as well as the Commission’s seminar on Alternative Scoring Products. The Commission also considered extensive public comments and additional public research in compiling the report.

The report highlights a number of innovative uses of big data that are providing benefits to underserved populations, including increased educational attainment, access to credit through non-traditional methods, specialized health care for underserved communities, and better access to employment.

In addition, the report looks at possible risks that could result from biases or inaccuracies about certain groups, including more individuals mistakenly denied opportunities based on the actions of others, exposing sensitive information, creating or reinforcing existing disparities, assisting in the targeting of vulnerable consumers for fraud, creating higher prices for goods and services in lower-income communities and weakening the effectiveness of consumer choice.

The report outlines some of the various laws that apply to the use of big data, especially in regards to possible issues of discrimination or exclusion, including the Fair Credit Reporting Act, FTC Act and equal opportunity laws. It also provides a range of questions for businesses to consider when they examine whether their big data programs comply with these laws.

The report also proposes four key policy questions that are drawn from research into the ways big data can both present and prevent harms. The policy questions are designed to help companies determine how best to maximize the benefit of their use of big data while limiting possible harms, by examining both practical questions of accuracy and built-in bias as well as whether the company’s use of big data raises ethical or fairness concerns….(More)”