The city as platform


From the report of the 2015 Aspen Institute Roundtable on Information Technology: “In the age of ubiquitous Internet connections, smartphones and data, the future vitality of cities is increasingly based on their ability to use digital networks in intelligent, strategic ways. While we are accustomed to thinking of cities as geophysical places governed by mayors, conventional political structures and bureaucracies, this template of city governance is under great pressure to evolve. Urban dwellers now live their lives in all sorts of hyper-connected virtual spaces, pulsating with real-time information, intelligent devices, remote-access databases and participatory crowdsourcing. Expertise is distributed, not centralized. Governance is not just a matter of winning elections and assigning tasks to bureaucracies; it is about the skillful collection and curation of information as a way to create new affordances for commerce and social life.

Except among a small class of vanguard cities, however, the far-reaching implications of the “networked city” for economic development, urban planning, social life and democracy have not been explored in depth. The Aspen Institute Communications and Society Program thus convened an eclectic group of thirty experts to explore how networking technologies are rapidly changing the urban landscape in nearly every dimension. The goal was to learn how open networks, online cooperation and open data can enhance urban planning and administration, and more broadly, how they might improve economic opportunity and civic engagement. The conference, the 24th Annual Aspen Roundtable on Information Technology, also addressed the implications of new digital technologies for urban transportation, public health and safety, and socio-economic inequality…. (Download the InfoTech 2015 Report)”

Zika Emergency Puts Open Data Policies to the Test


Larry Peiperl and Peter Hotez at PLOS: “The spreading epidemic of Zika virus, with its putative and alarming associations with Guillain-Barre syndrome and infant microcephaly, has arrived just as several initiatives have come into place to minimize delays in sharing the results of scientific research.

In September 2015, in response to concerns that research publishing practices had delayed access to crucial information in the Ebola crisis, the World Health Organization convened a consultation “[i]n recognition of the need to streamline mechanisms of data dissemination—globally and in as close to real-time as possible” in the context of public health emergencies.

Participating medical journal editors, representing PLOS, BMJ and Nature journals and NEJM, provided a statement that journals should not act to delay access to data in a public health emergency: “In such scenarios, journals should not penalize, and, indeed, should encourage or mandate public sharing of relevant data…”

In a subsequent Comment in The Lancet, authors from major research funding organizations expressed support for data sharing in public health emergencies. The International Committee of Medical Journal Editors (ICMJE), meeting in November 2015, lent further support to the principles of the WHO consultation by amending the ICMJE “Recommendations” to endorse data sharing for public health emergencies of any geographic scope.

Now that WHO has declared Zika to be a Public Health Emergency of International Concern, responses from these groups in recent days appear consistent with their recent declarations.

The ICMJE has announced that “In light of the need to rapidly understand and respond to the global emergency caused by the Zika virus, content in ICMJE journals related to Zika virus is being made free to access. We urge other journals to do the same. Further, as stated in our Recommendations, in the event of a public health emergency (as defined by public health officials), information with immediate implications for public health should be disseminated without concern that this will preclude subsequent consideration for publication in a journal.” (www.icmje.org, accessed 9 February 2016)

WHO has implemented special provisions for research manuscripts relevant to the Zika epidemic that are submitted to the WHO Bulletin; such papers “will be assigned a digital object identifier and posted online in the “Zika Open” collection within 24 hours while undergoing peer review. The data in these papers will thus be attributed to the authors while being freely available for reader scrutiny and unrestricted use” under a Creative Commons Attribution License (CC BY IGO 3.0).

At PLOS, where open access and data sharing apply as a matter of course, all PLOS journals aim to expedite peer review evaluation, pre-publication posting, and data sharing for research relevant to the Zika outbreak. PLOS Currents Outbreaks offers an online platform for rapid publication of preliminary results, PLOS Neglected Tropical Diseases has committed to providing priority handling of Zika reports in general, and other PLOS journals will prioritize submissions within their respective scopes. The PLOS Zika Collection page provides central access to relevant and continually updated content from across the PLOS journals, blogs, and collaborating organizations.

Today, the Wellcome Trust has issued a statement urging journals to commit to “make all content concerning the Zika virus free to access,” and funders to “require researchers undertaking work relevant to public health emergencies to set in place mechanisms to share quality-assured interim and final data as rapidly and widely as possible, including with public health and research communities and the World Health Organisation.” Among 31 initial signatories are such journals and publishers as PLOS, Springer Nature, Science journals, The JAMA Network, eLife, the Lancet, and New England Journal of Medicine; and funding organizations including the Bill and Melinda Gates Foundation, UK Medical Research Council, US National Institutes of Health, Wellcome Trust, and other major national and international research funders.

This policy shift prompts reconsideration of how we publish urgently needed data during a public health emergency….(More)”

Sticky-note strategy: How federal innovation labs borrow from Silicon Valley


Carten Cordell in the Federal Times: “The framework for an integrated security solution in the Philippines is built on a bedrock of sticky notes. So is the strategy for combating piracy in East Africa and a handful of other plans that Zvika Krieger is crafting in a cauldron of collaboration within the State Department.

More specifically, Krieger, a senior adviser for strategy within the department’s Bureau of Political-Military Affairs, is working in the bureau’s Strategy Lab, just one pocket of the federal government where a Silicon Valley playbook for innovation is being used to develop policy solutions….

Krieger and a host of other policy thinkers learned a new way to channel innovation for policy solutions called human-centered design, or design thinking. While arguably new in government, the framework has long been in use by the tech sector to design products that will serve the needs of their customers. The strategy of group thinking towards a policy — which is more what these innovation labs seek to achieve — has been used before as well….Where the government has started to use HCD is in developing new policy solutions within a multifaceted group of stakeholders that can contribute a well-rounded slate of expertise. The product is a strategy that is developed from the creative thoughts of a team of experts, rather than a single specialized source….

The core tenet of HCD is to establish a meritocracy of ideas that is both empathetic of thought and immune to hierarchy. In order to get innovative solutions for a complex problem, Krieger forms a team of experts and stakeholders. He then mixes in outside thought leaders he calls “wild cards” to give the group outside perspective.

The delicate balance opens discussion, and the mix of ideas ultimately forms a strategy for handling the problem. That strategy might involve a technology; but it could also be a new partnership, a new function within an office, or a new acquisition program. Because the team is composed of multiple experts, it can navigate the complexity more thoroughly, and the wild cards can offer their expertise to provide solutions the stakeholders may not have considered….

Human-centered design has been working its way through pockets of the federal government for a few years now. The Office of Personnel Management opened its Innovation Lab in 2012 and was tasked with improving the USAJobs website. The Department of Health and Human Services opened the IDEA Lab in 2013 to address innovation in its mission. The Department of Veterans Affairs has a Center of Innovation to identify new approaches to meet the current and future needs of veterans, and the departments of Defense and State both have innovation labs tackling policy solutions.

The concept is gaining momentum. This fall, the Obama administration released a strategy report calling for a network of innovation labs throughout federal agencies to develop new policy solutions through HCD.

“I think the word is spreading. It’s kind of like a whisper campaign, in the most positive way,” said an administration official with knowledge of innovation labs and HCD strategies, who was not authorized to speak to the press. “I think, again, the only constraint here is that we don’t have enough of them to be able to imbue this knowledge across government. We need many more people.”

A March 2014 GAO report said that the OPM Innovation Lab had not developed consistent performance targets that would allow it to assess the success of its projects. The report recommended more consistent milestones to assess progress, which the agency addressed through a series of pilot programs….

In the State Department’s Bureau of Educational and Cultural Affairs, an innovation lab called the Collaboratory is in its second year of existence, using HCD strategies to improve projects like the Fulbright program and other educational diplomacy efforts.

The Education Diplomacy initiative, for example, used HCD to devise ways to increase education access abroad using State resources. Defining U.S. embassies as the end user, the Collaboratory then analyzed the areas of need at the installations and began crafting policies.

“We identified a couple of areas where we thought we could make substantial gains quite quickly and in a budget-neutral way,” Collaboratory Deputy Director Paul Kruchoski said. The process allowed multiple stakeholders like the U.S. Agency for International Development, Peace Corps and the Department of Education to help craft the policy and create what Kruchoski called “feedback loops” to refine throughout the embassies…(More)”


Improving government effectiveness: lessons from Germany


Tom Gash at Global Government Forum: “All countries face their own unique challenges but advanced democracies also have much in common: the global economic downturn, aging populations, increasingly expensive health and pension spending, and citizens who remain as hard to please as ever.

At an event last week in Bavaria, attended by representatives of Bavaria’s governing party, the Christian Social Union (CSU) and their guests, it also became clear that there is a growing consensus that governments face another common problem. They have relied for too long on traditional legislation and regulation to drive change. The consensus was that simply prescribing in law what citizens and companies can and can’t do will not solve the complex problems governments are facing, that governments cannot legislate their way to improved citizen health, wealth and wellbeing….

…a number of developments …from which both UK and international policymakers and practitioners can learn to improve government effectiveness.

  1. Behavioural economics: The Behavioural Insights Team (BIT), which spun out of government in 2013 and is the subject of a new book by one of its founders and former IfG Director of Research, David Halpern, is being watched carefully by many countries abroad. Some are using its services, while others – including the New South Wales Government in Australia – are building their own skills in this area. BIT and others using similar principles have shown that using insights from social psychology – alongside an experimental approach – can help save money and improve outcomes. Well known successes include increasing the tax take through changing the wording of reminder letters (work led by another IfG alumnus, Mike Hallsworth) and increasing pension take-up through auto-enrolment.
  2. Market design: There is an emerging field of study examining how algorithms can be used to better match people with the services they need – particularly in cases where it is unfair or morally repugnant to allow a free market to operate. Alvin Roth, the Harvard professor and Nobel Prize winner, writes about these ‘matching markets’ in his book Who Gets What and Why – in which he also explains how the approach can ensure that more kidneys reach compatible donors, and children find the right education.
  3. Big data: Large datasets can now be mined far more effectively, whether it is to analyse crime patterns to spot where police patrols might be useful or to understand crowd flows on public transport. The use of real-time information allows far more sophisticated deployment of public sector resources, better targeted at demand and need, and better tailored to individual preferences.
  4. Transparency: Transparency has the potential to enhance both the accountability and effectiveness of governments across the world – as shown in our latest Whitehall Monitor Annual Report. The UK government is considered a world-leader for its transparency – but there are still areas where progress has stalled, including in transparency over the costs and performance of privately provided public services.
  5. New management models: There is a growing realisation that new methods are best harnessed when supported by effective management. The Institute’s work on civil service reform highlights a range of success factors from past reforms in the UK – and the benefits of clear mechanisms for setting priorities and sticking to them, as is being attempted by the government’s new(ish) Implementation Taskforces and the Departmental Implementation Units currently cropping up across Whitehall. I looked overseas for a different model that clearly aligns government activities behind citizens’ concerns – in this case the example of the single non-emergency number system operating in New York City and elsewhere. This system supports a powerful, highly responsive, data-driven performance management regime. But like many performance management regimes it can risk a narrow and excessively short-term focus – so such tools must be combined with the mind-set of system stewardship that the Institute has long championed in its policymaking work.
  6. Investment in new capability: It is striking that all of these developments are supported by technological change and research insights developed outside government. But to embed new approaches in government, there appear to be benefits to incubating new capacity, either in specialist departmental teams or at the centre of government….(More)”
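The ‘matching markets’ idea in point 2 is usually implemented with deferred-acceptance algorithms of the kind Roth describes. A minimal sketch of the classic Gale–Shapley procedure is below; the student/school names are illustrative, not drawn from any government system:

```python
def stable_match(proposer_prefs, reviewer_prefs):
    """Gale-Shapley deferred acceptance.

    proposer_prefs / reviewer_prefs: dict mapping each id to a
    preference-ordered list of ids on the other side.
    Returns {reviewer: proposer} with no blocking pair, i.e. no two
    participants who would both rather be matched with each other.
    """
    # rank[r][p] = position of proposer p in reviewer r's list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)            # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}
    matched = {}                           # reviewer -> proposer

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # best reviewer p hasn't tried
        next_choice[p] += 1
        if r not in matched:
            matched[r] = p
        elif rank[r][p] < rank[r][matched[r]]:
            free.append(matched[r])        # reviewer trades up; old match freed
            matched[r] = p
        else:
            free.append(p)                 # rejected; p will propose elsewhere
    return matched

students = {"ana": ["north", "south"], "ben": ["north", "south"]}
schools = {"north": ["ben", "ana"], "south": ["ana", "ben"]}
print(stable_match(students, schools))  # {'north': 'ben', 'south': 'ana'}
```

Both students prefer "north", but the stable outcome sends "ana" to "south" because "north" ranks "ben" higher; kidney-exchange and school-choice systems build on variants of this idea.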

Open government data and why it matters


Australian Government: “This was a key focus of the Prime Minister’s $1.1 billion innovation package announced this month.

The Bureau of Communications Research (BCR) today released analysis of the impact of open government data, revealing its potential to generate up to $25 billion per year, or 1.5 per cent of Australia’s GDP.

‘In Australia, users can already access and re-use more than 7000 government data sets published on data.gov.au,’ said Dr Paul Paterson, Chief Economist and Head of the Bureau of Communications Research (BCR).

‘Some of the high-value data sets include geospatial/mapping data, health data, transport data, mining data, environmental data, demographics data, and real-time emergency data.

‘Many Australians are unaware of the flow-on benefits from open government data as a result of the increased innovation and informed choice it creates. For example, open data has the power to generate new careers, more efficient government revenues, improved business practices, and drive better public engagement.’

Give Up Your Data to Cure Disease


David B. Agus in The New York Times: “How far would you go to protect your health records? Your privacy matters, of course, but consider this: Mass data can inform medicine like nothing else and save countless lives, including, perhaps, your own.

Over the past several years, using some $30 billion in federal stimulus money, doctors and hospitals have been installing electronic health record systems…. Yet neither doctors nor patients are happy. Doctors complain about the time it takes to update digital records, while patients worry about confidentiality…

We need to get over it. These digital databases offer an incredible opportunity to examine trends that will fundamentally change how doctors treat patients. They will help develop cures, discover new uses for drugs and better track the spread of scary new illnesses like the Zika virus….

Case in point: Last year, a team led by researchers at the MD Anderson Cancer Center and Washington University found that a common class of heart drugs called beta blockers, which block the effects of adrenaline, may prolong ovarian cancer patients’ survival. This discovery came after the researchers reviewed more than 1,400 patient records, and identified an obvious pattern among those with ovarian cancer who were using beta blockers, most often to control their blood pressure. Women taking earlier versions of this class of drug typically lived for almost eight years after their cancer diagnosis, compared with just three and a half years for the women not taking any beta blocker….

We need to move past that. For one thing, more debate over data sharing is already leading to more data security. Last month a bill was signed into law calling for the Department of Health and Human Services to create a health care industry cybersecurity task force, whose members would hammer out new voluntary standards.

New technologies — and opportunities — come with unprecedented risks and the need for new policies and strategies. We must continue to improve our encryption capabilities and other methods of data security and, most important, mandate that they are used. The hack of the Anthem database last year, for instance, which allowed 80 million personal records to be accessed, was shocking not only for the break-in, but for the lack of encryption….

Medical research is making progress every day, but the next step depends less on scientists and doctors than it does on the public. Each of us has the potential to be part of tomorrow’s cures. (More)”

The Promise and Perils of Open Medical Data


Sharona Hoffman at the Hastings Center: “Not long ago I visited the Personal Genome Project’s website. The PGP describes its mission as “creating public genome, health, and trait data.” In the “Participant Profiles” section, I found several entries that disclosed the names of individuals along with their date of birth, sex, weight, height, blood type, race, health conditions, medications, allergies, medical procedures, and more. Other profiles did not feature names but provided all of the other details. I had no special access to this information. It is available to absolutely anyone with Internet access. The PGP is part of a trend known as “open data.” Many government and private entities have launched initiatives to compile very large data resources (also known as “big data”) and to make them available to the public. President Obama himself has endorsed open data by issuing a May 2013 executive order directing that, to the extent permitted by law, the federal government must release its data to the public in forms that make it easy to locate, access, and use.

Read more: http://www.thehastingscenter.org/Publications/HCR/Detail.aspx?id=7731#ixzz3zOSM2kF0

Big-data analytics: the power of prediction


Rachel Willcox in Public Finance: “The ability to anticipate demands will improve planning and financial efficiency, and collecting and analysing data will enable the public sector to look ahead…

Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time….
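The HealthIntell tooling described above is proprietary, but the core idea of predicting demand "on a daily and hourly basis" from historical attendance logs can be sketched with a simple hour-of-week baseline. Everything below (function name, data shape) is an illustrative assumption, not the actual NHS SBS implementation:

```python
from collections import defaultdict

def hourly_demand_profile(history):
    """Average A&E attendances for each (weekday, hour) slot.

    history: iterable of (weekday, hour, attendances) tuples from past
    logs (weekday 0=Monday .. 6=Sunday).
    Returns {(weekday, hour): mean attendances} -- a crude seasonal
    baseline that a real system would refine with weather, major
    events, and other covariates.
    """
    totals = defaultdict(lambda: [0, 0])   # slot -> [sum, count]
    for weekday, hour, n in history:
        slot = totals[(weekday, hour)]
        slot[0] += n
        slot[1] += 1
    return {slot: s / c for slot, (s, c) in totals.items()}

history = [
    (5, 22, 40), (5, 22, 48),   # Saturday 10pm: busy
    (1, 10, 12), (1, 10, 16),   # Tuesday 10am: quiet
]
profile = hourly_demand_profile(history)
print(profile[(5, 22)])  # 44.0 -> staff up for Saturday nights
```

Even this naive baseline exposes the peaks and troughs the article mentions; the value of a production system lies in layering real-time signals (weather, sporting events) on top of it.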

Rikke Duus, a senior teaching fellow at University College London’s School of Management, agrees strongly that an evidence-based approach to providing services is key to efficiency gains, using data that is already available. Although the use of big data across the public sector is trailing well behind that in the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – is raising expectations about the sort of public services people expect to receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says….

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety….(More)”


7 Ways Local Governments Are Getting Creative with Data Mapping


Ben Miller at GovTech:  “As government data collection expands, and as more of that data becomes publicly available, more people are looking to maps as a means of expressing the information.

And depending on the type of application, a map can be useful for both the government and its constituents. Many maps help government servants operate more efficiently and save money, while others answer residents’ questions so they don’t have to call a government worker for the answer….

Here are seven examples of state and local governments using maps to help themselves and the people they serve.

1. DISTRICT OF COLUMBIA, IOWA GET LOCAL AND CURRENT WITH THE WEATHER

Washington, D.C. snow plow map

As Winter Storm Jonas was busy dropping nearly 30 inches of snow on the nation’s capital, officials in D.C. were working to clear it. And thanks to a mapping application they launched, citizens could see exactly how the city was going about that business.

The District of Columbia’s snow map lets users enter an address, and then shows what snow plows did near that address within a given range of days. The map also shows where the city received 311 requests for snow removal and gives users a chance to look at recent photos from road cameras showing driving conditions….

2. LOS ANGELES MAPS EL NIÑO RESOURCES, TRENDS

El Niño Watch map

Throughout the winter, weather monitoring experts warned the public time and again that an El Niño system was brewing in the Pacific Ocean that looked to be one of the largest, if not the largest, ever. That would mean torrents of rain for a parched state that’s seen mudslides and flooding during storms in the past.

So to prepare its residents, the city of Los Angeles published a map in January that lets users see both decision-informing trends and the location of resources. Using the application, one can toggle layers that let them know what the weather is doing around the city, where traffic is backed up, where the power is out, where they can find sand bags to prevent flood damage and more….

3. CALIFORNIA DIVES DEEP INTO AIR POLLUTION RISKS

CalEnviroScreen

….So, faced with a legislative mandate to identify disadvantaged communities, the California Office of Environmental Health Hazard Assessment decided that it wouldn’t just examine smog levels — it also would also take a look at the prevalence of at-risk people across the state.

The result is a series of three maps, the first two examining the two factors separately and the third combining them. That allows the state and its residents to see the places where air pollution poses the greatest risk to the people most vulnerable to it….

4. STREAMLINING RESIDENT SERVICE INFORMATION

Manassas curbside pickup map

The city of Manassas, Va., relied on an outdated paper map and a long-time, well-versed staffer to answer questions about municipal curbside pickup services until they launched this map in 2014. The map allows users to enter their address, and then gives them easy-to-read information about when to put out various things on their curb for pickup.

That’s useful because the city’s fall leaf collection schedule changes every year. So the map not only acts as a benefit to residents who want information, but to city staff who don’t have to deal with as many calls.

The map also shows users the locations of resources they can use and gives them city phone numbers in case they still have questions, and displays it all in a popup pane at the bottom of the map.

5. PLACING TOOLS IN THE HANDS OF THE PUBLIC

A lot of cities and counties have started publishing online maps showing city services and releasing government data.

But Chicago, Boston and Philadelphia stand out as examples of maps that take the idea one step further — because each one offers a staggering amount of choices for users.

Chicago’s new OpenGrid map, just launched in January, is a versatile map that lets users search for certain data like food inspection reports, street closures, potholes and more. That’s enough to answer a lot of questions, but what adds even more utility is the map’s various narrowing tools. Users can narrow searches to a zip code, or they can draw a shape on the map and only see results within that shape. They can perform sub-searches within results and they can choose how they’d like to see the data displayed.

Philadelphia’s platform makes use of buttons, icons and categories to help users sift through the spatially-enabled data available to them. Options include future lane closures, bicycle paths, flu shots, city resources, parks and more.

Boston’s platform is open for users to submit their own maps. And submit they have. The city portal offers everything from maps of bus stops to traffic data pulled from the Waze app.

6. HOUSTON TRANSFORMS SERVICE REQUEST DATA

Houston 311 service request map

A 311 service functions as a means of bringing problems to city staff’s attention. But the data itself only goes so far — it needs interpretation.

Houston’s 311 service request map helps users easily analyze the data so as to spot trends. The tool offers lots of ways to narrow data down, and can isolate many different kinds of request so users can see whether one problem is reported more often in certain areas.

7. GUIDING BUSINESS GROWTH

For the last several years, the city of Rancho Cucamonga, Calif., has been designing all sorts of maps through its Rancho Enterprise Geographic Information Systems (REGIS) project. Many of them have served specific city purposes, such as tracking code enforcement violations and offering police a command system tool for special events.

The utilitarian foundation of REGIS extends to its public-facing applications as well. One example is INsideRancho, a map built with economic development efforts in mind. The map lets users search and browse available buildings to suit business needs, narrowing results by square footage, zoning and building type. Users can also find businesses by name or address, and look at property exteriors via an embedded connection with Google Street View….(More)”

The Crusade Against Multiple Regression Analysis


Richard Nisbett at the Edge: (VIDEO) “…The thing I’m most interested in right now has become a kind of crusade against correlational statistical analysis—in particular, what’s called multiple regression analysis. Say you want to find out whether taking Vitamin E is associated with lower prostate cancer risk. You look at the correlational evidence and indeed it turns out that men who take Vitamin E have lower risk for prostate cancer. Then someone says, “Well, let’s see if we do the actual experiment, what happens.” And what happens when you do the experiment is that Vitamin E contributes to the likelihood of prostate cancer. How could there be differences? These happen a lot. The correlational—the observational—evidence tells you one thing, the experimental evidence tells you something completely different.

In the case of health data, the big problem is something that’s come to be called the healthy user bias, because the guy who’s taking Vitamin E is also doing everything else right. A doctor or an article has told him to take Vitamin E, so he does that, but he’s also the guy who’s watching his weight and his cholesterol, gets plenty of exercise, drinks alcohol in moderation, doesn’t smoke, has a high level of education, and a high income. All of these things are likely to make you live longer, to make you less subject to morbidity and mortality risks of all kinds. You pull one thing out of that correlate and it’s going to look like Vitamin E is terrific because it’s dragging all these other good things along with it.
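Nisbett's healthy-user bias is easy to reproduce in miniature. The simulation below is an illustrative sketch (the probabilities are invented, not from any study): health-consciousness drives both vitamin-taking and lower disease risk, while the vitamin itself does nothing, yet the observational comparison still makes takers look protected:

```python
import random

random.seed(0)

def simulate(n=100_000):
    """Simulate a confounded observational dataset.

    'health_conscious' raises the chance of taking the vitamin AND
    lowers disease risk; the vitamin has zero causal effect.
    Returns (case rate among takers, case rate among non-takers).
    """
    takers = taker_cases = others = other_cases = 0
    for _ in range(n):
        health_conscious = random.random() < 0.5
        takes_vitamin = random.random() < (0.8 if health_conscious else 0.2)
        # true risk depends only on the confounder, not the vitamin
        risk = 0.05 if health_conscious else 0.15
        case = random.random() < risk
        if takes_vitamin:
            takers += 1
            taker_cases += case
        else:
            others += 1
            other_cases += case
    return taker_cases / takers, other_cases / others

taker_rate, other_rate = simulate()
print(f"takers {taker_rate:.3f} vs non-takers {other_rate:.3f}")
# takers show a markedly lower case rate despite a zero-effect vitamin
```

A randomized experiment breaks the arrow from health-consciousness to vitamin-taking, which is exactly why the experimental results Nisbett describes can reverse the observational ones.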

This is not, by any means, limited to health issues. A while back, I read a government report in The New York Times on the safety of automobiles. The measure that they used was the deaths per million drivers of each of these autos. It turns out that, for example, there are enormously more deaths per million drivers who drive Ford F150 pickups than for people who drive Volvo station wagons. Most people’s reaction, and certainly my initial reaction to it was, “Well, it sort of figures—everybody knows that Volvos are safe.”

Let’s describe two people and you tell me who you think is more likely to be driving the Volvo and who is more likely to be driving the pickup: a suburban matron in the New York area and a twenty-five-year-old cowboy in Oklahoma. It’s obvious that people are not assigned their cars. We don’t say, “Billy, you’ll be driving a powder blue Volvo station wagon.” Because of this self-selection problem, you simply can’t interpret data like that. You know virtually nothing about the relative safety of cars based on that study.

I saw in The New York Times recently an article by a respected writer reporting that people who have elaborate weddings tend to have marriages that last longer. How would that be? Maybe it’s just all the darned expense and bother—you don’t want to get divorced. It’s a cognitive dissonance thing.

Let’s think about who makes elaborate plans for expensive weddings: people who are better off financially, which is by itself a good prognosis for marriage; people who are more educated, also a better prognosis; people who are richer; people who are older—the later you get married, the more likelihood that the marriage will last, and so on.

The truth is you’ve learned nothing. It’s like saying men whose names carry a III or IV suffix have longer-lasting marriages. Is it because of the suffix there? No, it’s because those people are the types who have a good prognosis for a lengthy marriage.

A huge range of science projects are done with multiple regression analysis. The results are often somewhere between meaningless and quite damaging….(More)”