Direct democracy may be key to a happier American democracy


Benjamin Radcliff and Gregory Shufeldt in The Conversation: “Is American democracy still ‘by the people, for the people’? According to recent research, it may not be. Martin Gilens at Princeton University confirms that the wishes of the American working and middle classes play essentially no role in our nation’s policy making. A BBC story rightly summarized this with the headline: US Is an Oligarchy, Not a Democracy.

However, new research by Benjamin Radcliff and Gregory Shufeldt suggests a ray of hope.

Ballot initiatives, they argue, may better serve the interests of ordinary Americans than laws passed by elected officials….

Today, 24 states allow citizens to directly vote on policy matters.

This year, more than 42 initiatives have already been approved for the ballot in 18 states.

Voters in California will decide diverse questions, including a ban on plastic bags, voter approval of state expenses greater than US$2 billion, improved school funding, and the future of bilingual education.

The people of Colorado will vote on replacing their current medical insurance programs with a single payer system, and in Massachusetts people may consider legalizing recreational marijuana….

However, many have pointed to problems with direct democracy in the form of ballot initiatives.

Maxwell Sterns at the University of Maryland, for example, writes that legislatures are better because initiatives are the tools of special interests and minorities. In the end, initiatives are voted upon by an unrepresentative subset of the population, Sterns concludes.

Others, like Richard Ellis of Willamette University, argue that the time-consuming process of gathering signatures introduces a bias toward moneyed interests. Some suggest this has damaged direct democracy in California, where professional petition writers and paid signature gatherers dominate the process. Moneyed interests also enjoy a natural advantage: they have the resources, which ordinary people lack, to mount media campaigns in support of their narrow interests.

To curb this kind of problem, bans on paying signature gatherers per signature have been proposed in many states, but none has yet passed a legislature. However, because Californians like direct democracy in principle, they have recently amended the process to allow for review and revision of initiatives, and they now require mandatory disclosures about the funding and origins of ballot initiatives.

Finally, some say initiatives can be confusing for voters, like the two recent Ohio propositions concerning marijuana, where one ballot proposition essentially canceled out the other. Similarly, Mississippi’s Initiative 42 required marking the ballot in two places for approval but only one for disapproval, resulting in numerous nullified “yes” votes.

Routes to happiness

Despite these flaws, our research shows that direct democracy might improve happiness in two ways.

One is through its psychological effect on voters, making them feel they have a direct impact on policy outcomes. This holds even when they dislike, and thus vote against, a particular proposition. The second is that it may indeed produce policies more consistent with human well-being.

The psychological benefits are obvious. By allowing people literally to be the government, just as in ancient Athens, direct democracy fosters higher levels of political efficacy. In short, people may feel they have some control over their lives. Direct democracy can also give people political capital, because it offers a means by which citizens may place issues on the ballot for popular vote, giving them an opportunity both to set the agenda and to vote on the outcome.

We think this is important given America’s declining faith in government. Today, only 19 percent believe the government is run for all citizens. The same percentage trusts the government to mostly do what is right. The poor and working classes are even more alienated….(More)”

Open Data Is Changing the World in Four Ways…


From The GovLab Blog: “New repository of case studies documents the impact of open data globally: odimpact.org.


Despite global commitments to and increasing enthusiasm for open data, little is actually known about its use and impact. What kinds of social and economic transformation has open data brought about, and what is its future potential? How—and under what circumstances—has it been most effective? How have open data practitioners mitigated risks and maximized social good?

Even as proponents of open data extol its virtues, the field continues to suffer from a paucity of empirical evidence. This limits our understanding of open data and its impact.

Over the last few months, The GovLab (@thegovlab), in collaboration with Omidyar Network (@OmidyarNetwork), has worked to address these shortcomings by developing 19 detailed open data case studies from around the world. The case studies have been selected for their sectoral and geographic representativeness. They are built in part from secondary sources (“desk research”), and also from more than 60 first-hand interviews with important players and key stakeholders. In a related collaboration with Omidyar Network, Becky Hogge (@barefoot_techie), an independent researcher, has developed an additional six open data case studies, all focused on the United Kingdom. Together, these case studies seek to provide a more nuanced understanding of the various processes and factors underlying the demand, supply, release, use and impact of open data.

Today, after receiving and integrating comments from dozens of peer reviewers through a unique open process, we are delighted to share an initial batch of 10 case studies, as well as three of Hogge’s UK-based stories. These are being made available at a new custom-built repository, Open Data’s Impact (http://odimpact.org), which will eventually house all the case studies, key findings across the studies, and additional resources related to the impact of open data. All this information will be stored in machine-readable HTML and PDF format, and will be searchable by area of impact, sector and region….(More)

Design-Led Innovation in the Public Sector


Manuel Sosa at INSEAD Knowledge: “When entering a government permit office, virtually everyone would prepare themselves for a certain amount of boredom and confusion. But resignation may well turn to surprise or even shock, if that office is Singapore’s Employment Pass Service Centre (EPSC), where foreign professionals go to receive their visa to work in the city-state. The ambience more closely resembles a luxury hotel lobby than a grim government agency, an impression reinforced by the roaming reception managers who greet arriving applicants, directing them to a waiting area with upholstered chairs and skyline views.

In a new case study, “Designing the Employment Pass Service Centre for the Ministry of Manpower, Singapore”, Prof. Michael Pich and I explore how even public organizations are beginning to use design to find and tap into innovation opportunities where few have thought to look. In the case of Singapore’s Ministry of Manpower (MOM), a design-led transformation of a single facility was the starting point of a drastic reconsideration of what a government agency could be.

Efficiency is not enough

Prior to opening the EPSC in July 2009, MOM’s Work Pass Division (WPD) had developed hyper-efficient methods to process work permits for foreign workers, who comprise approximately 40 percent of Singapore’s workforce. In fact, it was generally considered the most efficient department of its kind in the world. After 9/11, a mandatory-fingerprinting policy for white-collar workers was introduced, necessitating a standalone centre. The agency saw this as an opportunity to raise the efficiency bar even further.

Giving careful consideration to every aspect of the permit-granting process, the project team worked with a local vendor to overhaul the existing model. The proposal they ultimately presented to MOM assured almost unheard-of waiting times, as well as a more aesthetically pleasing look and feel….

Most public-sector organisations’ prickly interactions with the public can be explained by the simple fact that they lack competition. Government bodies are generally monopolies dispensing necessities, so on the whole they don’t feel compelled to agonise over their public face.

MOM and the Singapore government had a different idea. Aware that they were competing with other countries for top global talent, they recognised that the permit-granting process, in a very real sense, set the tone for foreign professionals’ entire experience of Singapore. Expats would be unlikely to remember precisely how long it took to get processed, but the quality of the service received would resonate in their minds and affect their impression of the country as a whole.

The design consultancy IDEO typically begins by concentrating on the user experience. In this case, in addition to observing and identifying what goes through the mind of a typical applicant during his or her journey through the existing system, the observation stage included talking to foreigners arriving in Singapore about their experience. IDEO discovered that professionals newly arrived in Singapore were embarking on an entirely new chapter of their lives, with all the expected stresses. The last thing they needed was more stress when receiving their permit. Hence, the EPSC entry hall is airy and free of clutter to create a sense of calm. The EPSC provides toys to keep kids entertained while their parents meet with agents and register for work passes. Visitors are always called by name, not number. Intimidating interview rooms were done away with in favour of open cabanas….In its initial customer satisfaction survey in 2010, the EPSC scored an average rating of 5.7 out of 6….(More)”

Big-data analytics: the power of prediction


Rachel Willcox in Public Finance: “The ability to anticipate demands will improve planning and financial efficiency, and collecting and analysing data will enable the public sector to look ahead…

Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time….
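
To make that concrete, here is a minimal sketch of the kind of hour-of-week demand baseline such a tool could compute from historical attendance records. This is a hypothetical illustration in Python; the file and column names are assumptions, not HealthIntell’s actual design.

```python
# Minimal sketch, assuming a CSV of historical A&E arrival timestamps.
# Column and file names are hypothetical, not HealthIntell's actual schema.
import pandas as pd

def hourly_demand_baseline(history_csv: str) -> pd.Series:
    """Average attendances for each (weekday, hour) slot across all days."""
    df = pd.read_csv(history_csv, parse_dates=["arrival_time"])
    df["date"] = df["arrival_time"].dt.date
    df["weekday"] = df["arrival_time"].dt.dayofweek
    df["hour"] = df["arrival_time"].dt.hour

    # Count arrivals per calendar day and hour, then average across days to
    # get a typical load per hour-of-week slot (slots with no arrivals on a
    # given day are simply absent in this simple sketch).
    per_day = df.groupby(["date", "weekday", "hour"]).size()
    return per_day.groupby(["weekday", "hour"]).mean()

# e.g. baseline[(5, 14)] -> mean Saturday 2pm attendances, a peak to staff for
```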

Rikke Duus, a senior teaching fellow at University College London’s School of Management, agrees strongly that an evidence-based approach to providing services, using data that is already available, is key to efficiency gains. Although the use of big data across the public sector trails well behind the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – are raising expectations about the sort of public services people expect to receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says….

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.
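
The article doesn’t describe the LFB’s model internals, so the following is only a minimal sketch of household risk scoring of this general kind, with a hypothetical feature table and a logistic regression standing in for the actual SAS implementation:

```python
# Minimal sketch, assuming a household-level table with historical fire
# outcomes. Feature names and the model choice are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

FEATURES = ["deprivation_index", "residents_over_65", "smoke_alarm_fitted",
            "prior_incidents_nearby", "dwelling_age_years"]

def rank_households(df: pd.DataFrame) -> pd.DataFrame:
    """Fit risk on past outcomes, then rank households for prevention visits."""
    model = LogisticRegression(max_iter=1000)
    model.fit(df[FEATURES], df["had_fire"])  # historical labels
    scored = df.assign(risk=model.predict_proba(df[FEATURES])[:, 1])
    return scored.sort_values("risk", ascending=False)  # visit top homes first
```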

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety….(More)”

 

Understanding Participatory Governance


An analysis of “Participants’ Motives for Participation” by Per Gustafson and Nils Hertting: “Despite the growing body of literature on participatory and collaborative governance, little is known about citizens’ motives for participation in such new governance arrangements. The present article argues that knowledge about these motives is essential for understanding the quality and nature of participatory governance and its potential contribution to the overall political and administrative system.

Survey data were used to explore participants’ motives for participating in a large-scale urban renewal program in Stockholm, Sweden. The program was neighborhood-based, characterized by self-selected and repeated participation, and designed to influence local decisions on the use of public resources.

Three types of motives were identified among the participants: (a) Common good motives concerned improving the neighborhood in general and contributing knowledge and competence. (b) Self-interest motives reflected a desire to improve one’s own political efficacy and to promote the interest of one’s own group or family. (c) Professional competence motives represented a largely apolitical type of motive, often based on a professional role. Different motives were expressed by different categories of participants and were also associated with different perceptions concerning program outcomes.

Further analysis suggested that participatory governance may represent both an opportunity for marginalized groups to empower themselves and an opportunity for more privileged groups to act as local “citizen representatives” and articulate the interests of their neighborhoods. These findings call for a more complex understanding of the role and potential benefits of participatory governance…(More).”

 

Core Concepts: Computational social science


Adam Mann at PNAS: “Cell phone tower data predicts which parts of London can expect a spike in crime (1). Google searches for polling place information on the day of an election reveal the consequences of different voter registration laws (2). Mathematical models explain how interactions among financial investors produce better yields, and even how they generate economic bubbles (3).

[Figure: Using cell-phone and taxi GPS data, researchers classified people in San Francisco into “tribal networks,” clustering them according to their behavioral patterns. Students, tourists, and businesspeople all travel through the city in various ways, congregating and socializing in different neighborhoods. Image courtesy of Alex Pentland (Massachusetts Institute of Technology, Cambridge, MA).]

[Figure: Where people hail from in the Mexico City area, here indicated by different colors, feeds into a crime-prediction model devised by Alex Pentland and colleagues (6). Image courtesy of Alex Pentland (Massachusetts Institute of Technology, Cambridge, MA).]

 These are just a few examples of how a suite of technologies is helping bring sociology, political science, and economics into the digital age. Such social science fields have historically relied on interviews and survey data, as well as censuses and other government databases, to answer important questions about human behavior. These tools often produce results based on individuals—showing, for example, that a wealthy, well-educated, white person is statistically more likely to vote (4)—but struggle to deal with complex situations involving the interactions of many different people.

 

A growing field called “computational social science” is now using digital tools to analyze the rich and interactive lives we lead. The discipline uses powerful computer simulations of networks, data collected from cell phones and online social networks, and online experiments involving hundreds of thousands of individuals to answer questions that were previously impossible to investigate. Humans are fundamentally social creatures and these new tools and huge datasets are giving social scientists insights into exactly how connections among people create societal trends or heretofore undetected patterns, related to everything from crime to economic fortunes to political persuasions. Although the field provides powerful ways to study the world, it’s an ongoing challenge to ensure that researchers collect and store the requisite information safely, and that they and others use that information ethically….(More)”

Democracy Dashboard


The Brookings Democracy Dashboard is a collection of data designed to help users evaluate the performance of the political system and government in the United States. The Democracy Dashboard displays trends in democracy and governance in seven key areas: elections administration; democratic participation and voting; public opinion; institutional functioning in the executive, legislative, and judicial branches; and media capacity.

The dashboard—and accompanying analyses on the FixGov blog—provide information that can help efforts to strengthen democracy and improve governance in the U.S.

Data will be released on a rolling basis during 2016 and expanded in future election years. Scroll through the interactive charts below to explore data points and trends in key areas for midterm and presidential elections and/or download the data in Excel format here »….(More)”

 

Yahoo Releases the Largest-ever Machine Learning Dataset for Researchers


Suju Rajan at Yahoo Labs: “Data is the lifeblood of research in machine learning. However, access to truly large-scale datasets is a privilege that has been traditionally reserved for machine learning researchers and data scientists working at large companies – and out of reach for most academic researchers.

Research scientists at Yahoo Labs have long enjoyed working on large-scale machine learning problems inspired by consumer-facing products. This has enabled us to advance the thinking in areas such as search ranking, computational advertising, information retrieval, and core machine learning. A key aspect of interest to the external research community has been the application of new algorithms and methodologies to production traffic and to large-scale datasets gathered from real products.

Today, we are proud to announce the public release of the largest-ever machine learning dataset to the research community. The dataset stands at a massive ~110B events (13.5TB uncompressed) of anonymized user-news item interaction data, collected by recording the user-news item interactions of about 20M users from February 2015 to May 2015.

The Yahoo News Feed dataset is a collection based on a sample of anonymized user interactions on the news feeds of several Yahoo properties, including the Yahoo homepage, Yahoo News, Yahoo Sports, Yahoo Finance, Yahoo Movies, and Yahoo Real Estate.

Our goals are to promote independent research in the fields of large-scale machine learning and recommender systems, and to help level the playing field between industrial and academic research. The dataset is available as part of the Yahoo Labs Webscope data-sharing program, which is a reference library of scientifically-useful datasets comprising anonymized user data for non-commercial use.

In addition to the interaction data, we are providing categorized demographic information (age range, gender, and generalized geographic data) for a subset of the anonymized users. On the item side, we are releasing the title, summary, and key-phrases of the pertinent news article. The interaction data is timestamped with the relevant local time and also contains partial information about the device on which the user accessed the news feeds, which allows for interesting work in contextual recommendation and temporal data mining….(More)”
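
Based solely on the fields listed above, one might model a record in such a dataset as follows — a hypothetical sketch, not the actual Webscope schema:

```python
# Hypothetical record layout inferred from the description above; the real
# Webscope dataset may structure these fields differently.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NewsInteraction:
    user_id: str               # anonymized user identifier
    item_id: str               # news item interacted with
    local_timestamp: int       # interactions are timestamped in local time
    device: Optional[str]      # partial device information
    age_range: Optional[str]   # demographics exist only for a subset of users
    gender: Optional[str]
    geo: Optional[str]         # generalized geographic data

@dataclass
class NewsItem:
    item_id: str
    title: str
    summary: str
    key_phrases: list[str]
```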

7 Ways Local Governments Are Getting Creative with Data Mapping


Ben Miller at GovTech:  “As government data collection expands, and as more of that data becomes publicly available, more people are looking to maps as a means of expressing the information.

And depending on the type of application, a map can be useful for both the government and its constituents. Many maps help government servants operate more efficiently and save money, while others answer residents’ questions so they don’t have to call a government worker for the answer…..

Here are seven examples of state and local governments using maps to help themselves and the people they serve.

1. DISTRICT OF COLUMBIA, IOWA GET LOCAL AND CURRENT WITH THE WEATHER

[Image: Washington, D.C. snow plow map]

As Winter Storm Jonas was busy dropping nearly 30 inches of snow on the nation’s capital, officials in D.C. were working to clear it. And thanks to a mapping application they launched, citizens could see exactly how the city was going about that business.

The District of Columbia’s snow map lets users enter an address, and then shows what snow plows did near that address within a given range of days. The map also shows where the city received 311 requests for snow removal and gives users a chance to look at recent photos from road cameras showing driving conditions…..
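
A sketch of that kind of query — recent plow GPS pings within a radius of a geocoded address — might look like this; the data layout and helper names are assumptions, not the District’s implementation:

```python
# Minimal sketch: filter plow GPS events to a time window and radius.
# Event structure ({"time", "lat", "lon"}) is a hypothetical assumption.
from datetime import datetime, timedelta
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def plows_near(events, lat, lon, days=3, radius_km=0.5):
    """Plow pings within `days` of now and `radius_km` of the address."""
    cutoff = datetime.now() - timedelta(days=days)
    return [e for e in events
            if e["time"] >= cutoff
            and haversine_km(lat, lon, e["lat"], e["lon"]) <= radius_km]
```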

2. LOS ANGELES MAPS EL NIÑO RESOURCES, TRENDS

[Image: El Niño Watch map]

Throughout the winter, weather monitoring experts warned the public time and again that an El Niño system was brewing in the Pacific Ocean that looked to be one of the largest, if not the largest, ever. That would mean torrents of rain for a parched state that’s seen mudslides and flooding during storms in the past.

So to prepare its residents, the city of Los Angeles published a map in January that lets users see both decision-informing trends and the location of resources. Using the application, one can toggle layers that let them know what the weather is doing around the city, where traffic is backed up, where the power is out, where they can find sand bags to prevent flood damage and more….

3. CALIFORNIA DIVES DEEP INTO AIR POLLUTION RISKS

[Image: CalEnviroScreen map]

….So, faced with a legislative mandate to identify disadvantaged communities, the California Office of Environmental Health Hazard Assessment decided that it wouldn’t just examine smog levels — it would also take a look at the prevalence of at-risk people across the state.

The result is a series of three maps: the first two examine each factor on its own, and the third combines them. That allows the state and its residents to see the places where air pollution poses the greatest risk to the most vulnerable people….

4. STREAMLINING RESIDENT SERVICE INFORMATION

[Image: Manassas curbside pickup map]

The city of Manassas, Va., relied on an outdated paper map and a long-time, well-versed staffer to answer questions about municipal curbside pickup services until they launched this map in 2014. The map allows users to enter their address, and then gives them easy-to-read information about when to put out various things on their curb for pickup.

That’s useful because the city’s fall leaf collection schedule changes every year. So the map not only acts as a benefit to residents who want information, but to city staff who don’t have to deal with as many calls.

The map also shows users the locations of resources they can use and gives them city phone numbers in case they still have questions, and displays it all in a popup pane at the bottom of the map.

5. PLACING TOOLS IN THE HANDS OF THE PUBLIC

A lot of cities and counties have started publishing online maps showing city services and releasing government data.

But Chicago, Boston and Philadelphia stand out as examples of maps that take the idea one step further — because each one offers a staggering amount of choices for users.

Chicago’s new OpenGrid map, just launched in January, is a versatile map that lets users search for certain data like food inspection reports, street closures, potholes and more. That’s enough to answer a lot of questions, but what adds even more utility is the map’s various narrowing tools. Users can narrow searches to a zip code, or they can draw a shape on the map and only see results within that shape. They can perform sub-searches within results and they can choose how they’d like to see the data displayed.
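
A minimal sketch of the draw-a-shape filter, assuming each record carries coordinates and using the shapely library (OpenGrid’s actual implementation isn’t described in the article):

```python
# Minimal sketch of point-in-polygon filtering for a user-drawn shape.
# Record structure ({"lon", "lat", ...}) is an assumption for illustration.
from shapely.geometry import Point, Polygon

def records_in_shape(records, vertices):
    """Keep records whose coordinates fall inside the user-drawn polygon.

    vertices: list of (lon, lat) pairs tracing the shape drawn on the map.
    """
    shape = Polygon(vertices)
    return [r for r in records
            if shape.contains(Point(r["lon"], r["lat"]))]
```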

Philadelphia’s platform makes use of buttons, icons and categories to help users sift through the spatially-enabled data available to them. Options include future lane closures, bicycle paths, flu shots, city resources, parks and more.

Boston’s platform is open for users to submit their own maps. And submit they have. The city portal offers everything from maps of bus stops to traffic data pulled from the Waze app.

6. HOUSTON TRANSFORMS SERVICE REQUEST DATA

[Image: Houston 311 service request map]

A 311 service functions as a means of bringing problems to city staff’s attention. But the data itself only goes so far — it needs interpretation.

Houston’s 311 service request map helps users easily analyze the data so as to spot trends. The tool offers lots of ways to narrow the data down, and can isolate many different kinds of requests so users can see whether one problem is reported more often in certain areas.

7. GUIDING BUSINESS GROWTH

For the last several years, the city of Rancho Cucamonga, Calif., has been designing all sorts of maps through its Rancho Enterprise Geographic Information Systems (REGIS) project. Many of them have served specific city purposes, such as tracking code enforcement violations and offering police a command system tool for special events.

The utilitarian foundation of REGIS extends to its public-facing applications as well. One example is INsideRancho, a map built with economic development efforts in mind. The map lets users search and browse available buildings to suit business needs, narrowing results by square footage, zoning and building type. Users can also find businesses by name or address, and look at property exteriors via an embedded connection with Google Street View….(More)”

The Crusade Against Multiple Regression Analysis


Richard Nisbett at the Edge: (VIDEO) “…The thing I’m most interested in right now has become a kind of crusade against correlational statistical analysis—in particular, what’s called multiple regression analysis. Say you want to find out whether taking Vitamin E is associated with lower prostate cancer risk. You look at the correlational evidence and indeed it turns out that men who take Vitamin E have lower risk for prostate cancer. Then someone says, “Well, let’s see if we do the actual experiment, what happens.” And what happens when you do the experiment is that Vitamin E contributes to the likelihood of prostate cancer. How could there be differences? These happen a lot. The correlational—the observational—evidence tells you one thing, the experimental evidence tells you something completely different.

In the case of health data, the big problem is something that’s come to be called the healthy user bias, because the guy who’s taking Vitamin E is also doing everything else right. A doctor or an article has told him to take Vitamin E, so he does that, but he’s also the guy who’s watching his weight and his cholesterol, gets plenty of exercise, drinks alcohol in moderation, doesn’t smoke, has a high level of education, and a high income. All of these things are likely to make you live longer, to make you less subject to morbidity and mortality risks of all kinds. You pull one thing out of that correlate and it’s going to look like Vitamin E is terrific because it’s dragging all these other good things along with it.
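
A small simulation shows how the healthy user bias can flip a conclusion: below, a hidden “healthiness” trait drives both vitamin use and lower cancer risk, while the vitamin itself is mildly harmful — yet the raw comparison still flatters the vitamin. All numbers are illustrative assumptions:

```python
# Illustrative simulation of the healthy user bias; all effect sizes invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
healthiness = rng.normal(size=n)                       # hidden confounder

# Healthier people are more likely to take the vitamin...
takes_vitamin = rng.random(n) < 1 / (1 + np.exp(-2 * healthiness))

# ...and less likely to get cancer. The vitamin's own effect is +0.3 on the
# log-odds scale, i.e. genuinely harmful.
log_odds = -2 - 1.5 * healthiness + 0.3 * takes_vitamin
cancer = rng.random(n) < 1 / (1 + np.exp(-log_odds))

print("cancer rate, vitamin takers:", cancer[takes_vitamin].mean())
print("cancer rate, non-takers:    ", cancer[~takes_vitamin].mean())
# Takers show a *lower* raw cancer rate: healthiness is dragged along with
# vitamin use, masking the harmful true effect -- Nisbett's point exactly.
```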

This is not, by any means, limited to health issues. A while back, I read a government report in The New York Times on the safety of automobiles. The measure that they used was the deaths per million drivers of each of these autos. It turns out that, for example, there are enormously more deaths per million drivers who drive Ford F150 pickups than for people who drive Volvo station wagons. Most people’s reaction, and certainly my initial reaction to it was, “Well, it sort of figures—everybody knows that Volvos are safe.”

Let’s describe two people and you tell me who you think is more likely to be driving the Volvo and who is more likely to be driving the pickup: a suburban matron in the New York area and a twenty-five-year-old cowboy in Oklahoma. It’s obvious that people are not assigned their cars. We don’t say, “Billy, you’ll be driving a powder blue Volvo station wagon.” Because of this self-selection problem, you simply can’t interpret data like that. You know virtually nothing about the relative safety of cars based on that study.

I saw in The New York Times recently an article by a respected writer reporting that people who have elaborate weddings tend to have marriages that last longer. How would that be? Maybe it’s just all the darned expense and bother—you don’t want to get divorced. It’s a cognitive dissonance thing.

Let’s think about who makes elaborate plans for expensive weddings: people who are better off financially, which is by itself a good prognosis for marriage; people who are more educated, also a better prognosis; people who are richer; people who are older—the later you get married, the more likelihood that the marriage will last, and so on.

The truth is you’ve learned nothing. It’s like saying men who are a Somebody III or IV have longer-lasting marriages. Is it because of the suffix there? No, it’s because those people are the types who have a good prognosis for a lengthy marriage.

A huge range of science projects are done with multiple regression analysis. The results are often somewhere between meaningless and quite damaging….(More)