Mindless – Why Smarter Machines are Making Dumber Humans


New book by Simon Head: “The tools of corporate efficiency—expert systems, databases, and operations management—have improved our lives significantly, but with a cost: they’re turning us into mindless drones.
We live in the age of Computer Business Systems (CBSs)—the highly complex, computer-intensive management programs on which large organizations increasingly rely. In Mindless, Simon Head argues that these systems have come to trump human expertise, dictating the goals and strategies of a wide array of businesses, and de-skilling the jobs of middle-class workers in the process. CBSs are especially dysfunctional, Head argues, when they apply their disembodied expertise to transactions between humans, as in health care, education, customer relations, and human resources management. And yet there are industries with more human approaches, as Head illustrates with specific examples, whose lead we must follow and extend to the mainstream American economy.
Mindless illustrates the shortcomings of CBSs, providing an in-depth and disturbing look at how human dignity is slipping away as we become cogs on a white-collar assembly line.”

Can Twitter Predict Major Events Such As Mass Protests?


Emerging Technology From the arXiv: “The idea that social media sites such as Twitter can predict the future has a controversial history. In the last few years, various groups have claimed to be able to predict everything from the outcome of elections to the box office takings for new movies.
It’s fair to say that these claims have generated their fair share of criticism. So it’s interesting to see a new claim come to light.
Today, Nathan Kallus at the Massachusetts Institute of Technology in Cambridge says he has developed a way to predict crowd behaviour using statements made on Twitter. In particular, he has analysed the tweets associated with the 2013 coup d’état in Egypt and says that the civil unrest associated with this event was clearly predictable days in advance.
It’s not hard to imagine how the future behaviour of crowds might be embedded in the Twitter stream. People often signal their intent to meet in advance and even coordinate their behaviour using social media. So this social media activity is a leading indicator of future crowd behaviour.
That makes it seem clear that predicting future crowd behaviour is simply a matter of picking this leading indicator out of the noise.
Kallus says this is possible by mining tweets for any mention of future events and then analysing trends associated with them. “The gathering of crowds into a single action can often be seen through trends appearing in this data far in advance,” he says.
It turns out that exactly this kind of analysis is available from a company called Recorded Future based in Cambridge, which scans 300,000 different web sources in seven different languages from all over the world. It then extracts mentions of future events for later analysis….
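The method Kallus describes reduces, at its simplest, to extracting mentions of a future event from a tweet stream and watching for a trend in those mentions before the event date. A toy sketch of that idea is below; the tweets, the target date and the regular expression are invented for illustration, and real systems such as Recorded Future do far richer multilingual event extraction.

```python
import re
from collections import Counter

# Hypothetical pattern for mentions of a planned June 30 protest date,
# either as plain text ("on June 30") or as a hashtag ("#June30").
DATE_PATTERN = re.compile(r"(?:\bjune\s+30\b|#june30\b)", re.IGNORECASE)

def count_future_mentions(tweets):
    """Count, per posting day, tweets that reference the target future date.

    A count that climbs day over day, ahead of the date itself, is the
    kind of leading indicator of crowd behaviour described above.
    """
    counts = Counter()
    for day, text in tweets:  # each tweet is (posting day, tweet text)
        if DATE_PATTERN.search(text):
            counts[day] += 1
    return counts

# Invented sample data, for illustration only.
sample = [
    ("2013-06-26", "Everyone to Tahrir on June 30! #June30"),
    ("2013-06-26", "Nice weather today"),
    ("2013-06-27", "We march #june30"),
    ("2013-06-27", "See you on June 30"),
]
mentions = count_future_mentions(sample)
# Mentions of the target date accumulate in the days before the event.
```

The hard parts flagged later in the piece (false positives, propaganda, demographics) are exactly what this naive counting cannot address on its own.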
The bigger question is whether it’s possible to pick out this evidence in advance. In other words, is it possible to make predictions before the events actually occur?
That’s not so clear, and there are good reasons to be cautious. First of all, while it’s possible to correlate Twitter activity with real protests, it’s also necessary to rule out false positives. There may be significant Twitter trends that do not lead to significant protests in the streets. Kallus does not adequately address the question of how to tell these things apart.
Then there is the question of whether tweets are trustworthy. It’s not hard to imagine that when it comes to issues of great national consequence, propaganda, rumor and irony may play a significant role. So how to deal with this?
There is also the question of demographics and whether tweets truly represent the intentions and activity of the population as a whole. People who tweet are overwhelmingly likely to be young, but there is another, silent majority that plays a hugely important role. So can the Twitter firehose really represent the intentions of this part of the population too?
The final challenge is in the nature of prediction. If the Twitter feed is predictive, then what’s needed is evidence that it can be used to make real predictions about the future and not just historical predictions about the past.
We’ve looked at some of these problems with the predictive power of social media before and the challenge is clear: if there is a claim to be able to predict the future, then this claim must be accompanied by convincing evidence of an actual prediction about an event before it happens.
Until then, it would surely be wise to be circumspect about the predictive powers of Twitter and other forms of social media.
Ref: arxiv.org/abs/1402.2308: Predicting Crowd Behavior with Big Public Data”

Developing an open government plan in the open


Tim Hughes at OGP: “New laws, standards, policies, processes and technologies are critical for opening up government, but arguably just as (if not more) important are new cultures, behaviours and ways of working within government and civil society.
The development of an OGP National Action Plan, therefore, presents a twofold opportunity for opening up government: On the one hand it should be used to deliver a set of robust and ambitious commitments to greater transparency, participation and accountability. But just as importantly, the process of developing a NAP should also be used to model new forms of open and collaborative working within government and civil society. These two purposes of a NAP should be mutually reinforcing. An open and collaborative process can – as was the case in the UK – help to deliver a more robust and ambitious action plan, which in turn can demonstrate the efficacy of working in the open.
You could even go one step further to say that the development of a National Action Plan should present an (almost) “ideal” vision of what open government in a country could look like. If governments aren’t being open as they’re developing an open government action plan, then there’s arguably little hope that they’ll be open elsewhere.
As coordinators of the UK OGP civil society network, this was on our mind at the beginning and throughout the development of the UK’s 2013-15 National Action Plan. Crucially, it was also on the minds of our counterparts in the UK Government. From the start, therefore, the process was developed with the intention that it should itself model the principles of open government. Members of the UK OGP civil society network met with policy officials from the UK Government on a regular basis to scope out and develop the action plan, and we published regular updates of our discussions and progress for others to follow and engage with. The process wasn’t without its challenges – and there’s still much more we can do to open it up further in the future – but it was successful in moving far beyond the typical model of government deciding, announcing and defending its intentions and in delivering an action plan with some strong and ambitious commitments.
One of the benefits of working in an open and collaborative way is that it enabled us to conduct and publish a full – warts and all – review of what went well and what didn’t. So, consider this an invitation to delve into our successes and failures, a challenge to do it better and a request to help us to do so too. Head over to the UK OGP civil society network blog to read about what we did, and tell us what you think: http://www.opengovernment.org.uk/national-action-plan/story-of-the-uk-national-action-plan-2013-15/”

Choosing Not to Choose


New paper by Cass Sunstein: “Choice can be an extraordinary benefit or an immense burden. In some contexts, people choose not to choose, or would do so if they were asked. For example, many people prefer not to make choices about their health or retirement plans; they want to delegate those choices to a private or public institution that they trust (and may well be willing to pay a considerable amount for such delegations). This point suggests that however well-accepted, the line between active choosing and paternalism is often illusory. When private or public institutions override people’s desire not to choose, and insist on active choosing, they may well be behaving paternalistically, through a form of choice-requiring paternalism. Active choosing can be seen as a form of libertarian paternalism, and a frequently attractive one, if people are permitted to opt out of choosing in favor of a default (and in that sense not to choose); it is a form of nonlibertarian paternalism insofar as people are required to choose. For both ordinary people and private or public institutions, the ultimate judgment in favor of active choosing, or in favor of choosing not to choose, depends largely on the costs of decisions and the costs of errors. But the value of learning, and of developing one’s own preferences and values, is also important, and may argue on behalf of active choosing, and against the choice not to choose. For law and policy, these points raise intriguing puzzles about the idea of “predictive shopping,” which is increasingly feasible with the rise of large data sets containing information about people’s previous choices. Some empirical results are presented about people’s reactions to predictive shopping; the central message is that most (but not all) people reject predictive shopping in favor of active choosing.”

Structuring Big Data to Facilitate Democratic Participation in International Law


New paper by Roslyn Fuller: “This is an interdisciplinary article focusing on the interplay between information and communication technology (ICT) and international law (IL). Its purpose is to open up a dialogue between ICT and IL practitioners that focuses on the ways in which ICT can enhance equitable participation in international legal structures, particularly through capturing the possibilities associated with big data. This depends on the ability of individuals to access big data, for it to be structured in a manner that makes it accessible and for the individual to be able to take action based on it.”

LocalWiki turns open local data into open local knowledge


Marina Kukso at OpenGovVoices: “LocalWiki is an open knowledge project focusing on giving everyone the opportunity to collaborate to create and share all kinds of information about the place where they live.

The project started in 2004 in Davis, Calif. as the Davis Wiki, now the primary local information resource for Davis residents. One in seven residents have contributed to the project and, in a given month, almost every resident uses it.

In 2010, we received funding from the Knight Foundation to bring LocalWiki to many more communities. We created wiki software specifically designed for local collaboration and have seen adoption in more than 70 communities worldwide. People now use LocalWiki for everything from mapping out nature trails to planning a grassroots mayoral election candidate debate….

There’s a great deal of expertise within our communities, and at LocalWiki we see part of the mission of our work as providing a platform for people to contextualize and make meaning out of the information made available through open data and open gov efforts at the local level.

There are obvious limitations to the ability of programming laypeople to use open data to create new knowledge that drives action, most notably many people’s lack of expertise in data analysis. With LocalWiki we hope to at least address some of those limitations by making it significantly easier for people to collaborate to create meaning out of open data and to share it with others. This is why LocalWiki has a WYSIWYG editor, includes mapping as a core feature and prioritizes usability in its design.

Finally, adding information about a community on LocalWiki is a way to create new open data. It’s incredibly important to make things like internal city crime statistics public, but residents’ perspectives on the relative safety of their neighborhoods are a different kind of data that provides additional insights into public safety challenges and adds complexity to the picture created by statistics.”

11 ways to rethink open data and make it relevant to the public


Miguel Paz at IJNET: “It’s time to transform open data from a trendy concept among policy wonks and news nerds into something tangible in the everyday lives of citizens, businesses and grassroots organizations. Here are some ideas to help us get there:
1. Improve access to data
Craig Hammer from the World Bank has tackled this issue, stating that “Open Data could be the game changer when it comes to eradicating global poverty”, but only if governments make available online data that become actionable intelligence: a launch pad for investigation, analysis, triangulation, and improved decision making at all levels.
2. Create open data for the end user
As Hammer wrote in a blog post for the Harvard Business Review, while the “opening” has generated excitement from development experts, donors, several government champions, and the increasingly mighty geek community, the hard reality is that much of the public has been left behind, or tacked on as an afterthought. Let’s get out of the building and start working for the end user.
3. Show, don’t tell
Regular folks don’t know what “open data” means. Actually, they probably don’t care what we call it and don’t know if they need it. Apple’s Steve Jobs said that a lot of times, people don’t know what they want until you show it to them. We need to stop telling them they need it and start showing them why they need it, through actionable user experience.
4. Make it relevant to people’s daily lives, not just to NGOs and policymakers’ priorities
A study of the use of open data and transparency in Chile showed that the top 10 uses were for things that affect citizens’ lives directly, for better or for worse: data on government subsidies and support, legal certificates, information services, paperwork. If the data doesn’t speak to priorities at the household or individual level, we’ve lost the value of both the “opening” of data, and the data itself.
5. Invite the public into the sandbox
We need to give people “better tools to not only consume, but to create and manipulate data,” says my colleague Alvaro Graves, Poderopedia’s semantic web developer and researcher. This is what Code for America does, and it’s also what happened with the advent of Web 2.0, when the availability of better tools, such as blogging platforms, helped people create and share content.
6. Realize that open data are like QR codes
Everyone talks about open data the way they used to talk about QR codes–as something groundbreaking. But as with QR codes, open data only succeeds with the proper context to satisfy the needs of citizens. Context, more than anything else, determines whether open data succeeds as a tool for global change.
7. Make open data sexy and pop, like Jess3.com
Geeks became popular because they made useful and cool things that could be embraced by end users. Open data geeks need to stick with that program.
8. Help journalists embrace open data
Jorge Lanata, a famous Argentinian journalist who is now being targeted by the Cristina Fernández administration for uncovering government corruption scandals, once said that 50 percent of the success of a story or newspaper is assured if journalists like it.
That’s true of open data as well. If journalists understand its value for the public interest and learn how to use it, so will the public. And if they do, the winds of change will blow. Governments and the private sector will be forced to provide better, more up-to-date and standardized data. Open data will be understood not as a concept but as a public information source as relevant as any other. We need to teach Latin American journalists to be part of this.
9. News nerds can help you put your open data to good use
In order to boost the use of open data by journalists, we need news nerds: teams of lightweight but heavily tech-armored journalist-programmers who can teach colleagues how open data brings us high-impact storytelling that can change public policies and hold authorities accountable.
News nerds can also help us with “institutionalizing data literacy across societies” as Hammer puts it. ICFJ Knight International Journalism Fellow and digital strategist Justin Arenstein calls these folks “mass mobilizers” of information. Alex Howard “points to these groups because they can help demystify data, to make it understandable by populations and not just statisticians.”
I call them News Ninja Nerds: accelerator taskforces that can foster innovations in news, data and transparency in a speedy way, saving governments and organizations time and a lot of money. Projects like ProPublica’s Dollars for Docs are great examples of what can be achieved if you mix FOIA, open data and the will to provide news in the public interest.
10. Rename open data
Part of the reason people don’t embrace concepts such as open data is that they are part of a lingo that has nothing to do with them. No empathy involved. Let’s start talking about people’s right to know and use the data generated by governments. As Tim O’Reilly puts it: “Government as a Platform for Greatness,” with examples we can relate to, instead of dead PDFs and dirty databases.
11. Don’t expect open data to substitute for thinking or reporting
Investigative reporting can benefit from it. But “there is no substitute for the kind of street-level digging, personal interviews, and detective work” that great journalism projects entail, says David Kaplan in a great post entitled Why Open Data Is Not Enough.”

Three ways digital leaders can operate successfully in local government


in The Guardian: “The landscape of digital is constantly changing and being redefined with every new development, technology breakthrough, success and failure. We need digital public sector leaders who can properly navigate this environment, and follow these three guidelines.
1. Champion open data
We need leaders who can ensure that information and data are open by default, and secure when absolutely required. Too often councils commission digital programmes only to find the data generated does not easily integrate with other systems, or that data is not council-owned and can only be accessed at further cost.
2. Don’t get distracted by flashy products
Leaders must adopt an agnostic approach to technology, and not get seduced by the ever-increasing number of digital technologies and lose sight of real user and business needs.
3. Learn from research
Tales of misplaced IT investments plague the public sector, and senior leaders are understandably hesitant when considering future investments. To avoid causing even more disruption, we should learn from research findings such as those of the New Local Government Network’s recent digital roundtables on what works.
Making the decision to properly invest in digital leadership will not just improve decision making about digital solutions and strategies. It will also bring in the knowledge needed to navigate the complex security requirements that surround public-sector IT. And it will ensure that practices honed in the digital environment become embedded in the council more generally.
In Devon, for example, we are making sure all the services we offer online are based on the experience and behaviour of users. This has led service teams to refocus on the needs of citizens rather than those of the organisation. And our experiences of future proofing, agility and responsiveness are informing service design throughout the council.
What’s holding us back?
Across local government there is still a fragmented approach to collaboration. In central government, the Government Digital Service is charged with providing the right environment for change across all government departments. However, in local government, digital leaders often work alone without a unifying strategy across the sector. It is important to understand and recognise that the Government Digital Service is more than just a team pushing and promoting digital in central government: they are the future of central government, attempting to transform everything.
Initiatives such as LocalGov Digital, O2’s Local Government Digital Fund, Forum (the DCLG’s local digital alliance) and the Guardian’s many public sector forums and networks are all helping to push forward debate, spread good practice and build a sense of urgent optimism around the local government digital agenda. But at present there is no equivalent to the unified force of the Government Digital Service.”

Canadian Organizations Join Forces to Launch Open Data Institute to Foster Open Government


Press Release: “The Canadian Digital Media Network, the University of Waterloo, Communitech, OpenText and Desire2Learn today announced the creation of the Open Data Institute.

The Open Data Institute, which received support from the Government of Canada in this week’s budget, will work with governments, academic institutions and the private sector to solve challenges facing “open government” efforts and realize the full potential of “open data.”
According to a statement, partners will work on development of common standards, the integration of data from different levels of government and the commercialization of data, “allowing Canadians to derive greater economic benefit from datasets that are made available by all levels of government.”
The Open Data Institute is a public-private partnership. Founding partners will contribute $3 million in cash and in-kind contributions over three years to establish the institute, a figure that has been matched by the Government of Canada.
“This is a strategic investment in Canada’s ability to lead the digital economy,” said Kevin Tuer, Managing Director of CDMN. “Similar to how a common system of telephone exchanges allowed world-wide communication, the Open Data Institute will help create a common platform to share and access datasets.”
“This will allow the development of new applications and products, creating new business opportunities and jobs across the country,” he added.
“The Institute will serve as a common forum for government, academia and the private sector to collaborate on Open Government initiatives with the goal of fueling Canadian tech innovation,” noted OpenText President and CEO Mark J. Barrenechea.
“The Open Data Institute has the potential to strengthen the regional economy and increase our innovative capacity,” added Feridun Hamdullahpur, president and vice-chancellor of the University of Waterloo.”

The newsonomics of measuring the real impact of news


Ken Doctor at Nieman Journalism Lab: “Hello there! It’s me, your friendly neighborhood Tweet Button. What if you could tap me and unlock a brand new source of funding for startup news sources of all kinds? What if, even better, you the reader could tap that money loose with a single click?
That’s the delightfully simple conceit behind a little widget, Impaq.me, you may have seen popping up as you traverse the news web. It’s social. It’s viral. It uses OPM (Other People’s Money) — and maybe a little bit of your own. It makes a new case to funders and maybe commercial sponsors. And it spits out metrics around the clock. It aims to be a convergence widget, acting on that now-aging idea that our attention is as important as our wallet. Consider it a new digital Swiss Army knife for the attention economy.
It’s impossible to tell how much of an impact Impaq.me may have. It’s still in its second round of testing at six of the U.S.’s most successful independent nonprofit startups — MinnPost, Center for Investigative Reporting, The Texas Tribune, Voice of San Diego, ProPublica, and the Center for Public Integrity — but as in all things digital, timing is everything. And that timing seems right.
First, let’s consider that spate of new news sites that have sprouted with the winter rains — Bill Keller’s and Neil Barsky’s Marshall Project being only the latest. It’s been quite a run — from Ezra Klein’s Project X to Pierre Omidyar’s First Look (and just launched The Intercept) to the reimagining of FiveThirtyEight. While they encompass a broad range of business models and goals (“The newsonomics of why everyone seems to be starting a news site”), they all need two things: money and engagement. Or, maybe better ordered, engagement and money. The dance between the two is still in the early stages of Internet choreography. Get the sequences right and you win.
Second, and related, is the big question of “social” and how our sharing of news is changing the old publishing dynamic of editors deciding what we’re going to read. Just this week, two pieces here at the Lab — one on Upworthy’s influence and one on the social/search tango — highlighted the still-being-understood role of social in our news-reading lives.
Third, funders of news sites, especially Knight and other lead foundations, are looking for harder evidence of the value generated by their early grants. Millions have been poured into creating new news sites. Now they’re asking: What has our funding really done? Within that big question, Impaq.me is only one of several new attempts to demonstrably measure real impact in new ways. We’ll take a brief look at those impact initiatives below….
If Impaq.me is all about impact and money, then it’s got good company. There are at least two other noteworthy impact-measuring projects going on.

  • The Center for Investigative Reporting’s Impact Tracker initiative launched last fall. The big idea: getting beyond traditional metrics like unique visitors and pageviews to track the value of investigative and enterprise work. To that end, CIR has hired Lindsay Green-Barber, a CUNY-trained social scientist, and given her a perhaps first-ever title: media impact analyst. We can see the fruits of the work around CIR’s impressive Returning Home to Battle veterans series. On that series, CIR is tracking such impacts as change and rise in the public discourse around veterans’ issues and related allocation of government resources. The notion of good journalism intended to shine a light in dark places has been embedded in the CIR DNA for a long time; this new effort is intended to provide data — and words — to describe progress toward solutions. CIR is working with The Seattle Times on the impact of that paper’s education reporting, and CIR may soon look at more partnerships as well. Related: CIR is holding two “Dissection” events in New York and Washington in April, bringing together journalists, funders, and social scientists to widen the media impact movement.
  • Chalkbeat, a growing national education news site, is also moving on impact analysis. It’s called MORI (Measures of Our Reporting’s Influence), and it’s a WordPress plugin. Says Chalkbeat cofounder Elizabeth Green: “We built MORI to solve for a problem that I guess you could call ‘impact loss.’ We knew that our stories were having all kinds of impacts, but we had no way of keeping track of these impacts or making sense of them. That meant that we couldn’t easily compile what we had done in the last year to share with the outside world (board, donors, foundations, readers, our moms) but also — just as important — we couldn’t look back on what we’d done and learn from it.” Sound familiar?
    After much inquiry, Chalkbeat settled on technology. “Within each story’s back end,” Green said, “we can enter inputs — qualitative data about the type of story, topic, and target audience — as well as outcomes — impacts on policy and practice (what we call ‘informed action’) as well as impacts on what we call ‘civic deliberation.’”
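Green’s description suggests a simple per-story data model: qualitative inputs entered in the story’s back end, plus recorded outcomes. A hypothetical sketch follows; the field names and the example record are invented for illustration, since MORI’s actual schema isn’t described in the piece.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record mirroring the inputs/outcomes split Green describes:
# inputs are qualitative data about the story; outcomes capture
# "informed action" (impacts on policy and practice) and
# "civic deliberation" (public discussion the story sparked).
@dataclass
class StoryImpactRecord:
    story_id: str
    story_type: str             # input: type of story
    topic: str                  # input: subject area
    target_audience: str        # input: who the story is for
    informed_actions: List[str] = field(default_factory=list)
    civic_deliberation: List[str] = field(default_factory=list)

    def has_impact(self) -> bool:
        """True once any outcome has been logged against the story."""
        return bool(self.informed_actions or self.civic_deliberation)

# Invented example: log one policy outcome against a story.
record = StoryImpactRecord(
    story_id="teacher-evaluations-2014",
    story_type="investigation",
    topic="teacher evaluations",
    target_audience="policymakers",
)
record.informed_actions.append("district revised its evaluation rubric")
```

Keeping inputs and outcomes on the same record is what lets a newsroom later compile a year of impacts for boards and funders, the “impact loss” problem Green names.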