New Book: Untangling the Web


By Aleks Krotoski: “The World Wide Web is the most revolutionary innovation of our time. In the last decade, it has utterly transformed our lives. But what real effects is it having on our social world? What does it mean to be a modern family when dinner table conversations take place over smartphones? What happens to privacy when we readily share our personal lives with friends and corporations? Are our Facebook updates and Twitterings inspiring revolution or are they just a symptom of our global narcissism? What counts as celebrity, when everyone can have a following or be a paparazzo? And what happens to relationships when love, sex and hate can be mediated by a computer? Social psychologist Aleks Krotoski has spent a decade untangling the effects of the Web on how we work, live and play. In this groundbreaking book, she uncovers how much humanity has – and hasn’t – changed because of our increasingly co-dependent relationship with the computer. In Untangling the Web, she tells the story of how the network became woven in our lives, and what it means to be alive in the age of the Internet.” Blog: http://untanglingtheweb.tumblr.com/

Orwell is drowning in data: the volume problem


Dom Shaw in OpenDemocracy: “During World War II, whilst Bletchley Park laboured in the front line of code breaking, the British Government was employing vast numbers of female operatives to monitor and report on telephone, mail and telegraph communications in and out of the country.
The biggest problem, of course, was volume. Without even the most primitive algorithm to detect key phrases that later were to cause such paranoia amongst the sixties and seventies counterculture, causing a whole generation of drug users to use a wholly unnecessary set of telephone synonyms for their desired substance, the army of women stationed in exchanges around the country was driven to report everything and then pass it on up to those whose job it was to analyse such content for significance.
Orwell’s vision of Big Brother’s omniscience was based upon the same model – vast armies of Winston Smiths monitoring data to ensure discipline and control. He saw a culture of betrayal where every citizen was held accountable for their fellow citizens’ political and moral conformity.
Up until the US Government’s Big Data Research and Development Initiative [12] and the NSA development of the Prism programme [13], the fault lines always lay in the technology used to collate or collect and the inefficiency or competing interests of the corporate systems and processes that interpreted the information. Not for the first time, the bureaucracy was the citizen’s best bulwark against intrusion.
Now that the algorithms have become more complex and the technology tilted towards passive surveillance through automation, the volume problem becomes less of an obstacle….
The technology for obtaining this information, and indeed the administration of it, is handled by corporations. The Government, driven by the creed that suggests private companies are better administrators than civil servants, has auctioned off the job to a dozen or more favoured corporate giants who are, as always, beholden not only to their shareholders, but to their patrons within the government itself….
The only problem the state had was managing the scale of the information gleaned from so many people in so many forms. Not any more. The volume problem has been overcome.”
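Shaw’s “most primitive algorithm to detect key phrases” is, in modern terms, a few lines of code. The sketch below is illustrative only (the watchlist and intercepts are invented), but it shows why automation dissolves the volume problem: humans need only review what the filter flags.

    # Illustrative only: the kind of trivial key-phrase filter Shaw's
    # wartime monitors lacked. Watchlist and intercepts are invented.
    KEY_PHRASES = {"rendezvous", "shipment", "package"}

    def flag(messages):
        """Yield only the messages that contain a watched phrase."""
        for msg in messages:
            text = msg.lower()
            if any(phrase in text for phrase in KEY_PHRASES):
                yield msg

    intercepts = [
        "Weather fine, love to mother",
        "The shipment arrives on Tuesday",
        "Rendezvous at the usual place",
    ]
    print(list(flag(intercepts)))  # two of three flagged for human review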

Rulemaking 2.0: Understanding and Getting Better Public Participation


New Report from Cynthia Farina and Mary Newhart for the IBM Center for The Business of Government: “This report provides important insights into how governments can improve the rulemaking process by taking full advantage of Rulemaking 2.0 technology. The report’s findings and recommendations are based on five experiments with Rulemaking 2.0 conducted by CeRI researchers, four in partnership with the Department of Transportation and one with the Consumer Financial Protection Bureau. While geared specifically to achieving better public participation in rulemaking, the concepts, findings, and recommendations contained in the report are applicable to all government agencies interested in enhancing public participation in a variety of processes. The report offers advice on how government organizations can increase both the quantity and quality of public participation from specific groups of citizens, including missing stakeholders, unaffiliated experts, and the general public. The report describes three barriers to effective participation in rulemaking: lack of awareness, low participation literacy, and information overload. While the report focuses on rulemaking, these barriers also hinder public participation in other arenas. The report offers three strategies to overcome such barriers:

  • Outreach to alert and engage potential new participants
  • Converting newcomers into effective commenters
  • Making substantive rulemaking information accessible”

Create a Crowd Competition That Works


Ahmad Ashkar in HBR Blog Network: “It’s no secret that people in business are turning to the crowd to solve their toughest challenges. Well-known sites like Kickstarter and Indiegogo allow people to raise money for new projects. Design platforms like Crowdspring and 99designs give people the tools needed to crowdsource graphic design ideas and feedback.
At the Hult Prize — a start-up accelerator that challenges Millennials to develop innovative social enterprises to solve our world’s most pressing issues (and rewards the top team with $1,000,000 in start-up capital) — we’ve learned that the crowd can also offer an unorthodox solution in developing innovative and disruptive ideas, particularly ones focused on tackling complex, large-scale social issues.
But to effectively harness the power of the crowd, you have to engage it carefully. Over the past four years, we’ve developed a well-defined set of principles that guide our annual “challenge” (lauded by Bill Clinton in TIME magazine as one of the top five initiatives changing the world for the better), which produces original and actionable ideas to solve social issues.
Companies like Netflix, General Electric, and Procter & Gamble have also started “challenging the crowd” and employing many of these principles to tackle their own business roadblocks. If you’re looking to spark disruptive and powerful ideas that benefit your company, follow these guidelines to launch an engaging competition:
1. Define the boundaries
2. Identify a specific and bold stretch target. …
3. Insist on low barriers to entry. …
4. Encourage teams and networks. …
5. Provide a toolkit. Once interested parties become participants in your challenge, provide tools to set them up for success. If you are working on a social problem, you can use IDEO’s human-centered design toolkit. If you have a private-sector challenge, consider posting it on an existing innovation platform. As an organizer, you don’t have to spend time recreating the wheel — use one of the many existing platforms and borrow materials from those willing to share.”

Code for America: Announcing the 2013 Accelerator Class


Press Release: “Code for America opened applications for the 2013 Accelerator knowing that the competition would be fierce. This year we received over 190 applications from amazing candidates. Today, we’re pleased to announce the five teams chosen to participate in the 2013 Accelerator.

The teams are articulate, knowledgeable, and passionate about their businesses. They come from all over the country — Texas, North Carolina, Florida, and California — and we’re excited to get started with them. Teams include:

ArchiveSocial enables organizations to embrace social media by minimizing risk and eliminating compliance barriers. Specifically, it solves the challenge of retaining Gov 2.0 communications for compliance with FOIA and other public records laws. It currently automates business-grade record keeping of communications on networks such as Facebook, Twitter, and YouTube. Moving forward, ArchiveSocial will help further enforce social media policy and protect the organizational brand.

The Family Assessment Form (FAF) Web is a tool designed by social workers, researchers, and technology experts to help family support practitioners improve family functioning, service planning for families, and organizational performance. The FAF is ideal for use in organizations performing home visitation services for families that address comprehensive concerns about family well-being and child welfare. FAF Web enables all stakeholders to access essential data remotely from any internet-enabled device.

OpenCounter helps entrepreneurs to register their businesses with the local government. It does so through an online check-out experience that adapts to the applicant’s answers and asks for pertinent information only once. OpenCounter estimates licensing time and costs so entrepreneurs can understand what it will take to get their business off the ground. It’s the TurboTax of business permitting.

SmartProcure is an online information service that provides access to local, state, and federal government procurement data, with two public-interest goals: 1. Enable government agencies to make more efficient procurement decisions and save taxpayer dollars. 2. Empower businesses to sell more effectively and competitively to government agencies. The proprietary system provides access to data from more than 50 million purchase orders issued by 1,700 government agencies.

StreetCred Software helps police agencies manage their arrest warrants, eliminate warrant backlogs, and radically improve efficiency while increasing officer safety. It helps agencies understand their fugitive population, measure effectiveness, and make improvements. StreetCred Software, Inc., was founded by two Texas police officers. One is an 18-year veteran investigator and fugitive hunter, the other a technology industry veteran who became a cop in 2010.”

Information Consumerism – The Price of Hypocrisy


Evgeny Morozov in Frankfurter Allgemeine: “What we need is a sharper, starker picture of the information apocalypse that awaits us in a world where personal data is traded like coffee or any other commodity. Take the oft-repeated argument about the benefits of trading one’s data in exchange for some tangible commercial benefit. Say, for example, you install a sensor in your car to prove to your insurance company that you are driving much safer than the average driver who figures in their model for pricing insurance policies. Great: if you are better than the average, you get to pay less. But the problem with averages is that half of the population is always worse than the benchmark. Inevitably – regardless of whether they want to monitor themselves or not – that other half will be forced to pay more, for as the more successful of us take on self-tracking, most social institutions would (quite logically) assume that those who refuse to self-track have something to hide. Under this model, the implications of my decision to trade my personal data are no longer solely in the realm of markets and economics – they are also in the realm of ethics. If my decision to share my personal data for a quick buck makes someone else worse off and deprives them of opportunities, then I have an extra ethical factor to consider – economics alone doesn’t suffice.
All of this is to say that there are profound political and moral consequences to information consumerism – and they are comparable to energy consumerism in scope and importance. Making these consequences more pronounced and vivid is where intellectuals and political parties ought to focus their efforts. We should do our best to suspend the seeming economic normalcy of information sharing. An attitude of “just business!” will no longer suffice. Information sharing might have a vibrant market around it but it has no ethical framework to back it up. More than three decades ago, Michel Foucault was prescient to see that neoliberalism would turn us all into “entrepreneurs of the self”, but let’s not forget that entrepreneurship is not without its downsides: like most economic activities, it can generate negative externalities, from pollution to noise. Entrepreneurship focused on information sharing is no exception….”
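Morozov’s arithmetic can be made concrete with a toy simulation. Everything here (pool size, uniform risk scores, the pricing rule, who opts in) is an assumption for illustration; the essay contains no such model.

    # Toy model of Morozov's averages argument. Every number and rule
    # here is an invented assumption, not anything from the essay.
    import random

    random.seed(42)

    # 1,000 drivers with risk scores in [0, 1]; lower means safer. Assume
    # expected claim cost (and hence a fair premium) scales with risk.
    drivers = [random.uniform(0.0, 1.0) for _ in range(1000)]
    pool_avg = sum(drivers) / len(drivers)
    pooled_premium = 1000 * pool_avg  # everyone pays this before tracking

    # Drivers safer than the pooled average install sensors and are priced
    # individually; the rest stay in the shared pool.
    untracked = [risk for risk in drivers if risk >= pool_avg]
    new_pooled_premium = 1000 * (sum(untracked) / len(untracked))

    print(f"pooled premium before self-tracking: {pooled_premium:.0f}")
    print(f"pooled premium for those left after: {new_pooled_premium:.0f}")

No individual’s driving changes, yet the premium for those who stay untracked rises. That is the externality Morozov argues economics alone cannot price.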

9 models to scale open data – past, present and future


Open Knowledge Foundation Blog: “The possibilities of open data have been enthralling us for 10 years… But that excitement isn’t what matters in the end. What matters is scale – which organisational structures will make this movement explode? This post quickly and provocatively goes through some that haven’t worked (yet!) and some that have.
Ones that are working now
1) Form a community to enter new data. OpenStreetMap and MusicBrainz are two big examples. It works because the community is the originator of the data. That said, neither has dominated its industry as much as I thought it would have by now.
2) Sell tools to an upstream generator of open data. This is what CKAN does for central Governments (and the new ScraperWiki CKAN tool helps with). It’s what mySociety does, when selling FixMyStreet installs to local councils, thereby publishing their potholes as RSS feeds.
3) Use open data (quietly). Every organisation does this and never talks about it. It’s key to quite old data resellers like Bloomberg. It is what most of ScraperWiki’s professional services customers ask us to do. The value to society is enormous and invisible. The big flaw is that it doesn’t help scale supply of open data.
4) Sell tools to downstream users. This isn’t necessarily open data specific – existing software like spreadsheets and Business Intelligence can be used with open or closed data. Lots of open data is on the web, so tools like the new ScraperWiki which work well with web data are particularly suited to it.
Ones that haven’t worked
5) Collaborative curation. ScraperWiki started as an audacious attempt to create an open data curation community, based on editing scraping code in a wiki. In its original form (now called ScraperWiki Classic) this didn’t scale. … With a few exceptions, notably OpenCorporates, there aren’t yet open data curation projects.
6) General purpose data marketplaces, particularly ones that are mainly reusing open data, haven’t taken off. They might do one day, however I think they need well-adopted higher level standards for data formatting and syncing first (perhaps something like dat, perhaps something based on CSV files).
Ones I expect more of in the future
These are quite exciting models which I expect to see a lot more of.
7) Give labour/money to upstream to help them create better data. This is quite new. The only, and most excellent, example of it is the UK’s National Archive curating the Statute Law Database. They do the work with the help of staff seconded from commercial legal publishers and other parts of Government.
It’s clever because it generates money for upstream, which people trust the most, and which has the most ability to improve data quality.
8) Viral open data licensing. MySQL made lots of money this way, offering proprietary dual licenses of GPL’d software to embedded systems makers. In data this could use OKFN’s Open Database License, and organisations would pay when they wanted to mix the open data with their own closed data. I don’t know anyone actively using it, although Chris Taggart from OpenCorporates mentioned this model to me years ago.
9) Corporations release data for strategic advantage. Companies are starting to release their own data for strategic gain. This is very new. Expect more of it.”
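Models 3 and 4 above are easy to picture in code. Below is a minimal sketch of quiet downstream reuse: fetching a published CSV and aggregating it with stock tooling. The URL and column name are placeholders, not a real dataset.

    # Minimal sketch of "using open data quietly": fetch a published CSV
    # and aggregate it with the standard library. The URL and the column
    # name are placeholders, not a real dataset.
    import csv
    import io
    import urllib.request

    URL = "https://example.gov/open-data/potholes.csv"  # hypothetical

    with urllib.request.urlopen(URL) as resp:
        rows = list(csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8")))

    # Count reports per neighbourhood: ordinary downstream analysis, the
    # invisible reuse the post describes.
    counts = {}
    for row in rows:
        area = row["neighbourhood"]  # placeholder column name
        counts[area] = counts.get(area, 0) + 1

    for area, n in sorted(counts.items(), key=lambda item: -item[1]):
        print(f"{area}: {n}")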

Understanding Smart Data Disclosure Policy Success: The Case of Green Button


New Paper by Djoko Sigit Sayogo and Theresa Pardo: “Open data policies are expected to promote innovations that stimulate social, political and economic change. In pursuit of innovation potential, open data has expanded to a wider environment involving government, business and citizens. The US government recently launched such a collaboration through a smart data policy supporting energy efficiency called Green Button. This paper explores the implementation of Green Button and identifies motivations and success factors facilitating successful collaboration between public and private organizations to support smart disclosure policy. Analyzing qualitative data from semi-structured interviews with experts involved in Green Button initiation and implementation, this paper presents some key findings. The success of Green Button can be attributed to the interaction between internal and external factors. The external factors consist of both market and non-market drivers: economic factors, technology related factors, regulatory contexts and policy incentives, and some factors that stimulate imitative behavior among the adopters. The external factors create the necessary institutional environment for the Green Button implementation. On the other hand, the acceptance and adoption of Green Button itself is influenced by the fit of Green Button capability to the strategic mission of energy and utility companies in providing energy efficiency programs. We also identify the different roles of government during the different stages of Green Button implementation.”
[Recipient of Best Management/Policy Paper Award, dgo2013]

Open Data Tools: Turning Data into ‘Actionable Intelligence’


Shannon Bohle in SciLogs: “My previous two articles were on open access and open data. They conveyed major changes that are underway around the globe in the methods by which scientific and medical research findings and data sets are circulated among researchers and disseminated to the public. I showed how E-science and ‘big data’ fit into the philosophy of science through a paradigm shift as a trilogy of approaches: deductive, empirical, and computational, which, it was pointed out, provides a logical extension of Robert Boyle’s tradition of scientific inquiry involving “skepticism, transparency, and reproducibility for independent verification” to the computational age…
This third article on open access and open data evaluates new and suggested tools when it comes to making the most of the open access and open data OSTP mandates. According to an article published in The Harvard Business Review’s “HBR Blog Network,” this is because, as its title suggests, “open data has little value if people can’t use it.” Indeed, “the goal is for this data to become actionable intelligence: a launchpad for investigation, analysis, triangulation, and improved decision making at all levels.” Librarians and archivists have key roles to play in not only storing data, but packaging it for proper accessibility and use, including adding descriptive metadata and linking to existing tools or designing new ones for their users. Later, in a comment following the article, the author, Craig Hammer, remarks on the importance of archivists and international standards: “Certified archivists have always been important, but their skillset is crucially in demand now, as more and more data are becoming available. Accessibility—in the knowledge management sense—must be on par with digestibility / ‘data literacy’ as priorities for continuing open data ecosystem development. The good news is that several governments and multilaterals (in consultation with data scientists and – yep! – certified archivists) are having continuing ‘shared metadata’ conversations, toward the possible development of harmonized data standards… If these folks get this right, there’s a real shot of (eventual proliferation of) interoperability (i.e. a data platform from Country A can ‘talk to’ a data platform from Country B), which is the only way any of this will make sense at the macro level.”
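The packaging role Hammer and Bohle describe can be sketched concretely. The record below uses Dublin Core-style descriptive fields; the field choices and values are illustrative assumptions, not a mandated standard.

    # Sketch of attaching descriptive metadata to a dataset record, using
    # Dublin Core-style fields. All field choices and values are
    # illustrative only.
    import json

    record = {
        "identifier": "doi:10.0000/example",  # hypothetical identifier
        "title": "City pothole reports, 2012-2013",
        "creator": "Example City Public Works",
        "subject": ["transportation", "infrastructure", "open data"],
        "description": "Geocoded pothole reports collected via a 311 line.",
        "format": "text/csv",
        "license": "http://opendatacommons.org/licenses/odbl/",
        "date": "2013-07-01",
    }

    # Serialised metadata travels with the data file so downstream users
    # and other platforms can discover and interpret it.
    print(json.dumps(record, indent=2))

Shared, machine-readable metadata of this kind is what would let one country’s data platform ‘talk to’ another’s.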

Participatory Democracy in the New Millennium


New literature review in Contemporary Sociology by Francesca Polletta: “By the 1980s, experiments in participatory democracy seemed to have been relegated by scholars to the category of quixotic exercises in idealism, undertaken by committed (and often aging) activists who were unconcerned with political effectiveness or economic efficiency. Today, bottom-up decision making seems all the rage. Crowdsourcing and Open Source, flat management in business, horizontalism in protest politics, collaborative governance in policymaking—these are the buzzwords now and they are all about the virtues of nonhierarchical and participatory decision making.

What accounts for this new enthusiasm for radical democracy? Is it warranted? Are champions of this form understanding key terms like equality and consensus differently than did radical democrats in the 1960s and 70s? And is there any reason to believe that today’s radical democrats are better equipped than their forebears to avoid the old dangers of endless meetings and rule by friendship cliques? In this admittedly selective review, I will take up recent books on participatory democracy in social movements, non- and for-profit organizations, local governments, and electoral campaigning. These are perhaps not the most influential books on participatory democracy since 2000—after all, most of them are brand new—but they speak interestingly to the state of participatory democracy today. Taken together, they suggest that, on one hand, innovations in technology and in activism have made democratic decision making both easier and fairer. On the other hand, the popularity of radical democracy may be diluting its force. If radical democracy comes to mean simply public participation, then spectacles of participation may be made to stand in for mechanisms of democratic accountability.”