UK launches Information Economy Strategy


Open Data Institute: “The Information Economy Strategy sets out a range of key actions, including:

  • Digitally transforming 25 of the top 50 UK public services over the next 300 days, including plans to give businesses a single, online view of their tax records
  • Launching a new programme to help 1.6 million SMEs scale up their business online over the next five years.
  • Publishing a data capability strategy in October 2013, developed in partnership with government, industry and academia. The strategy will build on the recommendations in Stephan Shakespeare’s review of Public Sector Information and the Prime Minister’s Council for Science and Technology’s report on algorithms, and will be published alongside the Open Government Partnership National Action Plan.
  • Establishing the world’s first facility for testing state-of-the-art 5G mobile technology, working with industry and the University of Surrey.”

ResearchGate Tackles Social Networking for Scientists


Meredith Salisbury from Techonomy: “Social networking for scientists has been tried before, but not until recently have we seen investors placing big bets in this area. Earlier this year, the academic networking site Mendeley was acquired by scientific publisher Elsevier for somewhere in the ballpark of $70 million. And today brings a new data point: Berlin-based ResearchGate, a site designed to facilitate collaborations and data sharing among scientists around the world, has raised $35 million in a series C round from investors including Bill Gates….While social networking has upended how business happens in other industries, the centuries-old traditions of the scientific field have largely blocked this kind of change. Sure, scientists sign up for Facebook and LinkedIn like anybody else. But use a social networking tool to facilitate research, find partners, and share data that hasn’t yet been published? That’s been a tough sell in the hyper-competitive, highly specialized scientific community….But ResearchGate’s Madisch believes he is making inroads—and that his latest round of funding, along with its big-name investors, is proof of that. The site boasts 2.8 million users, and features a number of tools and capabilities designed to lure scientists. Madisch knows that scientists are unlikely to share data that could be included in a valuable peer-reviewed publication, so instead he encourages users to share data from failed experiments that will never be submitted for publication anyway.
There’s a lot less possessiveness around that data, and Madisch contends that failures are just as important as successes in helping people understand what works under certain circumstances. Another widget calculates a scientist’s reputation score based on interactions within ResearchGate; this number offers an alternative way to look at any scientist’s impact within the field beyond the current gold standard, which simply associates a person’s value with the reputation of the journals he or she gets published in.
What’s most important to Madisch, though, is the site’s ability to connect scientists around the world and allow better research to happen faster. He cites the example of a young child who died from an unknown cause in Nigeria; a local doctor sent samples to a scientist in Italy he found on ResearchGate, and together they identified a new pathogen responsible for the child’s death. Further analysis was conducted by a Dutch researcher, also found through the networking site.
For Madisch, this anecdote embodies the ResearchGate mentality: The more you collaborate, the more successful you’ll be.”
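A reputation score like the one described above can be approximated as a weighted sum over a user’s interaction counts. The sketch below is purely illustrative: the interaction types and weights are assumptions for the example, not ResearchGate’s actual formula.

```python
# Hypothetical interaction weights; the real scoring model is not public.
INTERACTION_WEIGHTS = {
    "dataset_shared": 5.0,     # e.g. sharing data from failed experiments
    "question_answered": 3.0,  # helping peers with methods questions
    "publication_added": 2.0,
    "follower_gained": 0.5,
}

def reputation_score(interactions):
    """Sum weighted interaction counts into a single score.

    Unknown interaction kinds contribute nothing.
    """
    return sum(INTERACTION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in interactions.items())

score = reputation_score(
    {"dataset_shared": 4, "question_answered": 10, "follower_gained": 20}
)
print(score)  # 4*5.0 + 10*3.0 + 20*0.5 = 60.0
```

The interesting design question is in the weights: rewarding shared data from unpublished, failed experiments is exactly the incentive Madisch describes.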

It’s Time to Rewrite the Internet to Give Us Better Privacy, and Security


Larry Lessig in The Daily Beast: “Almost 15 years ago, as I was just finishing a book about the relationship between the Net (we called it “cyberspace” then) and civil liberties, a few ideas seemed so obvious as to be banal: First, life would move to the Net. Second, the Net would change as it did so. Gone would be simple privacy, the relatively anonymous default infrastructure for unmonitored communication; in its place would be a perpetually monitored, perfectly traceable system supporting both commerce and the government. That, at least, was the future that then seemed most likely, as business raced to make commerce possible and government scrambled to protect us (or our kids) from pornographers, and then pirates, and now terrorists.

But another future was also possible, and this was my third, and only important point: Recognizing these obvious trends, we just might get smart about how code (my shorthand for the technology of the Internet) regulates us, and just possibly might begin thinking smartly about how we could embed in that code the protections that the Constitution guarantees us. Because—and here was the punchline, the single slogan that all 724 people who read that book remember—code is law. And if code is law, then we need to be as smart about how code regulates us as we are about how the law does so….
But what astonishes me is that today, more than a decade into the 21st century, the world has remained mostly oblivious to these obvious points about the relationship between law and code….
the fact is that there is technology that could be deployed that would give many the confidence that none of us now have. “Trust us” does not compute. But trust and verify, with high-quality encryption, could. And there are companies, such as Palantir, developing technologies that could give us, and more importantly, reviewing courts, a very high level of confidence that data collected or surveilled was not collected or used in an improper way. Think of it as a massive audit log, recording how and who used what data for what purpose. We could code the Net in a string of obvious ways to give us even better privacy, while also enabling better security.
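The audit-log idea Lessig sketches can be made tamper-evident with a simple hash chain: each entry records who used what data for what purpose, and commits to the hash of the previous entry, so altering any record after the fact is detectable. The following is a minimal illustrative sketch of that general technique, not a description of Palantir’s or any deployed system.

```python
import hashlib
import json

def append_entry(log, who, data_id, purpose):
    """Append an audit record chained to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    record = {"who": who, "data_id": data_id, "purpose": purpose, "prev": prev_hash}
    # Hash a canonical (sorted-keys) serialization of the record body.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    log.append(record)
    return log

def verify(log):
    """Recompute the chain; any edited or reordered entry fails."""
    prev = "0" * 64
    for rec in log:
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["prev"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
            return False
        prev = rec["hash"]
    return True

log = []
append_entry(log, "analyst-7", "call-records/2013-06", "court order (hypothetical)")
append_entry(log, "auditor-2", "call-records/2013-06", "compliance review")
print(verify(log))   # True
log[0]["purpose"] = "browsing"
print(verify(log))   # False: tampering breaks the hash chain
```

This is the “trust and verify” property in miniature: a reviewing court need not trust the agency’s word, only check that the chain still verifies.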

The Use of Data Visualization in Government


Report by Genie Stowers for The IBM Center for The Business of Government: “The purpose of this report is to help public sector managers understand one of the more important areas of data analysis today—data visualization. Data visualizations are more sophisticated, fuller graphic designs than the traditional spreadsheet charts, usually with more than two variables and, typically, incorporating interactive features. Data are here to stay, growing exponentially, and data analysis is taking off, pushed forward as a result of the convergence of:
• New technologies
• Open data and big data movements
• The drive to more effectively engage citizens
• The creation and distribution of more and more data…
This report contains numerous examples of visualizations that include geographical and health data, or population and time data, or financial data represented in both absolute and relative terms—and each communicates more than simply the data that underpin it. In addition to these many examples of visualizations, the report discusses the history of this technique, and describes tools that can be used to create visualizations from many different kinds of data sets. Government managers can use these tools—including Many Eyes, Tableau, and HighCharts—to create their own visualizations from their agency’s data.
The report presents case studies on how visualization techniques are now being used by two local governments, one state government, and three federal government agencies. Each case study discusses the audience for visualization. Understanding audience is important, as government organizations provide useful visualizations to different audiences, including the media, political oversight organizations, constituents, and internal program teams. To assist in effectively communicating to these audiences, the report details attributes of meaningful visualizations: relevance, meaning, beauty, ease of use, legibility, truthfulness, accuracy, and consistency among them.”
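As a concrete illustration of the workflow the report describes, the sketch below turns a small tabular data set into a Highcharts-style chart configuration (Highcharts is one of the tools the report names). The department names and budget figures are invented for the example; the config keys follow Highcharts’ documented options structure.

```python
import json

def highcharts_config(title, categories, series_name, values):
    """Build a Highcharts column-chart configuration from one data series."""
    return {
        "title": {"text": title},
        "xAxis": {"categories": categories},
        "yAxis": {"title": {"text": "Spending ($M)"}},
        "series": [{"type": "column", "name": series_name, "data": values}],
    }

cfg = highcharts_config(
    "City Spending by Department (illustrative data)",
    ["Police", "Fire", "Parks", "Transit"],
    "FY2013",
    [412.5, 238.1, 96.4, 187.0],
)
# The JSON below can be passed to Highcharts.chart('container', cfg) in a page.
print(json.dumps(cfg, indent=2))
```

Separating data preparation (Python, from the agency’s records) from rendering (the charting library) is the pattern most of the report’s tools share.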

Big Data Is Not Our Master. Humans create technology. Humans can control it.


Chris Hughes in New Republic: “We’ve known for a long time that big companies can stalk our every digital move and customize our every Web interaction. Our movements are tracked by credit cards, Gmail, and tollbooths, and we haven’t seemed to care all that much.
That is, until this week’s news of government eavesdropping, with the help of these very same big companies—Verizon, Facebook, and Google, among others. For the first time, America is waking up to the realities of what all this information—known in the business as “big data”—enables governments and corporations to do….
We are suddenly wondering, Can the rise of enormous data systems that enable this surveillance be stopped or controlled? Is it possible to turn back the clock?
Technologists see the rise of big data as the inevitable march of history, impossible to prevent or alter. Viktor Mayer-Schönberger and Kenneth Cukier’s recent book Big Data is emblematic of this argument: They say that we must cope with the consequences of these changes, but they never really consider the role we play in creating and supporting these technologies themselves….
But these well-meaning technological advocates have forgotten that as a society, we determine our own future and set our own standards, norms, and policy. Talking about technological advancements as if they are pre-ordained science erases the role of human autonomy and decision-making in inventing our own future. Big data is not a Leviathan that must be coped with, but a technological trend that we have made possible and support through social and political policy.”

A Citizen’s Guide to Open Government, E-Government, and Government 2.0


Inside the MPA@UNC Blog: “Engaged citizens want clear, credible information from the government about how it’s carrying on its business. They don’t want to thumb through thousands of files or wait month after month or go through the rigors of filing claims through FOIA (Freedom of Information Act). They want government information, services, and communication to be forthcoming and swift. The Open Government, Government 2.0, and E-Governance movements fill the need of connecting citizens with the government and each other to foster a more open, collaborative, and efficient public sector through the use of new technology and public data.
Open Government is defined by the OECD (Organisation for Economic Cooperation and Development) as “the transparency of government actions, the accessibility of government services and information, and the responsiveness of government to new ideas, demands and needs.”
E-Government is defined by the World Bank as “the use by government agencies of information technologies that have the ability to transform relations with citizens, businesses, and other arms of government. These technologies can serve a variety of different ends: better delivery of government services to citizens, improved interactions with business and industry, citizen empowerment through access to information, or more efficient government management. The resulting benefits can be less corruption, increased transparency, greater convenience, revenue growth, and/or cost reductions.”
Government 2.0 is defined by Gartner Research as “the use of Web 2.0 technologies, both internally and externally, to increase collaboration and transparency and potentially transform the way government agencies relate to citizens and operate.”
Open Government and E-Government paved the way for Government 2.0, a collaborative technology whose mission is to improve government transparency and efficiency. How? Gov 2.0 has been called the next generation of government because it not only utilizes new technologies such as social media, cloud computing, and other apps; it is also a means to increase citizen participation….
We have compiled a list of organizations, blogs, guides, and tools to help citizens and public service leaders better understand the Open Government, E-Government, and Government 2.0 movement….”

The "audience" as participative, idea generating, decision making citizens: will they transform government?


Paper by Teresa Harrison in the latest issue of the journal “Participations”: “From a purely technical perspective, no one, including Tim Berners-Lee, has ever been able to pinpoint exactly what makes Web 2.0 unique. What may be most accurate to say, is that the enormous popularity of social networking and other social media technologies hinges on a radical reconceptualisation of the audience, now routinely incorporated into ICT applications. Once treated as passive consumers of content created by others, designers of these applications now appreciate and exploit the fact that new media users (formerly known as the “audience”) actively create content online to serve their own goals, frequently as they interact with others. Users of Web 2.0 applications display and tinker with their identities, express themselves on all kinds of topics, invent new products and ideas, and, as Don Tapscott, Tim O’Reilly and legions of other business gurus hasten to remind us, are willing, so far at least, to lend their problem solving, creative efforts and intellectual products to businesses seeking to innovate, or just looking for free marketing. Hoping to harness this largely uncompensated labour, organisations of all types (both commercial and non-profit) have been quick to find ways to attract these “producers” to their projects.
Governments have not been the swiftest in this regard; however, they may present the most ambitious and optimistic agenda for involving internet users. Nations around the world now hope to use new media to engage their citizens in some variation of participatory governance. Where once the prospects for “town hall style democracy” were doomed by the limitations and inefficiencies of one-way media transactions, the networked interactivity of social media now makes it technically feasible to invite citizen participation on a routine basis. What is not yet clear is how citizens will react over the long term to these invitations and what kinds of social issues and software applications will best attract and immerse them into new citizenship practices.”

Don’t Reengineer. Reimagine.


Jeff Schumacher, Simon MacGibbon, and Sean Collins in Strategy + Business: “What does it mean to become digital? Companies in all industries are building online businesses, enabling new customer experiences, experimenting with “big data,” and seeking advantage in a digitally enabled business environment. They have tried reengineering their practices; they have set up new technological platforms for customer engagement and back-office efficiency. But these efforts have not yet had the impact that they should. Instead of reengineering, they need reimagining. They need to conceive of their business freshly, in line with the capabilities that digital and business technologies can give them, connecting to customers in ways that have not been possible before….
The third principle of digitization involves taking the long view, even as you build for today. You can no longer succeed with a digital strategy based only on today’s technology and competitive environment. Nor is it enough to merely ideate about future developments. Companies must take actions now that prepare them for the disruptive opportunities and evolving platforms of the next few years. What technologies might be available then? How will customers be using digital in their lives? Where will your industry be, for example, in terms of responsive use of data, digital fabrication (parts and devices made on the fly), cloud-based interoperability, or new forms of supply chain coordination? Do you have the capabilities now to make use of those technologies in creating new customer experiences? And what new capabilities will you need once those technologies become reality?…
The fourth principle recognizes that becoming digital isn’t just a matter of rearranging the lines and boxes on your org chart. It involves fostering a startup’s way of working through new structures and teams, and changing your incentives, rules, and decision rights accordingly. Just as important as these formal mechanisms are their informal counterparts—the personal networks, communities of interest, information flows, and behavioral norms—that link the people in your company who can imagine and build new digital capabilities.”

Health Datapalooza just wrapped


I’ve just finished two packed days at the Health Datapalooza, put on by the Health Data Consortium with the Department of Health and Human Services. As I’ve just heard someone say, many of the 2000 people here are a bit “Palooza’d out.” But this fourth annual event shows the growing power of open government data on health and health care services. The two-day event covered both the knowledge and applications that can come from the release of data like that on Medicare claims, and the ways in which the Affordable Care Act is driving the use of data for better delivery of high-quality care. The participation of leaders from the United Kingdom’s National Health Service added an international perspective as well.
There’s too much to summarize in a single blog post, but you can follow these links to read about the Health Data Consortium and its new CEO’s goals; the Datapalooza’s opening plenary session, with luminaries from government, business, and the New Yorker; and today’s keynote by Todd Park, with reflections on some of the new companies that open government data is supporting.
– Joel Gurin, GovLab network member and Founder and Editor, OpenDataNow.com

Complex Algorithm Auto-Writes Books, Could Transform Science


Mashable: “Could a sophisticated algorithm be the future of science? One innovative economist thinks so.
Phil Parker, who holds a doctorate in business economics from the Wharton School, has built an algorithm that auto-writes books. Now he’s taking that model and applying it to loftier goals than simply penning periodicals: namely, medicine and forensics. Working with professors and researchers at NYU, Parker is trying to decode complex genetic structures and find cures for diseases. And he’s doing it with the help of man’s real best friend: technology.
Parker’s recipe is a complex computer program that mimics formulaic writing….
Parker’s been at this for years. His formula, originally used for printing, is able to churn out entire books in minutes. It’s similar to the work being done by Narrative Science and StatSheet, except those companies are known for short-form auto-writing for newspapers. Parker’s work is much longer, focusing on obscure non-fiction and even poetry.
It’s not creative writing, though, and Parker isn’t interested in introspection, exploring emotion or storytelling. He’s interested in exploiting reproducible patterns — that’s how his algorithm can find, collect and “write” so quickly. And how he can apply that model to other disciplines, like science.
Parker’s method seems to be a success; indeed, his ICON Group International, Inc., has auto-written so many books that Parker has lost count. But this isn’t the holy grail of literature, he insists. Instead, he says, his work is a play on mechanizing processes to create a simple formula. And he thinks that “finding new knowledge structures within data” stretches far beyond print.”
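The formulaic technique the article describes, fixed prose scaffolding filled in from structured data, can be sketched in a few lines. This is a toy reconstruction of template-driven writing in general, not Parker’s actual system; the template wording and market figures are invented for the example.

```python
# A fixed prose template with slots to be filled from a structured record.
TEMPLATE = (
    "The world market for {product} is estimated at {size}. "
    "The three largest importers are {top3}, which together account "
    "for {share} of global demand."
)

def write_report(record):
    """Fill the template's slots from one row of structured data."""
    return TEMPLATE.format(
        product=record["product"],
        size=record["size"],
        top3=", ".join(record["importers"][:3]),
        share=record["share"],
    )

row = {
    "product": "industrial fasteners",      # invented example data
    "size": "$4.1 billion",
    "importers": ["Germany", "China", "United States", "Japan"],
    "share": "38%",
}
print(write_report(row))
```

Run against thousands of rows, a scheme like this “writes” thousands of reports; the reproducible pattern, not the prose, is doing the work, which is why the same move can transfer to other pattern-rich domains.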