Report “Big and open data in Europe: A growth engine or a missed opportunity?”


Press Release: “Big data and open data are not just trendy issues, they are the concern of government institutions at the highest level. On January 29th, 2014 a Conference concerning Big & Open Data in Europe 2020 was held in the European Parliament.
Questions like these were asked and discussed: Is Big & Open Data a truly transformative phenomenon, or just hot air? Does it matter for Europe? How big is the economic potential of Big and Open Data for Europe through 2020? How might each of the 28 Member States benefit from it?…
The conference complemented a research project by demosEUROPA – Centre for European Strategy on Big and Open Data in Europe that aims at fostering and facilitating policy debate on the socioeconomic impact of data. The key outcome of the project, a pan-European macroeconomic study titled “Big and open data in Europe: A growth engine or a missed opportunity?”, carried out by the Warsaw Institute for Economic Studies (WISE), was presented.
We have the pleasure to be one of the first to present some of the findings of the report and offer the report for download.
The report analyses how these technologies have the potential to influence various aspects of European society, their substantial long-term impact on our wealth and quality of life, and the new developmental challenges they pose for the EU as a whole – as well as for its member states and their regions.
You will learn from the report:
– the economic gains resulting from business applications of big data
– how to structure big data to move from Big Trouble to Big Value
– the costs and benefits, for data holders, of opening data
– 3 challenges that Europeans face with respect to big and open data
– key areas, growth opportunities and challenges for big and open data in particular European regions.
The study also elaborates on the key principle of open data philosophy, which is open by default.
Europe by 2020. What will happen?
The report contains a prognosis for the 28 EU countries of the impact of big and open data by 2020 – the additional output it will generate and how it will affect trade, health, manufacturing, information and communication, finance & insurance and public administration in different regions. It foresees that the EU economy will grow by an additional 1.9% by 2020 thanks to big and open data and describes the increase in the general GDP level by country and sector.
One of the many interesting findings of the report is that the positive impact of the data revolution will be felt more acutely in Northern Europe, while most of the New Member States and Southern European economies will benefit significantly less, with two notable exceptions being the Czech Republic and Poland. If you would like to have first-hand up-to-date information about the impact of big and open data on the future of Europe – download the report.”

The Age of ‘Infopolitics’


Colin Koopman in the New York Times: “We are in the midst of a flood of alarming revelations about information sweeps conducted by government agencies and private corporations concerning the activities and habits of ordinary Americans. After the initial alarm that accompanies every leak and news report, many of us retreat to the status quo, quieting ourselves with the thought that these new surveillance strategies are not all that sinister, especially if, as we like to say, we have nothing to hide.
One reason for our complacency is that we lack the intellectual framework to grasp the new kinds of political injustices characteristic of today’s information society. Everyone understands what is wrong with a government’s depriving its citizens of freedom of assembly or liberty of conscience. Everyone (or most everyone) understands the injustice of government-sanctioned racial profiling or policies that produce economic inequality along color lines. But though nearly all of us have a vague sense that something is wrong with the new regimes of data surveillance, it is difficult for us to specify exactly what is happening and why it raises serious concern, let alone what we might do about it.
Our confusion is a sign that we need a new way of thinking about our informational milieu. What we need is a concept of infopolitics that would help us understand the increasingly dense ties between politics and information. Infopolitics encompasses not only traditional state surveillance and data surveillance, but also “data analytics” (the techniques that enable marketers at companies like Target to detect, for instance, if you are pregnant), digital rights movements (promoted by organizations like the Electronic Frontier Foundation), online-only crypto-currencies (like Bitcoin or Litecoin), algorithmic finance (like automated micro-trading) and digital property disputes (from peer-to-peer file sharing to property claims in the virtual world of Second Life). These are only the tip of an enormous iceberg that is drifting we know not where.
Surveying this iceberg is crucial because atop it sits a new kind of person: the informational person. Politically and culturally, we are increasingly defined through an array of information architectures: highly designed environments of data, like our social media profiles, into which we often have to squeeze ourselves. The same is true of identity documents like your passport and individualizing dossiers like your college transcripts. Such architectures capture, code, sort, fasten and analyze a dizzying number of details about us. Our minds are represented by psychological evaluations, education records, credit scores. Our bodies are characterized via medical dossiers, fitness and nutrition tracking regimens, airport security apparatuses. We have become what the privacy theorist Daniel Solove calls “digital persons.” As such we are subject to infopolitics (or what the philosopher Grégoire Chamayou calls “datapower,” the political theorist Davide Panagia “datapolitik” and the pioneering thinker Donna Haraway “informatics of domination”).
Today’s informational person is the culmination of developments stretching back to the late 19th century. It was in those decades that a number of early technologies of informational identity were first assembled. Fingerprinting was implemented in colonial India, then imported to Britain, then exported worldwide. Anthropometry — the measurement of persons to produce identifying records — was developed in France in order to identify recidivists. The registration of births, which has since become profoundly important for initiating identification claims, became standardized in many countries, with Massachusetts pioneering the way in the United States before a census initiative in 1900 led to national standardization. In the same era, bureaucrats visiting rural districts complained that they could not identify individuals whose names changed from context to context, which led to initiatives to universalize standard names. Once fingerprints, biometrics, birth certificates and standardized names were operational, it became possible to implement an international passport system, a social security number and all other manner of paperwork that tells us who someone is. When all that paper ultimately went digital, the reams of data about us became radically more accessible and subject to manipulation, which has made us even more informational.
We like to think of ourselves as somehow apart from all this information. We are real — the information is merely about us. But what is it that is real? What would be left of you if someone took away all your numbers, cards, accounts, dossiers and other informational prostheses? Information is not just about you — it also constitutes who you are….”

The Moneyball Effect: How smart data is transforming criminal justice, healthcare, music, and even government spending


TED: “When Anne Milgram became the Attorney General of New Jersey in 2007, she was stunned to find out just how little data was available on who was being arrested, who was being charged, who was serving time in jails and prisons, and who was being released. “It turns out that most big criminal justice agencies like my own didn’t track the things that matter,” she says in today’s talk, filmed at TED@BCG. “We didn’t share data, or use analytics, to make better decisions and reduce crime.”
Milgram’s idea for how to change this: “I wanted to moneyball criminal justice.”
Moneyball, of course, is the name of a 2011 movie starring Brad Pitt and the book it’s based on, written by Michael Lewis in 2003. The term refers to a practice adopted by the Oakland A’s general manager Billy Beane in 2002 — the organization began basing decisions not on star power or scout instinct, but on statistical analysis of measurable factors like on-base and slugging percentages. This worked exceptionally well. On a tiny budget, the Oakland A’s made it to the playoffs in 2002 and 2003, and — since then — nine other major league teams have hired sabermetric analysts to crunch these types of numbers.
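As a rough illustration of the kind of “measurable factors” sabermetrics is built on, the snippet below computes the two statistics named above. The formulas are the standard definitions of on-base percentage and slugging percentage; the season line fed into them is hypothetical.

```python
# Standard sabermetric rate statistics, as referenced above.
# OBP = (H + BB + HBP) / (AB + BB + HBP + SF)
# SLG = total bases / AB

def on_base_pct(hits, walks, hbp, at_bats, sac_flies):
    """On-base percentage: how often a batter reaches base."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

def slugging_pct(singles, doubles, triples, homers, at_bats):
    """Slugging percentage: total bases per at-bat, weighting extra-base hits."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
    return total_bases / at_bats

# A hypothetical season line, purely for illustration:
obp = on_base_pct(hits=150, walks=70, hbp=5, at_bats=500, sac_flies=5)
slg = slugging_pct(singles=100, doubles=30, triples=5, homers=15, at_bats=500)
print(round(obp, 3), round(slg, 3))  # → 0.388 0.47
```

Beane’s bet was that on-base and power measures like these predicted run production better than traditional scouting favorites such as batting average.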
Milgram is working hard to bring smart statistics to criminal justice. To hear the results she’s seen so far, watch this talk. And below, take a look at a few surprising sectors that are getting the moneyball treatment as well.

Moneyballing music. Last year, Forbes magazine profiled the firm Next Big Sound, a company using statistical analysis to predict how musicians will perform in the market. The idea is that — rather than relying on the instincts of A&R reps — past performance on Pandora, Spotify, Facebook, etc can be used to predict future potential. The article reads, “For example, the company has found that musicians who gain 20,000 to 50,000 Facebook fans in one month are four times more likely to eventually reach 1 million. With data like that, Next Big Sound promises to predict album sales within 20% accuracy for 85% of artists, giving labels a clearer idea of return on investment.”
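The Forbes finding quoted above amounts to a simple screening rule, which could be sketched like this. Everything here is a hypothetical illustration: the function name, the assumed baseline rate, and the single-feature structure are mine, not Next Big Sound’s actual model, which presumably combines many signals across platforms.

```python
# Hypothetical sketch of the screening rule quoted above: artists gaining
# 20,000-50,000 Facebook fans in a month are reported to be four times
# likelier to eventually reach 1 million fans.

BASE_RATE = 0.02             # assumed baseline chance of reaching 1M fans
HIGH_GROWTH_MULTIPLIER = 4   # the "four times more likely" finding

def breakout_likelihood(monthly_fan_gain):
    """Rough probability of eventually reaching 1M fans, given one month's growth."""
    if 20_000 <= monthly_fan_gain <= 50_000:
        return BASE_RATE * HIGH_GROWTH_MULTIPLIER
    return BASE_RATE

print(breakout_likelihood(35_000))  # in the high-growth band
print(breakout_likelihood(5_000))
```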
Moneyballing human resources. In November, The Atlantic took a look at the practice of “people analytics” and how it’s affecting employers. (Billy Beane had something to do with this idea — in 2012, he gave a presentation at the TLNT Transform Conference called “The Moneyball Approach to Talent Management.”) The article describes how Bloomberg reportedly logs its employees’ keystrokes and the casino, Harrah’s, tracks employee smiles. It also describes where this trend could be going — for example, how a video game called Wasabi Waiter could be used by employers to judge potential employees’ ability to take action, solve problems and follow through on projects. The article looks at the ways these types of practices are disconcerting, but also how they could level an inherently unequal playing field. After all, the article points out that gender, race, age and even height biases have been demonstrated again and again in our current hiring landscape.
Moneyballing healthcare. Many have wondered: what about a moneyball approach to medicine? (See this call out via Common Health, this piece in Wharton Magazine or this op-ed on The Huffington Post from the President of the New York State Health Foundation.) In his TED Talk, “What doctors can learn from each other,” Stefan Larsson proposed an idea that feels like something of an answer to this question. In the talk, Larsson gives a taste of what can happen when doctors and hospitals measure their outcomes and share this data with each other: they are able to see which techniques are proving the most effective for patients and make adjustments. (Watch the talk for a simple way surgeons can make hip surgery more effective.) He imagines a continuous learning process for doctors — that could transform the healthcare industry to give better outcomes while also reducing cost.
Moneyballing government. This summer, John Bridgeland (the director of the White House Domestic Policy Council under President George W. Bush) and Peter Orszag (the director of the Office of Management and Budget in Barack Obama’s first term) teamed up to pen a provocative piece for The Atlantic called, “Can government play moneyball?” In it, the two write, “Based on our rough calculations, less than $1 out of every $100 of government spending is backed by even the most basic evidence that the money is being spent wisely.” The two explain how, for example, there are 339 federally-funded programs for at-risk youth, the grand majority of which haven’t been evaluated for effectiveness. And while many of these programs might show great results, some that have been evaluated show troubling results. (For example, Scared Straight has been shown to increase criminal behavior.) Yet, some of these ineffective programs continue because a powerful politician champions them. While Bridgeland and Orszag show why Washington is so averse to making data-based appropriation decisions, the two also see the ship beginning to turn around. They applaud the Obama administration for a 2014 budget with an “unprecedented focus on evidence and results.” The pair also gave a nod to the nonprofit Results for America, which advocates that for every $99 spent on a program, $1 be spent on evaluating it. The pair even suggest a “Moneyball Index” to encourage politicians not to support programs that don’t show results.
In any industry, figuring out what to measure, how to measure it and how to apply the information gleaned from those measurements is a challenge. Which of the applications of statistical analysis has you the most excited? And which has you the most terrified?”

Innovation in the Government Industry


in Huffington Post: “Government may be susceptible to the same forces that are currently changing many major industries. Software is eating government, too. Therefore government must use customer development to better serve customers, or else it risks becoming the next Blockbuster, Borders, or what the large publishing and financial services companies are at risk of becoming…
Government is currently one size fits all. In a free market, there is unbundling, with multiple offerings for different segments of a market. For example, there’s Natural Light, Budweiser, and Guinness. Competition forces companies to serve customers because if customers don’t like one offering they will simply choose a different one. If you don’t like your laundromat, restaurant, or job, you can simply go somewhere else. In contrast, switching governments is really hard.
Why Now
Government has been able to go a very long time without significant innovation. However now is the time for government to begin adapting because the forces changing nearly every industry may do the same to government. I will reiterate a few themes that Fred Wilson cited in a talk at LeWeb while talking about several different industries and add some more thoughts.
1. Organization: Technology driven networks replacing bureaucratic hierarchies
Bureaucratic hierarchies involve chains of command with lower levels of management making more detailed decisions and reporting back to higher levels of management. These systems often entail long communication lags, high costs, and principal/agent problems.
Technology driven networks are providing more efficient systems for organization and communication. For example, Amazon has changed the publishing industry by enabling anyone to publish content and enabling customers to decide what they want. Twitter has created a network around communication and news, enabling anyone who people want to hear to be heard.
2. Competition: Unbundling of product and service offerings
Technology advancements have made it cheaper and easier than ever before to produce a product and bring it to market. One result is that it’s become easier for an entrepreneur to take one piece of a larger bundle and provide it as a standalone offering. This gives customers the option to buy what they want without having to pay more for stuff they don’t want. In addition, the offerings can be improved because producers are completely focused on that specific offering. For example, we used to buy one newspaper and get world, local, sports, etc. Now it all comes from different sources.
Bundling exists because it was more efficient than attempting to contract in the market for every tiny service. However some of the technology driven networks (as described above) are helping markets become more efficient and giving customers more customizable buying options. For example, you can buy a half hour of education, or borrow money from a peer.
We’re starting to see some of government’s offerings begin to be unbundled. For example, Uber and Hyperloop are providing transportation. A neighborhood in Oakland crowdfunded private security.
3. Finance: Lower payment transaction fees and crowdfunding
Innovation in payments, including Bitcoin, has made it cheaper and easier than ever to transfer money. It’s as easy as sending an email, clicking a hyperlink, or scanning a QR code. In addition, Bitcoin is not controlled by any regulators or intermediaries like the government, credit card companies, or even PayPal.
Crowdfunding enables the collective efforts of individuals to connect and pool their money to back initiatives, make purchases, or fund new projects. A school in Houston crowdfunded some exercise equipment instead of using government funding.
4. Communication: We are all connected and graphed
Mobile devices have become nearly as powerful as desktops or laptops. There are many things we can do with our phone that we can’t do on our desktop/laptop. For example, smartphones have sensors, are location aware, can be carried with us at all times, and are cheaper than desktops or laptops. These factors have led to mass adoption of mobile devices across the world, including in countries with high poverty where people could not previously afford a desktop or laptop. Mobile is making innovative offerings like Uber and mobile payments possible.
Platforms like Facebook and Twitter provide everyone with access to millions of people. In addition, companies like Klout and Quora are measuring our reputation and social graph, improving our ability to transact with each other. For example, when market participants trust one another (through the vehicle of a reputation system), many transactions that wouldn’t otherwise happen can now happen. This is illustrated by the rise in popularity of collaborative consumption platforms and peer-to-peer marketplaces.
Serving Customers
The current government duopoly inhibits us from selecting the government that we want as well as from receiving the best possible service because of lack of incentive. However the technologies described above are making it possible to get services previously provided by the government through more efficient and effective means. They’re enabling a more free market for government services….
If government were to take the customer development route, it could try things like unbundling (see above) so that people could opt for the specific solutions they desire. Given the US government’s current balance sheet, it may actually need to start relying on other providers.
It could also rely more on “economic feedback” to inform its actions. Currently, economic feedback is given through voting. Most people vote once every two or four years and then hope they get what they “paid” for. Can you imagine paying for a college without knowing which one you would be going to, what it would be providing, or whether you could request a refund or switch colleges? With more economic incentive, services would need to improve. For example, if there were a free market for roads, people would pay for and use the roads that were most safe.”

Open Data and Clinical Trials


Editorial by Jeffrey M. Drazen, M.D., at NEJM.org: “In the fall of 2013, the Institute of Medicine (IOM) convened a committee, on which I serve, to examine the sharing of data in the setting of clinical trials. The committee is charged with reviewing current practices on data sharing in the context of randomized, controlled trials and with making recommendations for future data-sharing standards. Over the past few months, the committee has prepared a draft report that reviews current practices on data sharing and lays out a number of potential data-sharing models. Full details regarding the committee’s charge and the interim report are available at www.iom.edu/activities/research/sharingclinicaltrialdata.aspx….
Open-data advocates argue that all the study data should be available to anyone at the time the first report is published or even earlier. Others argue that to maintain an incentive for researchers to pursue clinical investigations and to give those who gathered the data a chance to prepare and publish further reports, there should be a period of some specified length during which the data gatherers would have exclusive access to the information. Since these researchers could always agree to collaborate with others who were not involved in the study in order to use the data to help answer a scientific question, the period of exclusivity would really apply only to noncollaborative use of the data. That is, there would be a defined period during which the data would not be available to those who wanted to perform their own analyses and draw conclusions that could, for example, provide them with a scientific or commercial competitive advantage over the researchers who had originally gathered the data or allow them to derive conclusions that are potentially at odds with those drawn in the original publication.
As members of a community that either produces or uses data, what approach do you think serves our community best? There is no need to reply to the Journal, but please read the interim report and let the IOM know how you feel about this and the many other critical issues related to data sharing that are reviewed in the document. The IOM is collecting comments until March 24, 2014, at www8.nationalacademies.org/cp/projectview.aspx?key=49578.”

Google Hangouts vs Twitter Q&As: how the US and Europe are hacking traditional diplomacy


Wired (UK): “We’re not yet sure if diplomacy is going digital or just the conversations we’re having,” Moira Whelan, Deputy Assistant Secretary for Digital Strategy, US Department of State, admitted on stage at TedxStockholm. “Sometimes you just have to dive in, and we’re going to, but we’re not really sure where we’re going.”
The US has been at the forefront of digital diplomacy for many years now. President Obama was the first leader to sign up to Twitter, and has amassed the greatest number of followers among his peers at nearly 41 million. The account is, however, mainly run by his staff. It’s understandable, but demonstrates that there still remains a diplomatic disconnect in a country Whelan says knows it’s “ready, leading the conversation and on cutting edge”.
In Europe, on the other hand, Swedish Minister for Foreign Affairs Carl Bildt carries out regular Q&As on the social network and is regarded as one of the most conversational leaders on Twitter and the best connected, according to the annual survey Twiplomacy. Our own William Hague is chasing Bildt with close to 200,000 followers, and is the world’s second most connected Foreign Minister, while David Cameron is active on a daily basis with more than 570,000 followers. London was in fact the first place to host a “Diplohack”, an event where ambassadors are brought together with developers and others to hack traditional diplomacy, and Whelan travelled to Sweden to take part in the third European event, the Stockholm Initiative for Digital Diplomacy, held 16-17 January in conjunction with TedxStockholm.
Nevertheless, Whelan, who has worked for the state for a decade, says the US is in the game and ready to try new things. Case in point being its digital diplomacy reaction to the crisis in Syria last year.
“In August 2013 we witnessed tragic events in Syria, and obviously the President of the United States and his security team jumped into action,” said Whelan. “We needed to bear witness and… very clearly saw the need for one thing — a Google+ Hangout.” With her tongue-in-cheek comment, Whelan was pointing out social media’s incredibly relevant role in communicating to the public what’s going on when crises hit, and in answering concerns and questions through it.
“We saw speeches and very disturbing images coming at us,” continued Whelan. “We heard leaders making impassioned speeches, and we ourselves had conversations about what we were seeing and how we needed to engage and inform; to give people the chance to engage and ask questions of us.
“We thought, clearly let’s have a Google+ Hangout. Three people joined us and Secretary John Kerry — Nicholas Kristof of the New York Times; Lara Setrakian, executive editor of Syria Deeply; and Andrew Beiter, a teacher affiliated with the Holocaust Memorial Museum who specialises in how we talk about these topics with our children.”
In the run-up to the Hangout, news of the event trickled out and soon Google was calling, asking if it could advertise the session at the bottom of other Hangouts, then on YouTube ads. “Suddenly 15,000 people were watching the Secretary live — that’s by far the largest number we’d seen. We felt we’d tapped into something; we knew we’d hit success at what was a challenging time. We were engaging the public and could join with them to communicate a set of questions. People want to ask questions and get very direct answers, and we know it’s a success. We’ve talked to Google about how we can replicate that. We want to transform what we’re doing to make that the norm.”
Secretary of State John Kerry is, Whelan told Wired.co.uk later, “game for anything” when it comes to social media — and having the department leader enthused at the prospect of taking digital diplomacy forward is obviously key to its success.
“He wanted us to get on Instagram and the unselfie meme during the Philippines crisis was his idea — an assistant had seen it and he held a paper in front of him with the URL to donate funds to Typhoon Haiyan victims,” Whelan told Wired.co.uk at the Stockholm diplohack.  “President Obama came in with a mandate that social media would be present and pronounced in all our departments.”
“[As] government changes and is more influenced away from old paper models and newspapers, suspenders and bow ties, and more into young innovators wanting to come in and change things,” Whelan continued, “I think it will change the way we work and help us get smarter.”

Selected Readings on Big Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of big data was originally published in 2014.

Big Data refers to the wide-scale collection, aggregation, storage, analysis and use of data. Government is increasingly in control of a massive amount of raw data that, when analyzed and put to use, can lead to new insights on everything from public opinion to environmental concerns. The burgeoning literature on Big Data argues that it generates value by: creating transparency; enabling experimentation to discover needs, expose variability, and improve performance; segmenting populations to customize actions; replacing/supporting human decision making with automated algorithms; and innovating new business models, products and services. The insights drawn from data analysis can also be visualized in a manner that passes along relevant information, even to those without the tech savvy to understand the data on its own terms (see The GovLab Selected Readings on Data Visualization).

Selected Reading List (in alphabetical order)

Annotated Selected Reading List (in alphabetical order)

Australian Government Information Management Office. The Australian Public Service Big Data Strategy: Improved Understanding through Enhanced Data-analytics Capability Strategy Report. August 2013. http://bit.ly/17hs2xY.

  • This Big Data Strategy, produced for Australian Government senior executives with responsibility for delivering services and developing policy, is aimed at ingraining in government officials the idea that the key to increasing the value of big data held by government is the effective use of analytics. Essentially, “the value of big data lies in [our] ability to extract insights and make better decisions.”
  • This positions big data as a national asset that can be used to “streamline service delivery, create opportunities for innovation, identify new service and policy approaches as well as supporting the effective delivery of existing programs across a broad range of government operations.”

Bollier, David. The Promise and Peril of Big Data. The Aspen Institute, Communications and Society Program, 2010. http://bit.ly/1a3hBIA.

  • This report captures insights from the 2009 Roundtable exploring uses of Big Data within a number of important consumer behavior and policy implication contexts.
  • The report concludes that, “Big Data presents many exciting opportunities to improve modern society. There are incalculable opportunities to make scientific research more productive, and to accelerate discovery and innovation. People can use new tools to help improve their health and well-being, and medical care can be made more efficient and effective. Government, too, has a great stake in using large databases to improve the delivery of government services and to monitor for threats to national security.”
  • However, “Big Data also presents many formidable challenges to government and citizens precisely because data technologies are becoming so pervasive, intrusive and difficult to understand. How shall society protect itself against those who would misuse or abuse large databases? What new regulatory systems, private-law innovations or social practices will be capable of controlling anti-social behaviors–and how should we even define what is socially and legally acceptable when the practices enabled by Big Data are so novel and often arcane?”

Boyd, Danah and Kate Crawford. “Six Provocations for Big Data.” A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society. September 2011. http://bit.ly/1jJstmz.

  • In this paper, Boyd and Crawford raise challenges to unchecked assumptions and biases regarding big data. The paper makes a number of assertions about the “computational culture” of big data and pushes back against those who consider big data to be a panacea.
  • The authors’ provocations for big data are:
    • Automating Research Changes the Definition of Knowledge
    • Claims to Objectivity and Accuracy are Misleading
    • Big Data is not always Better Data
    • Not all Data is Equivalent
    • Just Because it is accessible doesn’t make it ethical
    • Limited Access to Big Data creates New Digital Divide

The Economist Intelligence Unit. Big Data and the Democratisation of Decisions. October 2012. http://bit.ly/17MpH8L.

  • This report from the Economist Intelligence Unit focuses on the positive impact of big data adoption in the private sector, but its insights can also be applied to the use of big data in governance.
  • The report argues that innovation can be spurred by democratizing access to data, allowing a diversity of stakeholders to “tap data, draw lessons and make business decisions,” which in turn helps companies and institutions respond to new trends and intelligence at varying levels of decision-making power.

Manyika, James, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers. Big Data: The Next Frontier for Innovation, Competition, and Productivity.  McKinsey & Company. May 2011. http://bit.ly/18Q5CSl.

  • This report argues that big data “will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus,” and that “leaders in every sector will have to grapple with the implications of big data.”
  • The report offers five broad ways in which using big data can create value:
    • First, big data can unlock significant value by making information transparent and usable at much higher frequency.
    • Second, as organizations create and store more transactional data in digital form, they can collect more accurate and detailed performance information on everything from product inventories to sick days, and therefore expose variability and boost performance.
    • Third, big data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services.
    • Fourth, sophisticated analytics can substantially improve decision-making.
    • Finally, big data can be used to improve the development of the next generation of products and services.

The Partnership for Public Service and the IBM Center for The Business of Government. “From Data to Decisions II: Building an Analytics Culture.” October 17, 2012. https://bit.ly/2EbBTMg.

  • This report discusses strategies for better leveraging data analysis to aid decision-making. The authors argue that, “Organizations that are successful at launching or expanding analytics programs…systematically examine their processes and activities to ensure that everything they do clearly connects to what they set out to achieve, and they use that examination to pinpoint weaknesses or areas for improvement.”
  • While the report features many strategies for government decision-makers, the central recommendation is that, “leaders incorporate analytics as a way of doing business, making data-driven decisions transparent and a fundamental approach to day-to-day management. When an analytics culture is built openly, and the lessons are applied routinely and shared widely, an agency can embed valuable management practices in its DNA, to the mutual benefit of the agency and the public it serves.”

TechAmerica Foundation’s Federal Big Data Commission. “Demystifying Big Data: A Practical Guide to Transforming the Business of Government.” 2013. http://bit.ly/1aalUrs.

  • This report presents key big data imperatives that government agencies must address, along with the challenges and opportunities posed by the growing volume of data and the value big data can provide. The discussion touches on the value of big data to business and organizational missions, presents case-study examples of big data applications, and covers technical underpinnings and public policy applications.
  • The authors argue that new digital information, “effectively captured, managed and analyzed, has the power to change every industry including cyber security, healthcare, transportation, education, and the sciences.” To ensure that this opportunity is realized, the report proposes a detailed big data strategy framework with the following steps: define, assess, plan, execute and review.

World Economic Forum. “Big Data, Big Impact: New Possibilities for International Development.” 2012. http://bit.ly/17hrTKW.

  • This report examines the potential for channeling the “flood of data created every day by the interactions of billions of people using computers, GPS devices, cell phones, and medical devices” into “actionable information that can be used to identify needs, provide services, and predict and prevent crises for the benefit of low-income populations.”
  • The report argues that, “To realise the mutual benefits of creating an environment for sharing mobile-generated data, all ecosystem actors must commit to active and open participation. Governments can take the lead in setting policy and legal frameworks that protect individuals and require contractors to make their data public. Development organisations can continue supporting governments and demonstrating both the public good and the business value that data philanthropy can deliver. And the private sector can move faster to create mechanisms for sharing data that can benefit the public.”

Predictive Modeling With Big Data: Is Bigger Really Better?


New paper by Enric Junqué de Fortuny, David Martens, and Foster Provost in Big Data: “With the increasingly widespread collection and processing of “big data,” there is natural interest in using these data assets to improve decision making. One of the best understood ways to use data to improve decision making is via predictive analytics. An important, open question is: to what extent do larger data actually lead to better predictive models? In this article we empirically demonstrate that when predictive models are built from sparse, fine-grained data—such as data on low-level human behavior—we continue to see marginal increases in predictive performance even to very large scale. The empirical results are based on data drawn from nine different predictive modeling applications, from book reviews to banking transactions. This study provides a clear illustration that larger data indeed can be more valuable assets for predictive analytics. This implies that institutions with larger data assets—plus the skill to take advantage of them—potentially can obtain substantial competitive advantage over institutions without such access or skill. Moreover, the results suggest that it is worthwhile for companies with access to such fine-grained data, in the context of a key predictive task, to gather both more data instances and more possible data features. As an additional contribution, we introduce an implementation of the multivariate Bernoulli Naïve Bayes algorithm that can scale to massive, sparse data.”
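The scalable multivariate Bernoulli Naïve Bayes the authors mention can be sketched in a few lines. The following minimal pure-Python illustration (not the paper’s implementation; the function names and toy data are invented for this example) shows the core idea behind scaling to sparse data: store each instance as only the set of its “on” feature indices, precompute a per-class “all features off” baseline score, and adjust only for active features, so prediction cost depends on how many features are on rather than on the total feature count.

```python
import math
from collections import defaultdict

def train(instances, labels, n_features, alpha=1.0):
    """instances: list of sets of 'on' feature indices; labels: class ids."""
    class_counts = defaultdict(int)
    on_counts = defaultdict(lambda: defaultdict(int))
    for feats, y in zip(instances, labels):
        class_counts[y] += 1
        for f in feats:
            on_counts[y][f] += 1
    model = {}
    total = len(labels)
    for y, n in class_counts.items():
        log_prior = math.log(n / total)
        # Laplace-smoothed Bernoulli parameter P(feature on | class)
        p = [(on_counts[y][f] + alpha) / (n + 2 * alpha)
             for f in range(n_features)]
        # Precompute the "all features off" baseline and per-feature
        # adjustments, so scoring later touches only the on-features.
        baseline = sum(math.log(1.0 - pf) for pf in p)
        delta = [math.log(pf) - math.log(1.0 - pf) for pf in p]
        model[y] = (log_prior, baseline, delta)
    return model

def predict(model, feats):
    """Score each class sparsely and return the argmax."""
    best, best_score = None, -math.inf
    for y, (log_prior, baseline, delta) in model.items():
        score = log_prior + baseline + sum(delta[f] for f in feats)
        if score > best_score:
            best, best_score = y, score
    return best

# Toy behavioral data: which of 4 fine-grained actions each user took.
X = [{0, 2}, {0, 1}, {3}, {2, 3}]
y = [1, 1, 0, 0]
m = train(X, y, n_features=4)
print(predict(m, {0, 2}))  # a user behaving like the first training user -> 1
```

In real settings the feature space has millions of mostly-zero columns (e.g. one column per book or merchant), which is exactly where the sparse scoring trick pays off; only `train` touches the full feature list, and even that can be made sparse with the same baseline-plus-delta decomposition.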

Big Data and the Future of Privacy


John Podesta at the White House blog: “Last Friday, the President spoke to the American people, and the international community, about how to keep us safe from terrorism in a changing world while upholding America’s commitment to liberty and privacy that our values and Constitution require. Our national security challenges are real, but that is surely not the only space where changes in technology are altering the landscape and challenging conceptions of privacy.
That’s why in his speech, the President asked me to lead a comprehensive review of the way that “big data” will affect the way we live and work; the relationship between government and citizens; and how public and private sectors can spur innovation and maximize the opportunities and free flow of this information while minimizing the risks to privacy. I will be joined in this effort by Secretary of Commerce Penny Pritzker, Secretary of Energy Ernie Moniz, the President’s Science Advisor John Holdren, the President’s Economic Advisor Gene Sperling and other senior government officials.
I would like to explain a little bit more about the review, its scope, and what you can expect over the next 90 days.
We are undergoing a revolution in the way that information about our purchases, our conversations, our social networks, our movements, and even our physical identities are collected, stored, analyzed and used. The immense volume, diversity and potential value of data will have profound implications for privacy, the economy, and public policy. The working group will consider all those issues, and specifically how the present and future state of these technologies might motivate changes in our policies across a range of sectors.
When we complete our work, we expect to deliver to the President a report that anticipates future technological trends and frames the key questions that the collection, availability, and use of “big data” raise – both for our government, and the nation as a whole. It will help identify technological changes to watch, whether those technological changes are addressed by the U.S.’s current policy framework and highlight where further government action, funding, research and consideration may be required.
This is going to be a collaborative effort. The President’s Council of Advisors on Science and Technology (PCAST) will conduct a study to explore in-depth the technological dimensions of the intersection of big data and privacy, which will feed into this broader effort. Our working group will consult with industry, civil liberties groups, technologists, privacy experts, international partners, and other national and local government officials on the significance of and future for these technologies. Finally, we will be working with a number of think tanks, academic institutions, and other organizations around the country as they convene stakeholders to discuss these very issues and questions. Likewise, many abroad are analyzing and responding to the challenge and seizing the opportunity of big data. These discussions will help to inform our study.
While we don’t expect to answer all these questions, or produce a comprehensive new policy in 90 days, we expect this work to serve as the foundation for a robust and forward-looking plan of action. Check back on this blog for updates on how you can get involved in the debate and for status updates on our progress.”

How Government Can Make Open Data Work


Joel Gurin in Information Week: “At the GovLab at New York University, where I am senior adviser, we’re taking a different approach than McKinsey’s to understand the evolving value of government open data: We’re studying open data companies from the ground up. I’m now leading the GovLab’s Open Data 500 project, funded by the John S. and James L. Knight Foundation, to identify and examine 500 American companies that use government open data as a key business resource.
Our preliminary results show that government open data is fueling companies both large and small, across the country, and in many sectors of the economy, including health, finance, education, energy, and more. But it’s not always easy to use this resource. Companies that use government open data tell us it is often incomplete, inaccurate, or trapped in hard-to-use systems and formats.
It will take a thorough and extended effort to make government data truly useful. Based on what we are hearing and the research I did for my book, here are some of the most important steps the federal government can take, starting now, to make it easier for companies to add economic value to the government’s data.
1. Improve data quality
The Open Data Policy not only directs federal agencies to release more open data; it also requires them to release information about data quality. Agencies will have to begin improving the quality of their data simply to avoid public embarrassment. We can hope and expect that they will do some data cleanup themselves, demand better data from the businesses they regulate, or use creative solutions like turning to crowdsourcing for help, as USAID did to improve geospatial data on its grantees.

2. Keep improving open data resources
The government has steadily made Data.gov, the central repository of federal open data, more accessible and useful, including a significant relaunch last week. To the agency’s credit, the GSA, which administers Data.gov, plans to keep working to make this key website still better. As part of implementing the Open Data Policy, the administration has also set up Project Open Data on GitHub, the world’s largest community for open-source software. These resources will be helpful for anyone working with open data either inside or outside of government. They need to be maintained and continually improved.
3. Pass DATA
The Digital Accountability and Transparency Act would bring transparency to federal government spending at an unprecedented level of detail. The Act has strong bipartisan support. It passed the House with only one dissenting vote and was unanimously approved by a Senate committee, but still needs full Senate approval and the President’s signature to become law. DATA is also supported by technology companies who see it as a source of new open data they can use in their businesses. Congress should move forward and pass DATA as the logical next step in the work that the Obama administration’s Open Data Policy has begun.
4. Reform the Freedom of Information Act
Since it was passed in 1966, the federal Freedom of Information Act has gone through two major revisions, both of which strengthened citizens’ ability to access many kinds of government data. It’s time for another step forward. Current legislative proposals would establish a centralized web portal for all federal FOIA requests, strengthen the FOIA ombudsman’s office, and require agencies to post more high-interest information online before they receive formal requests for it. These changes could make more information from FOIA requests available as open data.
5. Engage stakeholders in a genuine way
Up to now, the government’s release of open data has largely been a one-way affair: Agencies publish datasets that they hope will be useful without consulting the organizations and companies that want to use it. Other countries, including the UK, France, and Mexico, are building in feedback loops from data users to government data providers, and the US should, too. The Open Data Policy calls for agencies to establish points of contact for public feedback. At the GovLab, we hope that the Open Data 500 will help move that process forward. Our research will provide a basis for new, productive dialogue between government agencies and the businesses that rely on them.
6. Keep using federal challenges to encourage innovation
The federal Challenge.gov website applies the best principles of crowdsourcing and collective intelligence. Agencies should use this approach extensively, and should pose challenges using the government’s open data resources to solve business, social, or scientific problems. Other approaches to citizen engagement, including federally sponsored hackathons and the White House Champions of Change program, can play a similar role.
Through the Open Data Policy and other initiatives, the Obama administration has set the right goals. Now it’s time to implement and move toward what US CTO Todd Park calls “data liberation.” Thousands of companies, organizations, and individuals will benefit.”