Safety Datapalooza Shows Power of Data.gov Communities


Lisa Nelson at DigitalGov: “The White House Office of Public Engagement held the first Safety Datapalooza illustrating the power of Data.gov communities. Federal Chief Technology Officer Todd Park and Deputy Secretary of Transportation John Porcari hosted the event, which touted the data available on Safety.Data.gov and the community of innovators using it to make effective tools for consumers.
The event showcased many of the tools that have been produced as a result of opening this safety data, including:

  • PulsePoint, from the San Ramon Fire Protection District, a lifesaving mobile app that allows CPR-trained volunteers to be notified if someone nearby is in need of emergency assistance;
  • Commute and crime maps, from Trulia, which allow home buyers to choose their new residence based on two important everyday factors; and
  • Hurricane App, from the American Red Cross, to monitor storm conditions, prepare your family and home, find help, and let others know you’re safe even if the power is out.

Safety data is far from alone in generating innovative ideas and gathering a community of developers and entrepreneurs. Data.gov currently has 16 topically diverse communities spanning land and sea — the Cities and Oceans communities being two such examples. Data.gov’s communities are a virtual meeting spot for interested parties across government, academia and industry to come together and put the data to use. Data.gov enables a whole set of tools to make these communities come to life: apps, blogs, challenges, forums, ranking, rating and wikis.
For a summary of the Safety Datapalooza, visit Transportation’s “Fast Lane” blog.”

The Failure and the Promise of Public Participation


Dr. Mark Funkhouser in Governing: “In a recent study entitled Making Public Participation Legal, Matt Leighninger cites a Knight Foundation report that found that attending a public meeting was more likely to reduce a person’s sense of efficacy and attachment to the community than to increase it. That sad fact is no surprise to the government officials who have to run — and endure — public meetings.
Every public official who has served for any length of time has horror stories about these forums. The usual suspects show up — the self-appointed activists (who sometimes seem to be just a little nuts) and the lobbyists. Regular folks have made the calculation that only in extreme circumstances, when they are really scared or angry, is attending a public hearing worth their time. And who can blame them when it seems clear that the game is rigged, the decisions already have been made, and they’ll probably have to sit through hours of blather before they get their three minutes at the microphone?
So much transparency and yet so little trust. Despite the fact that governments are pumping out more and more information to citizens, trust in government has edged lower and lower, pushed in part no doubt by the lingering economic hardships and government cutbacks resulting from the recession. Most public officials I talk to now take it as an article of faith that the public generally disrespects them and the governments they work for.
Clearly the relationship between citizens and their governments needs to be reframed. Fortunately, over the last couple of decades lots of techniques have been developed by advocates of deliberative democracy and citizen participation that provide both more meaningful engagement and better community outcomes. There are decision-making forums, “visioning” forums and facilitated group meetings, most of which feature some combination of large-group, small-group and online interactions.
But here’s the rub: Our legal framework doesn’t support these new methods of public participation. This fact is made clear in Making Public Participation Legal, which was compiled by a working group that included people from the National Civic League, the American Bar Association, the International City/County Management Association and a number of leading practitioners of public participation.
The requirements for public meetings in local governments are generally built into state statutes such as sunshine or open-meetings laws or other laws governing administrative procedures. These laws may require public hearings in certain circumstances and mandate that advance notice, along with an agenda, be posted for any meeting of an “official body” — from the state legislature to a subcommittee of the city council or an advisory board of some kind. And a “meeting” is one in which a quorum attends. So if three of a city council’s nine members sit on the finance committee and two of the committee members happen to show up at a public meeting, they may risk having violated the open-meetings law…”

Why the Nate Silvers of the World Don’t Know Everything


Felix Salmon in Wired: “This shift in US intelligence mirrors a definite pattern of the past 30 years, one that we can see across fields and institutions. It’s the rise of the quants—that is, the ascent to power of people whose native tongue is numbers and algorithms and systems rather than personal relationships or human intuition. Michael Lewis’ Moneyball vividly recounts how the quants took over baseball, as statistical analysis trumped traditional scouting and propelled the underfunded Oakland A’s to a division-winning 2002 season. More recently we’ve seen the rise of the quants in politics. Commentators who “trusted their gut” about Mitt Romney’s chances had their gut kicked by Nate Silver, the stats whiz who called the election days beforehand as a lock for Obama, down to the very last electoral vote in the very last state.
The reason the quants win is that they’re almost always right—at least at first. They find numerical patterns or invent ingenious algorithms that increase profits or solve problems in ways that no amount of subjective experience can match. But what happens after the quants win is not always the data-driven paradise that they and their boosters expected. The more a field is run by a system, the more that system creates incentives for everyone (employees, customers, competitors) to change their behavior in perverse ways—providing more of whatever the system is designed to measure and produce, whether that actually creates any value or not. It’s a problem that can’t be solved until the quants learn a little bit from the old-fashioned ways of thinking they’ve displaced.
No matter the discipline or industry, the rise of the quants tends to happen in four stages. Stage one is what you might call pre-disruption, and it’s generally most visible in hindsight. Think about quaint dating agencies in the days before the arrival of Match.com and all the other algorithm-powered online replacements. Or think about retail in the era before floor-space management analytics helped quantify exactly which goods ought to go where. For a live example, consider Hollywood, which, for all the money it spends on market research, is still run by a small group of lavishly compensated studio executives, all of whom are well aware that the first rule of Hollywood, as memorably summed up by screenwriter William Goldman, is “Nobody knows anything.” On its face, Hollywood is ripe for quantification—there’s a huge amount of data to be mined, considering that every movie and TV show can be classified along hundreds of different axes, from stars to genre to running time, and they can all be correlated to box office receipts and other measures of profitability.
Next comes stage two, disruption. In most industries, the rise of the quants is a recent phenomenon, but in the world of finance it began back in the 1980s. The unmistakable sign of this change was hard to miss: the point at which you started getting targeted and personalized offers for credit cards and other financial services based not on the relationship you had with your local bank manager but on what the bank’s algorithms deduced about your finances and creditworthiness. Pretty soon, when you went into a branch to inquire about a loan, all they could do was punch numbers into a computer and then give you the computer’s answer.
For a present-day example of disruption, think about politics. In the 2012 election, Obama’s old-fashioned campaign operatives didn’t disappear. But they gave money and freedom to a core group of technologists in Chicago—including Harper Reed, former CTO of the Chicago-based online retailer Threadless—and allowed them to make huge decisions about fund-raising and voter targeting. Whereas earlier campaigns had tried to target segments of the population defined by geography or demographic profile, Obama’s team made the campaign granular right down to the individual level. So if a mom in Cedar Rapids was on the fence about who to vote for, or whether to vote at all, then instead of buying yet another TV ad, the Obama campaign would message one of her Facebook friends and try the much more effective personal approach…
After disruption, though, there comes at least some version of stage three: overshoot. The most common problem is that all these new systems—metrics, algorithms, automated decision-making processes—result in humans gaming the system in rational but often unpredictable ways. Sociologist Donald T. Campbell noted this dynamic back in the ’70s, when he articulated what’s come to be known as Campbell’s law: “The more any quantitative social indicator is used for social decision-making,” he wrote, “the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”…
Policing is a good example, as explained by Harvard sociologist Peter Moskos in his book Cop in the Hood: My Year Policing Baltimore’s Eastern District. Most cops have a pretty good idea of what they should be doing, if their goal is public safety: reducing crime, locking up kingpins, confiscating drugs. It involves foot patrols, deep investigations, and building good relations with the community. But under statistically driven regimes, individual officers have almost no incentive to actually do that stuff. Instead, they’re all too often judged on results—specifically, arrests. (Not even convictions, just arrests: If a suspect throws away his drugs while fleeing police, the police will chase and arrest him just to get the arrest, even when they know there’s no chance of a conviction.)…
It’s increasingly clear that for smart organizations, living by numbers alone simply won’t work. That’s why they arrive at stage four: synthesis—the practice of marrying quantitative insights with old-fashioned subjective experience. Nate Silver himself has written thoughtfully about examples of this in his book, The Signal and the Noise. He cites baseball, which in the post-Moneyball era adopted a “fusion approach” that leans on both statistics and scouting. Silver credits it with delivering the Boston Red Sox’s first World Series title in 86 years. Or consider weather forecasting: The National Weather Service employs meteorologists who, understanding the dynamics of weather systems, can improve forecasts by as much as 25 percent compared with computers alone. A similar synthesis holds in economic forecasting: Adding human judgment to statistical methods makes results roughly 15 percent more accurate. And it’s even true in chess: While the best computers can now easily beat the best humans, they can in turn be beaten by humans aided by computers….
That’s what a good synthesis of big data and human intuition tends to look like. As long as the humans are in control, and understand what it is they’re controlling, we’re fine. It’s when they become slaves to the numbers that trouble breaks out. So let’s celebrate the value of disruption by data—but let’s not forget that data isn’t everything.

From Faith-Based to Evidence-Based: The Open Data 500 and Understanding How Open Data Helps the American Economy


Beth Noveck in Forbes: “Public funds have, after all, paid for their collection, and the law says that federal government data are not protected by copyright. By the end of 2009, the US and the UK had the only two open data one-stop websites where agencies could post and citizens could find open data. Now there are over 300 such portals for government data around the world with over 1 million available datasets. This kind of Open Data — including weather, safety and public health information as well as information about government spending — can serve the country by increasing government efficiency, shedding light on regulated industries, and driving innovation and job creation.

It’s becoming clear that open data has the potential to improve people’s lives. With huge advances in data science, we can take this data and turn it into tools that help people choose a safer hospital, pick a better place to live, improve the performance of their farm or business by having better climate models, and know more about the companies with whom they are doing business. Done right, people can even contribute data back, giving everyone a better understanding, for example, of nuclear contamination in post-Fukushima Japan or incidences of price gouging in America’s inner cities.

The promise of open data is limitless (see the GovLab Index for stats on open data). But it’s important to back up our faith with real evidence of what works. Last September the GovLab began the Open Data 500 project, funded by the John S. and James L. Knight Foundation, to study the economic value of government Open Data extensively and rigorously. A recent McKinsey study pegged the global value of Open Data (including free data from sources other than government) at $3 trillion a year or more. We’re digging in and talking to those companies that use Open Data as a key part of their business model. We want to understand whether and how open data is contributing to the creation of new jobs, the development of scientific and other innovations, and growth in the economy. We also want to know what government can do better to help industries that want high-quality, reliable, up-to-date information that government can supply. Of those 1 million datasets, for example, 96% are not updated on a regular basis.

The GovLab just published an initial working list of 500 American companies that we believe to be using open government data extensively.  We’ve also posted in-depth profiles of 50 of them — a sample of the kind of information that will be available when the first annual Open Data 500 study is published in early 2014. We are also starting a similar study for the UK and Europe.

Even at this early stage, we are learning that Open Data is a valuable resource. As my colleague Joel Gurin, author of Open Data Now: the Secret to Hot Start-Ups, Smart Investing, Savvy Marketing and Fast Innovation, who directs the project, put it, “Open Data is a versatile and powerful economic driver in the U.S. for new and existing businesses around the country, in a variety of ways, and across many sectors. The diversity of these companies in the kinds of data they use, the way they use it, their locations, and their business models is one of the most striking things about our findings so far.” Companies are paradoxically building value-added businesses on top of public data that anyone can access for free….”

The full article can be found here.

The future of law and legislation?


prior probability: “Mike Gatto, a legislator in California, recently set up the world’s first Wiki-bill in order to enable private citizens to act as cyber-legislators and help draft an actual law. According to Assemblyman Gatto:

Government has a responsibility to listen to the people and to enable everyone to be an active part of the legislative process. That’s why I’ve created this space for you to draft real legislation. Just like a Wikipedia entry, you can see what the current draft is, and propose minor or major edits. The marketplace of ideas will decide the final draft. We’re starting with a limited topic: probate. Almost everyone will face the prospect of working through the details of a deceased loved one’s finances and estate at some point during their life. I want to hear your ideas for how to make this process less burdensome.”

Lessons in the crowdsourced verification of news from Storyful and Reddit’s Syria forum


At GigaOm: “One of the most powerful trends in media over the past year is the crowdsourced verification of news, whether it’s the work of a blogger like Brown Moses or former NPR journalist Andy Carvin. Two other interesting efforts in this area are the “open newsroom” approach taken by Storyful — which specializes in verifying social-media reports for mainstream news entities — and a Reddit forum devoted to crowdsourcing news coverage of the civil war in Syria.
Storyful journalist Joe Galvin recently looked at some of the incidents that the company has helped either debunk or verify over the past year — including a fake tweet from the official account of the Associated Press about explosions at the White House (which sent the Dow Jones index plummeting before it was corrected), a claim from Russian authorities that a chemical attack in Syria had been premeditated, and a report from investigative journalist Seymour Hersh about the same attack that questioned whether the government had been involved….
Reddit, meanwhile, has been conducting some “open newsroom”-style experiments of its own around a number of news events, including the Syrian civil war. The site has come under fire in the past for some of those efforts — including the attempt to identify the bombers in the Boston bombings case, which went badly awry — but the Syrian thread in particular is a good example of how a smart aggregator can make sense of an ongoing news event. In a recent post at a site called Dissected News, one of the moderators behind the /r/SyrianCivilWar sub-Reddit — a 22-year-old law student named Christopher Kingdon (or “uptodatepronto” as he is known on the site) — wrote about his experiences with the forum, which is trying to be a broadly objective source for breaking news and information about the conflict….
Some of what the moderators do in the forum is similar to the kind of verification that Storyful or the BBC’s “user-generated content desk” do — checking photos and video for obvious signs of fakery and hoaxes. But Kingdon also describes how much effort his team of volunteers puts into ensuring that the sub-Reddit doesn’t degenerate into trolling or flame-wars. Strict rules are enforced “to prevent personal attacks, offensive and violent language and racism” and the moderators favor posts that “utilize sources, background information and a dash of common sense.”

Ten thoughts for the future


The Economist: “CASSANDRA has decided to revisit her fellow forecasters Thomas Malnight and Tracey Keys to find out what their predictions are for 2014. Once again they have produced a collection of trends for the year ahead, in their “Global Trends Report”.
The possibilities of mind control seem alarming (point 6), as do the implications of growing income inequality (point 10). Cassandra also hopes that “unemployability” and “unemployerability”, as discussed in point 9, are contested next year (on both linguistic and social fronts).
Nevertheless, the forecasts make for intriguing reading and highlights appear below.
1. From social everything to being smart socially
Social technologies are everywhere, but these vast repositories of digital “stuff” bury the exceptional among the unimportant. It’s time to get socially smart. Users are moving to niche networks to bring back the community feel and intelligence to social interactions. Businesses need to get smarter about extracting and delivering value from big data including challenging business models. For social networks, mobile is the great leveller. Competition for attention with other apps will intensify the battle to own key assets from identity to news sharing, demanding radical reinvention.
2. Information security: The genie is out of the bottle
Thought your information was safe? Think again. The information security genie is out of the bottle as cyber-surveillance and data mining by public and private organizations increases – and don’t forget criminal networks and whistleblowers. It will be increasingly hard to tell friend from foe in cyberspace as networks build artificial intelligence to decipher your emotions and smart cities track your every move. Big Brother is here: Protecting identity, information and societies will be a priority for all.
3. Who needs shops anyway?
Retailers are facing a digitally driven perfect storm. Connectivity, rising consumer influence, time scarcity, mobile payments, and the internet of things are changing where, when and how we shop – if smart machines have not already done the job. Add the sharing economy, driven by younger generations where experience and sustainable consumption are more important than ownership, and traditional retail models break down. The future of shops will be increasingly defined by experiential spaces offering personalized service, integrated online and offline value propositions, and pop-up stores to satisfy demands for immediacy and surprise.
4. Redistributing the industrial revolution
Complex, global value chains are being redistributed by new technologies, labour market shifts and connectivity. Small-scale manufacturing, including 3D and soon 4D printing, and shifting production economics are moving production closer to markets and enabling mass customization – not just by companies but by the tech-enabled maker movement which is going mainstream. Rising labour costs in developing markets, high unemployment in developed markets, global access to online talent and knowledge, plus advances in robotics mean reshoring of production to developed markets will increase. Mobility, flexibility and networks will define the future industrial landscape.
5. Hubonomics: The new face of globalization
As production and consumption become more distributed, hubs will characterize the next wave of “globalization.” They will specialize to support the needs of growing regional trade, emerging city states, on-line communities of choice, and the next generation of flexible workers and entrepreneurs. Underpinning these hubs will be global knowledge networks and new business and governance models based on hubonomics™ that leverage global assets and hub strengths to deliver local value.
6. Sci-Fi is here: Making the impossible possible
Cross-disciplinary approaches and visionary entrepreneurs are driving scientific breakthroughs that could change not just our lives and work but our bodies and intelligence. Labs worldwide are opening up the vast possibilities of mind control and artificial intelligence, shape-shifting materials and self-organizing nanobots, cyborgs and enhanced humans, space exploration, and high-speed, intelligent transportation. Expect great debate around the ethics, financing, and distribution of public and private benefits of these advances – and the challenge of translating breakthroughs into replicable benefits.
7. Growing pains: Transforming markets and generations
The BRICS are succumbing to Newton’s law of gravitation: Brazil’s lost it, India’s losing it, China’s paying the price for growth, Russia’s failing to make a superpower comeback, and South Africa’s economy is in disarray. In other developing markets currencies have tumbled, Arab Spring governments are still in turmoil and social unrest is increasing along with the number of failing states. But the BRICS & Beyond growth engine is far from dead. Rather it is experiencing growing pains which demand significant shifts in governance, financial systems, education and economic policies to catch up. The likely transformers will be younger generations who aspire to greater freedom and quality of life than their parents.
8. Panic versus denial: The resource gap grows, the global risks rise – but who is listening?
The complex nexus of food, water, energy and climate change presents huge global economic, environmental and societal challenges – heating up the battle to access new resources from the Arctic to fracking. Risks are growing, even as multilateral action stalls. It’s a crisis of morals, governance, and above all marketing and media, pitting crisis deniers against those who recognize the threats but are communicating panic versus reasoned solutions. Expect more debate and calls for responsible capitalism – those that are listening will be taking action at multiple levels in society and business.
9. Fighting unemployability and unemployerability
Companies are desperate for talented workers – yet unemployment rates remain high. Polarization towards higher and lower skill levels is squeezing mid-level jobs, even as employers complain that education systems are not preparing students for the jobs of the future. Fighting unemployability is driving new government-business partnerships worldwide, and will remain a critical issue given massive youth unemployment. Employers must also focus on organizational unemployerability – not being able to attract and retain desired talent – as new generations demand exciting and meaningful work where they can make an impact. If they can’t find it, they will quickly move on or swell the growing ranks of young entrepreneurs.
10. Surviving in a bipolar world: From expecting consistency to embracing ambiguity
Life is not fair, nor is it predictable.  Income inequality is growing. Intolerance and nationalism are rising but interdependence is the currency of a connected world. Pressure on leaders to deliver results today is intense but so too is the need for fundamental change to succeed in the long term. The contradictions of leadership and life are increasing faster than our ability to reconcile the often polarized perspectives and values each embodies. Increasingly, they are driving irrational acts of leadership (think the US debt ceiling), geopolitical, social and religious tensions, and individual acts of violence. Surviving in this world will demand stronger, responsible leadership comfortable with and capable of embracing ambiguity and uncertainty, as opposed to expecting consistency and predictability.”

Continued Progress: Engaging Citizen Solvers through Prizes


Blog post by Cristin Dorgelo: “Today OSTP released its second annual comprehensive report detailing the use of prizes and competitions by Federal agencies to spur innovation and solve Grand Challenges. Those efforts have expanded in the last two years under the America COMPETES Reauthorization Act of 2010, which granted all Federal agencies the authority to conduct prize competitions to spur innovation, solve tough problems, and advance their core missions.
This year’s report details the remarkable benefits the Federal Government reaped in Fiscal Year (FY) 2012 from more than 45 prize competitions across 10 agencies. To date, nearly 300 prize competitions have been implemented by 45 agencies through the website Challenge.gov.
Over the past four years, the Obama Administration has taken important steps to make prizes a standard tool in every agency’s toolbox. In his September 2009 Strategy for American Innovation, President Obama called on all Federal agencies to increase their use of prizes to address some of our Nation’s most pressing challenges. In March 2010, the Office of Management and Budget issued a policy framework to guide agencies in using prizes to mobilize American ingenuity and advance their respective core missions. Then, in September 2010, the Administration launched Challenge.gov, a one-stop shop where entrepreneurs and citizen solvers can find public-sector prize competitions.
The prize authority in COMPETES is a key piece of this effort. By giving agencies a clear legal path and expanded authority to deploy competitions and challenges, the legislation makes it dramatically easier for agencies to enlist this powerful approach to problem-solving and to pursue ambitious prizes with robust incentives…
To support these ongoing efforts, the General Services Administration continues to train agencies about resources and vendors available to help them administer prize competitions. In addition, NASA’s Center of Excellence for Collaborative Innovation (CoECI) provides other agencies with a full suite of services for incentive prize pilots – from prize design, through implementation, to post-prize evaluation.”

Buenos Aires, A Pocket of Civic Innovation in Argentina


Rebecca Chao in TechPresident: “…In only a few years, the government, civil society and media in Buenos Aires have actively embraced open data. The Buenos Aires city government has been publishing data under a creative commons license and encouraging civic innovation through hackathons. NGOs have launched a number of tech-driven tools and Argentina’s second largest newspaper, La Nación, has published several hard-hitting data journalism projects. The result is a fledgling but flourishing open data culture in Buenos Aires, in a country that has not yet adopted a freedom of information law.

A Wikipedia for Open Government Data

In late August of this year, the Buenos Aires government declared a creative commons license for all of its digital content, which allows it to be used for free, like Wikipedia content, with proper attribution. This applies to their new open data catalog, which allows users to visualize the data and examine apps that have been created using the data, and even includes a design lab for posting app ideas. Though the catalog launched only in March, the government has already published fairly substantial data sets, including the salaries of city officials. The website also embodies the principles of openness in its design; it is built with open-source software and its code is available for reuse via GitHub.
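As a rough sketch of how a catalog like this can be queried programmatically — assuming a CKAN-style API, which many government data portals expose (whether the Buenos Aires portal uses this exact interface is an assumption), and using a placeholder base URL — the following Python snippet lists the published datasets and fetches the metadata of one of them:

```python
# Hypothetical sketch: query a CKAN-style open data catalog.
# The base URL is a placeholder, not a confirmed Buenos Aires endpoint.
import requests

BASE = "https://data.example.gob.ar/api/3/action"  # placeholder portal URL

def list_datasets():
    """Return the catalog's dataset identifiers (CKAN 'package_list' action)."""
    resp = requests.get(f"{BASE}/package_list", timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]

def dataset_metadata(dataset_id):
    """Return title, license and resource links for one dataset ('package_show')."""
    resp = requests.get(f"{BASE}/package_show", params={"id": dataset_id}, timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    ids = list_datasets()
    print(f"{len(ids)} datasets published")
    meta = dataset_metadata(ids[0])
    print(meta["title"], "-", meta.get("license_title", "license not stated"))
```

Publishing the catalog under a creative commons license is what makes this kind of reuse straightforward: the same call that powers a civic app can also feed a newsroom analysis.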
“We were the first city in Argentina doing open government,” Rudi Borrmann tells techPresident over Skype. Borrmann is the Director of Buenos Aires’ Open Government Initiative. Previously, he was the social media editor at the city’s New Media Office but he also worked for many years in digital media…
While the civil society and media sectors have forged ahead in using open data, Borrmann tells techPresident that up in the ivory tower, openness to open data has been lagging. “Only technical schools are starting to create areas focused on working on open data,” he says.
In an interview with NYU’s GovLab, Borrmann explained the significance of academia in using and pushing for more open data. “They have the means, the resources, the methodology to analyze…because in government you don’t have that time to analyze,” he said.
Another issue with open data is getting other branches of the government to modernize. Borrmann says that a lot of the Open Government Initiative’s work is done behind the scenes. “In general, you have very poor IT infrastructure all over Latin America” that interferes with the gathering and publishing of data, he says. “So in some cases it’s not about publishing or not publishing,” but about “having robust infrastructure for the information.”
It seems that the behind-the-scenes work is bearing some fruit. Just last week, on Dec. 6, the team behind the Buenos Aires open data website launched an impressive, interactive timeline, based on a similar timelapse map developed by a 2013 Knight-Mozilla Fellow, Noah Veltman. Against faded black and white photos depicting the subway from different decades over the last century, colorful pops of the Subterráneo lines emerge alongside factoids that go all the way back to 1910.”

Data isn't a four-letter word


Speech by Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda: “I want to talk about data too: the opportunity as well as the threat.
Making data the engine of the European economy: safeguarding fundamental rights, capturing the data boost, and strengthening our defences.
Data is at a crossroads. We have opportunities: open data, big data, data mining, cloud computing. Tim Berners-Lee, creator of the world wide web, saw the massive potential of open data. As he put it, if you put that data online, it will be used by other people to do wonderful things, in ways that you could never imagine.
On the other hand, we have threats: to our privacy and our values, and to the openness that makes it possible to innovate, trade and exchange.
Get it right and we can safeguard a better economic future. Get it wrong, and we cut competitiveness without protecting privacy. So we remain dependent on the digital developments of others: and just as vulnerable to them.
How do we find that balance? Not with hysteria; nor by paralysis. Not by stopping the wonderful things, simply to prevent the not-so-wonderful. Not by seeing data as a dirty word.
We are seeing a whole economy develop around data and cloud computing. Businesses are using them, whole industries depend on them, and data volumes are increasing exponentially. Data is not just an economic sideshow; it is a whole new asset class, requiring new skills and creating new jobs.
And with a huge range of applications. From decoding human genes to predicting the traffic, and even the economy. Whatever you’re doing these days, chances are you’re using big data (like translation, search, apps, etc).
There is increasing recognition of the data boost on offer. For example, open data can make public administrations more transparent and stimulate a rich innovative market. That is what the G8 Leaders recognised in June, with their Open Data Charter. For scientists too, open data and open access offer new ways to research and progress.
That is a philosophy the Commission has shared for some time. And that is what our ‘Open Data’ package of December 2011 is all about. With new EU laws to open up public administrations, and a new EU Open Data Portal. And all EU-funded scientific publications available under open access.
Now not just the G8 and the Commission are seeing this data opportunity: but the European Council too. Last October, they recognised the potential of big data innovation, the need for a single market in cloud computing; and the urgency of Europe capitalising on both.
We will be acting on that. Next spring, I plan a strategic agenda for research on data. Working with private partners and national research funders to shape that agenda, and get the most bang for our research euro.
And, beyond research, there is much we can do to align our work and support secure big data. From training skilled workers, to modernising copyright for data and text mining, to different actors in the value chain working together: for example through a public-private partnership.
…Empowering people is not always easy in this complex online world. I want to see technical solutions emerge that can do that: give users control over their desired level of privacy and how their data will be used, and make it easier to verify that online rights are respected.
How can we do that? How can we ensure systems that are empowering, transparent, and secure? There are a number of subtleties in play. Here’s my take.
First, companies engaged in big data will need to start thinking about privacy protection at every stage: from system development to procedures and practices.
This is the principle of “privacy by design”, set out clearly in the proposed Data Protection Regulation. In other words, from now on new business ideas have two purposes: delivering a service and protecting privacy at the right level.
Second, also under the regulation, big data applications that might put fundamental rights at risk would require the company to carry out a “Privacy Impact Assessment”. This is another good way to combine innovation and privacy: ensuring you think about any risks from the start.
Third, sometimes, particularly for personal data, a company might realise they need user consent. Consent is a cornerstone of data protection rules, and should stay that way.
But we need to get smart, and apply common sense to consent. Users can’t be expected to know everything. Nor asked to consent to what they cannot realistically understand. Nor presented with false dilemmas, a black-and-white choice between consenting or getting shut out of services.
Fourth, we can also get smart when it comes to anonymisation. Sometimes, full anonymisation means losing important information, so you can no longer make the links between data. That could make the difference between progress and paralysis. But using pseudonyms can let you analyse large amounts of data: to spot, for example, that people with genetic pattern X also respond well to therapy Y.
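To make the pseudonymisation point concrete, here is a minimal illustrative sketch (not part of the speech; the field names and records are invented): direct identifiers are replaced with a salted, keyed hash, so records about the same person can still be linked across datasets for analysis without handling names directly.

```python
# Illustrative pseudonymisation sketch: replace a direct identifier with a
# keyed hash so records stay linkable for analysis without exposing names.
# Field names and example records are invented for illustration.
import hashlib
import hmac

SECRET_SALT = b"held-separately-by-the-data-controller"

def pseudonym(identifier: str) -> str:
    """Deterministic keyed hash: the same identifier always maps to the same pseudonym."""
    return hmac.new(SECRET_SALT, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def pseudonymise(records, id_field):
    """Return copies of the records with the identifier column replaced by a pseudonym."""
    out = []
    for rec in records:
        rec = dict(rec)
        rec["pseudonym"] = pseudonym(rec.pop(id_field))
        out.append(rec)
    return out

genetics = pseudonymise([{"name": "A. Example", "genetic_pattern": "X"}], "name")
outcomes = pseudonymise([{"name": "A. Example", "therapy": "Y", "response": "good"}], "name")

# The two tables can now be joined on 'pseudonym' to ask whether pattern X
# predicts a good response to therapy Y, without processing names at all.
print(genetics[0]["pseudonym"] == outcomes[0]["pseudonym"])  # True
```

This is pseudonymisation, not full anonymisation: whoever holds the salt can recompute the mapping, which is why, as the speech goes on to note, companies still need to minimise risks and remain accountable.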
So it is understandable why the European Parliament has proposed a more flexible data protection regime for this type of data. Companies would be able to process the data on grounds of legitimate interest, rather than consent. That could make all the positive difference to big data: without endangering privacy.
Of course, in those cases, companies still need to minimise privacy risks. Their internal processes and risk assessments must show how they comply with the guiding principles of data protection law. And – if something does go wrong – the company remains accountable.
Indeed company accountability is another key element of our proposal. And here again we welcome the European Parliament’s efforts to reinforce that. Clearly, you might assure accountability in different ways for different companies. But standards for compliance and processes could make a real difference.
A single data protection law for Europe would be a big step forward. National fortresses and single market barriers just make it harder for Europe to lead in digital, harder for Europe to become the natural home of secure online services. Data protection cannot mean data protectionism. Rather, it means safeguarding privacy does not come at the expense of innovation: with laws both flexible and future proof, pragmatic and proportionate, for a changing world….
But data protection rules are really just the start. They are only part of our response to the Snowden revelations….”