Rethinking Personal Data: A New Lens for Strengthening Trust


New report from the World Economic Forum: “As we look at the dynamic change shaping today’s data-driven world, one thing is becoming increasingly clear. We really do not know that much about it. Polarized along competing but fundamental principles, the global dialogue on personal data is inchoate and pulled in a variety of directions. It is complicated, conflated and often fueled by emotional reactions more than informed understandings.
The World Economic Forum’s global dialogue on personal data seeks to cut through this complexity. A multi-year initiative with global insights from the highest levels of leadership from industry, governments, civil society and academia, this work aims to articulate an ascendant vision of the value a balanced and human-centred personal data ecosystem can create.
Yet despite these aspirations, there is a crisis in trust. Concerns are voiced from a variety of viewpoints at a variety of scales. Industry, government and civil society are all uncertain about how to create a personal data ecosystem that is adaptive, reliable, trustworthy and fair.
The shared anxieties stem from the overwhelming challenge of transitioning into a hyperconnected world. The growth of data, the sophistication of ubiquitous computing and the borderless flow of data are all outstripping the ability to effectively govern on a global basis. We need the means to effectively uphold fundamental principles in ways fit for today’s world.
Yet despite the size and scope of the complexity, it cannot become a reason for inaction. The need for pragmatic and scalable approaches which strengthen transparency, accountability and the empowerment of individuals has become a global priority.
Tools are needed to answer fundamental questions: Who has the data? Where is the data? What is being done with it? All of these uncertainties need to be addressed for meaningful progress to occur.
Objectives need to be set. The benefits and harms of using personal data need to be more precisely defined. The ambiguity surrounding privacy needs to be demystified and placed into a real-world context.
Individuals need to be meaningfully empowered. Better engagement over how data is used by third parties is one opportunity for strengthening trust. Supporting the ability for individuals to use personal data for their own purposes is another area for innovation and growth. But combined, the overall lack of engagement is undermining trust.
Collaboration is essential. Interdisciplinary collaboration between technologists, business leaders, social scientists, economists and policy-makers is vital. The complexities of delivering a sustainable and balanced personal data ecosystem require that these multifaceted perspectives all be taken into consideration.
With a new lens for using personal data, progress can occur.

Figure 1: A new lens for strengthening trust

Source: World Economic Forum

Obama Signs Nation's First 'Open Data' Law


William Welsh in Information Week: “President Barack Obama enacted the nation’s first open data law, signing into law on May 9 bipartisan legislation that requires federal agencies to publish their spending data in a standardized, machine-readable format that the public can access through USASpending.gov.
The Digital Accountability and Transparency Act of 2014 (S. 994) amends the eight-year-old Federal Funding Accountability and Transparency Act to make available to the public specific classes of federal agency spending data “with more specificity and at a deeper level than is currently reported,” a White House statement said….
Advocacy groups applauded the bipartisan legislation, which is being heralded as the nation’s first open data law and furnishes a legislative mandate for Obama’s one-year-old Open Data Policy.
“The DATA Act will unlock a new public resource that innovators, watchdogs, and citizens can mine for valuable and unprecedented insight into federal spending,” said Hudson Hollister, executive director of the Data Transparency Coalition. “America’s tech sector already has the tools to deliver reliable, standardized, open data. [The] historic victory will put our nation’s open data pioneers to work for the common good.”
The DATA Act requires agencies to establish government-wide standards for financial data, adopt accounting approaches developed by the Recovery Act’s Recovery Accountability and Transparency Board (RATB), and streamline agency reporting requirements.
The DATA Act empowers the Secretary of the Treasury to establish a data analytics center, which is modeled on the successful Recovery Operations Center. The new center will support inspectors general and law enforcement agencies in criminal and other investigations, as well as agency program offices in the prevention of improper payments. Assets of the RATB related to the Recovery Operations Center would transfer to the Treasury Department when the board’s authorization expires.
The treasury secretary and the Director of the White House’s Office of Management and Budget are jointly tasked with establishing the standards required to achieve the goals and objectives of the new statute.
To ensure that agencies comply with the reporting requirements, agency inspectors general will report on the quality and accuracy of the financial data provided to USASpending.gov. The Government Accountability Office also will report on the data quality and accuracy and create a Government-wide assessment of the financial data reported…”

Believe the hype: Big data can have a big social impact


Annika Small at the Guardian: “Given all the hype around so-called big data at the moment, it would be easy to dismiss it as nothing more than the latest technology buzzword. This would be a mistake, given that the application and interpretation of huge – often publicly available – data sets is already supporting new models of creativity, innovation and engagement.
To date, stories of big data’s progress and successes have tended to come from government and the private sector, but we’ve heard little about its relevance to social organisations. Yet big data can fuel big social change.
It’s already playing a vital role in the charitable sector. Some social organisations are using existing open government data to better target their services, to improve advocacy and fundraising, and to support knowledge sharing and collaboration between different charities and agencies. Crowdsourcing of open data also offers a new way for not-for-profits to gather intelligence, and there is a wide range of freely available online tools to help them analyse the information.
However, realising the potential of big and open data presents a number of technical and organisational challenges for social organisations. Many don’t have the required skills, awareness and investment to turn big data to their advantage. They also tend to lack access to examples that might help demystify the technicalities and focus on achievable results.
Overcoming these challenges can be surprisingly simple: Keyfund, for example, gained insight into what made for a successful application to their scheme by using a free online tool to create word clouds out of all the text in their application forms. Many social organisations could use this same technique to better understand the large volume of unstructured text that they accumulate – in doing so, they would be “doing big data” (albeit in a small way). At the other end of the scale, Global Giving has developed its own sophisticated set of analytical tools to better understand the 57,000+ “stories” gathered from its network.
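The Keyfund technique is simple enough to sketch in a few lines of code: counting word frequencies across a folder of application texts is essentially what a word-cloud tool does before it draws anything. The Python snippet below is a minimal illustration under assumed inputs; the folder name, file layout and stop-word list are placeholders rather than Keyfund’s actual setup.

    # Count the most common words across a folder of application texts,
    # in the spirit of the word-cloud example above.
    # (Folder name and stop-word list are illustrative assumptions.)
    import glob
    import re
    from collections import Counter

    STOP_WORDS = {"the", "and", "to", "of", "a", "in", "for", "we", "our", "is"}

    counts = Counter()
    for path in glob.glob("applications/*.txt"):  # hypothetical folder of forms
        with open(path, encoding="utf-8") as f:
            words = re.findall(r"[a-z']+", f.read().lower())
            counts.update(w for w in words if w not in STOP_WORDS)

    # The highest-ranked terms are what a word-cloud generator would draw largest.
    for word, n in counts.most_common(20):
        print(f"{word}: {n}")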
Innovation often happens when different disciplines collide and it’s becoming apparent that most value – certainly most social value – is likely to be created at the intersection of government, private and social sector data. That could be the combination of data from different sectors, or better “data collaboration” within sectors.
The Housing Association Charitable Trust (HACT) has produced two original tools that demonstrate this. Its Community Insight tool combines data from different sectors, allowing housing providers easily to match information about their stock to a large store of well-maintained open government figures. Meanwhile, its Housing Big Data programme is building a huge dataset by combining stats from 16 different housing providers across the UK. While Community Insight allows each organisation to gain better individual understanding of their communities (measuring well-being and deprivation levels, tracking changes over time, identifying hotspots of acute need), Housing Big Data is making progress towards a much richer network of understanding, providing a foundation for the sector to collaboratively identify challenges and quantify the impact of their interventions.
Alongside this specific initiative from HACT, it’s also exciting to see programmes such as 360giving, which forge connections between a range of private and social enterprises, and lay foundations for UK social investors to be a significant source of information over the next decade. Certainly, The Big Lottery Fund’s publication of open data late last year is a milestone which also highlights how far we have to travel as a sector before we are truly “data-rich”.
At Nominet Trust, we have produced the Social Tech Guide to demonstrate the scale and diversity of social value being generated internationally – much of which is achieved through harnessing the power of big data. From Knewton creating personally tailored learning programmes, to Cellslider using the power of the crowd to advance cancer research, there is no shortage of inspiration. The UN’s Global Pulse programme is another great example, with its focus on how we can combine private and public sources to pin down the size and shape of a social challenge, and calibrate our collective response.
These examples of data-driven social change demonstrate the huge opportunities for social enterprises to harness technology to generate insights, to drive more effective action and to fuel social change. If we are to realise this potential, we need to continue to stretch ourselves as social enterprises and social investors.”

Continued Progress and Plans for Open Government Data


Steve VanRoekel and Todd Park at the White House: “One year ago today, President Obama signed an executive order that made open and machine-readable data the new default for government information. This historic step is helping to make government-held data more accessible to the public and to entrepreneurs while appropriately safeguarding sensitive information and rigorously protecting privacy.
Freely available data from the U.S. government is an important national resource, serving as fuel for entrepreneurship, innovation, scientific discovery, and economic growth. Making information about government operations more readily available and useful is also core to the promise of a more efficient and transparent government. This initiative is a key component of the President’s Management Agenda and our efforts to ensure the government is acting as an engine to expand economic growth and opportunity for all Americans. The Administration is committed to driving further progress in this area, including by designating Open Data as one of our key Cross-Agency Priority Goals.
Over the past few years, the Administration has launched a number of Open Data Initiatives aimed at scaling up open data efforts across the Health, Energy, Climate, Education, Finance, Public Safety, and Global Development sectors. The White House has also launched Project Open Data, designed to share best practices, examples, and software code to assist federal agencies with opening data. These efforts have helped unlock troves of valuable data—that taxpayers have already paid for—and are making these resources more open and accessible to innovators and the public.
Other countries are also opening up their data. In June 2013, President Obama and other G7 leaders endorsed the Open Data Charter, in which the United States committed to publish a roadmap for our nation’s approach to releasing and improving government data for the public.
Building upon the Administration’s Open Data progress, and in fulfillment of the Open Data Charter, today we are excited to release the U.S. Open Data Action Plan. The plan includes a number of exciting enhancements and new data releases planned in 2014 and 2015, including:

  • Small Business Data: The Small Business Administration’s (SBA) database of small business suppliers will be enhanced so that software developers can create tools to help manufacturers more easily find qualified U.S. suppliers, ultimately reducing the transaction costs to source products and manufacture domestically.
  • Smithsonian American Art Museum Collection: The Smithsonian American Art Museum’s entire digitized collection will be opened to software developers to make educational apps and tools. Today, even museum curators do not have easily accessible information about their art collections. This information will soon be available to everyone.
  • FDA Adverse Drug Event Data: Each year, healthcare professionals and consumers submit millions of individual reports on drug safety to the Food and Drug Administration (FDA). These anonymous reports are a critical tool to support drug safety surveillance. Today, this data is only available through limited quarterly reports. But the Administration will soon be making these reports available in their entirety so that software developers can build tools to help pull potentially dangerous drugs off shelves faster than ever before.

We look forward to implementing the U.S. Open Data Action Plan, and to continuing to work with our partner countries in the G7 to take the open data movement global”.

Can Big Data Stop Wars Before They Happen?


Foreign Policy: “It has been almost exactly two decades since conflict prevention shot to the top of the peace-building agenda, as large-scale killings shifted from interstate wars to intrastate and intergroup conflicts. What could we have done to anticipate and prevent the 100 days of genocidal killing in Rwanda that began in April 1994 or the massacre of thousands of Bosnian Muslims at Srebrenica just over a year later? The international community recognized that conflict prevention could no longer be limited to diplomatic and military initiatives, but that it also requires earlier intervention to address the causes of violence between nonstate actors, including tribal, religious, economic, and resource-based tensions.
For years, even as it was pursued as doggedly as personnel and funding allowed, early intervention remained elusive, a kind of Holy Grail for peace-builders. This might finally be changing. The rise of data on social dynamics and what people think and feel — obtained through social media, SMS questionnaires, increasingly comprehensive satellite information, news-scraping apps, and more — has given the peace-building field hope of harnessing a new vision of the world. But to cash in on that hope, we first need to figure out how to understand all the numbers and charts and figures now available to us. Only then can we expect to predict and prevent events like the recent massacres in South Sudan or the ongoing violence in the Central African Republic.
A growing number of initiatives have tried to make it across the bridge between data and understanding. They’ve ranged from small nonprofit shops of a few people to massive government-funded institutions, and they’ve been moving forward in fits and starts. Few of these initiatives have been successful in documenting incidents of violence actually averted or stopped. Sometimes that’s simply because violence or absence of it isn’t verifiable. The growing literature on big data and conflict prevention today is replete with caveats about “overpromising and underdelivering” and the persistent gap between early warning and early action. In the case of the Conflict Early Warning and Response Mechanism (CEWARN) system in the Horn of Africa — one of the earlier and most prominent attempts at early intervention — it is widely accepted that the project largely failed to use the data it retrieved for effective conflict management. It relied heavily on technology to produce large databases, while lacking the personnel to effectively analyze them or take meaningful early action.
To be sure, disappointments are to be expected when breaking new ground. But they don’t have to continue forever. This pioneering work demands not just data and technology expertise. Also critical is cross-discipline collaboration between the data experts and the conflict experts, who know intimately the social, political, and geographic terrain of different locations. What was once a clash of cultures over the value and meaning of metrics when it comes to complex human dynamics needs to morph into collaboration. This is still pretty rare, but if the past decade’s innovations are any prologue, we are hopefully headed in the right direction.
* * *
Over the last three years, the U.S. Defense Department, the United Nations, and the CIA have all launched programs to parse the masses of public data now available, scraping and analyzing details from social media, blogs, market data, and myriad other sources to achieve variations of the same goal: anticipating when and where conflict might arise. The Defense Department’s Information Volume and Velocity program is designed to use “pattern recognition to detect trends in a sea of unstructured data” that would point to growing instability. The U.N.’s Global Pulse initiative’s stated goal is to track “human well-being and emerging vulnerabilities in real-time, in order to better protect populations from shocks.” The Open Source Indicators program at the CIA’s Intelligence Advanced Research Projects Activity aims to anticipate “political crises, disease outbreaks, economic instability, resource shortages, and natural disasters.” Each looks to the growing stream of public data to detect significant population-level changes.
Large institutions with deep pockets have always been at the forefront of efforts in the international security field to design systems for improving data-driven decision-making. They’ve followed the lead of large private-sector organizations where data and analytics rose to the top of the corporate agenda. (In that sector, the data revolution is promising “to transform the way many companies do business, delivering performance improvements not seen since the redesign of core processes in the 1990s,” as David Court, a director at consulting firm McKinsey, has put it.)
What really defines the recent data revolution in peace-building, however, is that it is transcending size and resource limitations. It is finding its way to small organizations operating at local levels and using knowledge and subject experts to parse information from the ground. It is transforming the way peace-builders do business, delivering data-led programs and evidence-based decision-making not seen since the field’s inception in the latter half of the 20th century.
One of the most famous recent examples is the 2013 Kenyan presidential election.
In March 2013, the world was watching and waiting to see whether the vote would produce more of the violence that had left at least 1,300 people dead and 600,000 homeless during and after the 2007 elections. In the intervening years, a web of NGOs worked to set up early-warning and early-response mechanisms to defuse tribal rivalries, party passions, and rumor-mongering. Many of the projects were technology-based initiatives trying to leverage data sources in new ways — including a collaborative effort spearheaded and facilitated by a Kenyan nonprofit called Ushahidi (“witness” in Swahili) that designs open-source data collection and mapping software. The Umati (meaning “crowd”) project used an Ushahidi program to monitor media reports, tweets, and blog posts to detect rising tensions, frustration, calls to violence, and hate speech — and then sorted and categorized it all on one central platform. The information fed into election-monitoring maps built by the Ushahidi team, while mobile-phone provider Safaricom donated 50 million text messages to a local peace-building organization, Sisi ni Amani (“We are Peace”), so that it could act on the information by sending texts — which had been used to incite and fuel violence during the 2007 elections — aimed at preventing violence and quelling rumors.
The first challenges came around 10 a.m. on the opening day of voting. “Rowdy youth overpowered police at a polling station in Dandora Phase 4,” one of the informal settlements in Nairobi that had been a site of violence in 2007, wrote Neelam Verjee, programs manager at Sisi ni Amani. The young men were blocking others from voting, and “the situation was tense.”
Sisi ni Amani sent a text blast to its subscribers: “When we maintain peace, we will have joy & be happy to spend time with friends & family but violence spoils all these good things. Tudumishe amani [“Maintain the peace”] Phase 4.” Meanwhile, security officers, who had been called separately, arrived at the scene and took control of the polling station. Voting resumed with little violence. According to interviews collected by Sisi ni Amani after the vote, the message “was sent at the right time” and “helped to calm down the situation.”
In many ways, Kenya’s experience is the story of peace-building today: Data is changing the way professionals in the field think about anticipating events, planning interventions, and assessing what worked and what didn’t. But it also underscores the possibility that we might be edging closer to a time when peace-builders at every level and in all sectors — international, state, and local, governmental and not — will have mechanisms both to know about brewing violence and to save lives by acting on that knowledge.
Three important trends underlie the optimism. The first is the sheer amount of data that we’re generating. In 2012, humans plugged into digital devices managed to generate more data in a single year than over the course of world history — and that rate more than doubles every year. As of 2012, 2.4 billion people — 34 percent of the world’s population — had a direct Internet connection. The growth is most stunning in regions like the Middle East and Africa where conflict abounds; access has grown 2,634 percent and 3,607 percent, respectively, in the last decade.
The growth of mobile-phone subscriptions, which allow their owners to be part of new data sources without a direct Internet connection, is also staggering. In 2013, there were almost as many cell-phone subscriptions in the world as there were people. In Africa, there were 63 subscriptions per 100 people, and there were 105 per 100 people in the Arab states.
The second trend has to do with our expanded capacity to collect and crunch data. Not only do we have more computing power enabling us to produce enormous new data sets — such as the Global Database of Events, Language, and Tone (GDELT) project, which tracks almost 300 million conflict-relevant events reported in the media between 1979 and today — but we are also developing more-sophisticated methodological approaches to using these data as raw material for conflict prediction. New machine-learning methodologies, which use algorithms to make predictions (like a spam filter, but much, much more advanced), can provide “substantial improvements in accuracy and performance” in anticipating violent outbreaks, according to Chris Perry, a data scientist at the International Peace Institute.
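The “spam filter” analogy can be made concrete with a toy sketch: a classifier is trained on historical, district-level indicators and then asked to score the risk of violence in a new period. Everything below (the feature names, the figures and the choice of logistic regression) is an illustrative assumption, not the methodology Perry or the GDELT project actually uses.

    # Toy illustration of the "spam filter" analogy: train a classifier on
    # historical district-level indicators and estimate the probability of a
    # violent outbreak in a new period. Features, figures and the model
    # choice are illustrative assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row describes one district-month:
    # [protest_events, hate_speech_posts, food_price_shock_index]
    X_train = np.array([
        [12, 40, 0.8],
        [ 2,  5, 0.1],
        [20, 90, 1.2],
        [ 1,  3, 0.0],
        [15, 60, 0.9],
        [ 3,  8, 0.2],
    ])
    # 1 = a violent outbreak followed, 0 = it did not.
    y_train = np.array([1, 0, 1, 0, 1, 0])

    model = LogisticRegression().fit(X_train, y_train)

    # Score a new district-month and report the estimated outbreak probability.
    X_new = np.array([[10, 55, 0.7]])
    print("estimated outbreak risk:", model.predict_proba(X_new)[0, 1])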
This brings us to the third trend: the nature of the data itself. When it comes to conflict prevention and peace-building, progress is not simply a question of “more” data, but also different data. For the first time, digital media — user-generated content and online social networks in particular — tell us not just what is going on, but also what people think about the things that are going on. Excitement in the peace-building field centers on the possibility that we can tap into data sets to understand, and preempt, the human sentiment that underlies violent conflict.
Realizing the full potential of these three trends means figuring out how to distinguish between the information, which abounds, and the insights, which are actionable. It is a distinction that is especially hard to make because it requires cross-discipline expertise that combines the wherewithal of data scientists with that of social scientists and the knowledge of technologists with the insights of conflict experts.

How Helsinki Became the Most Successful Open-Data City in the World


Olli Sulopuisto in Atlantic Cities: “If there’s something you’d like to know about Helsinki, someone in the city administration most likely has the answer. For more than a century, this city has funded its own statistics bureaus to keep data on the population, businesses, building permits, and most other things you can think of. Today, that information is stored and made freely available on the internet by an appropriately named agency, City of Helsinki Urban Facts.
There’s a potential problem, though. Helsinki may be Finland’s capital and largest city, with 620,000 people. But it’s only one of more than a dozen municipalities in a metropolitan area of almost 1.5 million. So in terms of urban data, if you’re only looking at Helsinki, you’re missing out on more than half of the picture.
Helsinki and three of its neighboring cities are now banding together to solve that problem. Through an entity called Helsinki Region Infoshare, they are bringing together their data so that a fuller picture of the metro area can come into view.
That’s not all. At the same time these datasets are going regional, they’re also going “open.” Helsinki Region Infoshare publishes all of its data in formats that make it easy for software developers, researchers, journalists and others to analyze, combine or turn into web-based or mobile applications that citizens may find useful. In four years of operation, the project has produced more than 1,000 “machine-readable” data sources such as a map of traffic noise levels, real-time locations of snow plows, and a database of corporate taxes.
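What “machine-readable” buys developers can be shown in a few lines: once a dataset is published as structured CSV or JSON, a short script can fetch, filter and aggregate it, or feed it into an application. The URL and column name in this sketch are hypothetical placeholders, not actual Helsinki Region Infoshare endpoints.

    # Minimal sketch of consuming a machine-readable open dataset.
    # The URL and the "noise_db" column are hypothetical placeholders.
    import csv
    import io
    import urllib.request

    URL = "https://example.org/hri/traffic-noise.csv"  # hypothetical dataset URL

    with urllib.request.urlopen(URL) as resp:
        reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
        rows = list(reader)

    # Because the format is structured, the same few lines work for any dataset:
    # filter it, aggregate it, or hand the rows to a web or mobile app.
    loud = [r for r in rows if float(r["noise_db"]) > 65]
    print(f"{len(loud)} of {len(rows)} measurement points exceed 65 dB")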
A global leader
All of this has put the Helsinki region at the forefront of the open-data movement that is sweeping cities across much of the world. The concept is that all kinds of good things can come from assembling city data, standardizing it and publishing it for free. Last month, Helsinki Region Infoshare was presented with the European Commission’s prize for innovation in public administration.

The project is creating transparency in government and a new digital commons. It’s also fueling a small industry of third-party application developers who take all this data and turn it into consumer products.
For example, Helsinki’s city council has a paperless system called Ahjo for handling its agenda items, minutes and exhibits that accompany council debates. Recently, the datasets underlying Ahjo were opened up. The city built a web-based interface for browsing the documents, but a software developer who doesn’t even live in Helsinki created a smartphone app for it. Now anyone who wants to keep up with just about any decision Helsinki’s leaders have before them can do so easily.
Another example is a product called BlindSquare, a smartphone app that helps blind people navigate the city. An app developer took the Helsinki region’s data on public transport and services, and mashed it up with location data from the social networking app Foursquare as well as mapping tools and the GPS and artificial voice capabilities of new smartphones. The product now works in dozens of countries and languages and sells for about €17 ($24 U.S.)

Helsinki also runs competitions for developers who create apps with public-sector data. That’s nothing new — BlindSquare won the Apps4Finland and European OpenCities app challenges in 2012. But this year, they’re trying a new approach to the app challenge concept, funded by the European Commission’s prize money and Sitra.
It’s called Datademo. Instead of looking for polished but perhaps random apps to heap fame and prize money on, Datademo is trying to get developers to aim their creative energies toward general goals city leaders think are important. The current competition specifies that apps have to use open data from the Helsinki region or from Finland to make it easier for citizens to find information and participate in democracy. The competition also gives developers seed funding upfront.
Datademo received more than 40 applications in its first round. Of those, the eight best suggestions were given three months and €2,000 ($2,770 U.S.) to implement their ideas. The same process will be repeated twice, resulting in dozens of new app ideas that will get a total of €48,000 ($66,000 U.S.) in development subsidies. Keeping with the spirit of transparency, the voting and judging process is open to all who submit an idea for each round….”

Is Your City’s Crime Data Private Property?


Adam Wisnieski at the Crime Report: “In February, the Minneapolis Police Department (MPD) announced it was moving into a new era of transparency and openness with the launch of a new public crime map.
“Crime analysis and mapping data is now in the hands of the city’s citizens,” reads the first line of the press release.
According to the release, the MPD will feed incident report data to RAIDS (Regional Analysis and Information Data Sharing) Online, a nationwide crime map operated by crime analysis software company BAIR Analytics.
Since the announcement, Minneapolis residents have used RAIDS to look at reports of murder, robbery, burglary, assault, rape and other crimes reported in their neighborhoods on a sleek, easy-to-use map, which includes data as recent as yesterday.
On the surface, it’s a major leap forward for transparency in Minneapolis. But some question why the data feed is given exclusively to a single private company.
Transparency advocates argue in fact that the data is not truly in the hands of the city’s residents until citizens can download the raw data so they can analyze, chart or map it on their own.
“For it to actually be open data, it needs to be available to the public in machine readable format,” said Lauren Reid, senior public affairs manager for Code for America, a national non-profit that promotes participation in government through technology.
“Anybody should be able to go download it and read it if they want. That’s open data.”
The Open Knowledge Foundation, a national non-profit that advocates for more government openness, argues open data is important so citizens can participate and engage government in a way that was not possible before.
“Much of the time, citizens are only able to engage with their own governance sporadically — maybe just at an election every 4 or 5 years,” reads the Open Knowledge website. “By opening up data, citizens are enabled to be much more directly informed and involved in decision-making.
“This is more than transparency: it’s about making a full ‘read/write’ society — not just about knowing what is happening in the process of governance, but being able to contribute to it.”
Minneapolis is not alone.
As Americans demand more information on criminal activity from the government, police departments are flocking to private companies to help them get the information into the public domain.
For many U.S. cities, hooking up with these third-party mapping vendors is the most public their police department has ever been. But the trend has started a messy debate about how “public” the public data actually is.
Outsourcing Makes It Easy
For police departments, outsourcing the presentation of their crime data to a private firm is an easy decision.
Most of the crime mapping sites are free or cost very little. (The Omega Group’s CrimeMapping.com charges between $600 and $2,400 per year, depending on the size of the agency.)
The department chooses what information it wants to provide. Once the system is set up, the data flows to the companies and then to the public without a lot of effort on the part of the department.
For the most part, the move doesn’t need legislative approval, just a memorandum of understanding. A police department can even fulfill a new law requiring a public crime map by releasing report data through one of these vendors.
Commander Scott Gerlicher of the MPD’s Strategic Information and Crime Analysis Division says the software has saved the department time.
“I don’t think we are entertaining quite as many requests from the media or the public,” he told The Crime Report. “Plus the price was right: it was free.”
The companies that run some of the most popular sites — The Omega Group’s CrimeMapping.com, Public Engines’ CrimeReports and BAIR Analytics’ RAIDS — are in the business of selling crime analysis and mapping software to police departments.
Some departments buy internal software from these companies, though some cities, like Minneapolis, just use RAIDS’ free map and have no contracts with BAIR for internal software.
Susan Smith, director of operations at BAIR Analytics, said the goal of RAIDS is to create one national map that includes all crime reports from across all jurisdictions and departments (state and local police).
For people who live near or at the edge of a city line, finding relevant crime data can be hard.
The MPD’s Gerlicher said that was one reason his department chose RAIDS — because many police agencies in the Minneapolis area had already hooked up with the firm.
The operators of these crime maps say they provide a community service.
“We try to get as many agencies as we possibly can. We truly believe this is a good service for the community,” says Gabriela Coverdale, a marketing director at the Omega Group.
Raw Data ‘Off Limits’
However, the sites do not allow the public to download any of the raw data and prohibit anyone from “scraping,” using a program to automatically pull the data from their maps.
In Minneapolis, the police department continues to post PDFs and Excel spreadsheets with data, but only RAIDS gets a feed with the most recent data.
Alan Palazzolo, a Code for America fellow who works as an interactive developer for the online non-profit newspaper MinnPost, used monthly reports from the MPD to build a crime application with a map and geographic-oriented chart of crime in Minneapolis.
Nevertheless, he finds the new tool limiting.
“[The MPD’s] ability to actually put out more data, and more timely data, really opens things up,” he said. “It’s great, but they are not doing that with us.”
According to Palazzolo, the arrangement gives BAIR a market advantage that effectively prevents its data from being used for purposes it cannot control.
“Having granular, complete, and historical data would allow us to do more in-depth analysis,” wrote Palazzolo and Kaeti Hinck in an article in MinnPost last year.
“Granular data would allow us to look at smaller areas,” reads the article. “[N]eighborhoods are a somewhat arbitrary boundary when it comes to crime. Often high crime is isolated to a couple of blocks, but aggregated data does not allow us to explore this.
“More complete data would allow us to look at factors like exact locations, time of day, demographic issues, and detailed categories (like bike theft).”
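A short sketch shows the kind of analysis Palazzolo and Hinck are describing, assuming incident-level reports were released as a raw CSV file; the file name and column names here are hypothetical.

    # Analyses that granular, machine-readable crime data would allow:
    # hotspots by block, patterns by hour, and detailed categories.
    # The file name and column names are hypothetical placeholders.
    import pandas as pd

    incidents = pd.read_csv("incidents.csv", parse_dates=["reported_at"])

    # Hotspots: incidents per block rather than per neighbourhood.
    by_block = incidents.groupby("block_id").size().sort_values(ascending=False)
    print(by_block.head(10))

    # Time of day: incidents per hour across the city.
    by_hour = incidents.groupby(incidents["reported_at"].dt.hour).size()
    print(by_hour)

    # Detailed categories, e.g. bike theft only.
    bike_theft = incidents[incidents["category"] == "bike theft"]
    print(len(bike_theft), "bike thefts reported")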
The question of preference gets even messier when looking at another national crime mapping website called SpotCrime.
Unlike the other third-party mapping sites, SpotCrime is not in the business of selling crime analysis software to police departments. It operates more like a newspaper — a newspaper focused solely on the police blotter pages — and makes money off advertising.
Years ago, SpotCrime requested and received crime report data via e-mail from the Minneapolis Police Department and mapped the data on its website. According to SpotCrime owner Colin Drane, the MPD stopped sending e-mails when terminals were set up in the police department for the public to access the data.
So he instead started going through the painstaking process of transferring data from PDFs the MPD posted online and mapping them.
When the MPD hooked up with RAIDS in February, Drane asked for the same feed and was denied. He says more and more police departments around the country are hooking up with one of his competitors and not giving him the same timely data.
The MPD said it prefers RAIDS over SpotCrime and criticized some of the advertisements on SpotCrime.
“We’re not about supporting ad money,” said Gerlicher.
Drane believes all crime data in every city should be open to everyone, in order to prevent any single firm from monopolizing how the information is presented and used.
“The onus needs to be on the public agencies,” he adds. “They need to be fair with the data and they need to be fair with the public.”
Transparency advocates worry that the trend is going in the opposite direction.
Ohio’s Columbus Police Department, for example, recently discontinued its public crime statistic feed and started giving the data exclusively to RAIDS.
The Columbus Dispatch wrote that the new system had less information than the old…”

United States federal government use of crowdsourcing grows six-fold since 2011


at E Pluribus Unum: “Citizensourcing and open innovation can work in the public sector, just as crowdsourcing can in the private sector. Around the world, the use of prizes to spur innovation has been booming for years. The United States of America has been significantly scaling up its use of prizes and challenges to solve grand national challenges since January 2011, when President Obama signed an updated version of the America COMPETES Act into law.
According to the third congressionally mandated report released by the Obama administration today (PDF/Text), the number of prizes and challenges conducted under the America COMPETES Act has increased by 50% since 2012, 85% since 2012, and nearly six-fold overall since 2011. Twenty-five different federal agencies offered prizes under COMPETES in fiscal year 2013, with 87 prize competitions in total. The size of the prize purses has grown as well, with 11 challenges over $100,000 in 2013. Nearly half of the prizes conducted in FY 2013 were focused on software, including applications, data visualization tools, and predictive algorithms. Challenge.gov, the award-winning online platform for crowdsourcing national challenges, now has tens of thousands of users who have participated in more than 300 public-sector prize competitions. Beyond the growth in prize numbers and amounts, the Obama administration highlighted four trends in public-sector prize competitions:

  • New models for public engagement and community building during competitions
  • Growth in software and information technology challenges, with nearly 50% of the total prizes in this category
  • More emphasis on sustainability and “creating a post-competition path to success”
  • Increased focus on identifying novel approaches to solving problems

The growth of open innovation in and by the public sector was directly enabled by Congress and the White House, working together for the common good. Congress reauthorized COMPETES in 2010 with an amendment to Section 105 of the act that added a Section 24 on “Prize Competitions,” providing all agencies with the authority to conduct prizes and challenges that only NASA and DARPA had previously enjoyed, and the White House Office of Science and Technology Policy (OSTP) has been guiding its implementation and providing guidance on the use of challenges and prizes to promote open government.
“This progress is due to important steps that the Obama Administration has taken to make prizes a standard tool in every agency’s toolbox,” wrote Cristin Dorgelo, assistant director for grand challenges in OSTP, in a WhiteHouse.gov blog post on engaging citizen solvers with prizes:

In his September 2009 Strategy for American Innovation, President Obama called on all Federal agencies to increase their use of prizes to address some of our Nation’s most pressing challenges. Those efforts have expanded since the signing of the America COMPETES Reauthorization Act of 2010, which provided all agencies with expanded authority to pursue ambitious prizes with robust incentives.
To support these ongoing efforts, OSTP and the General Services Administration have trained over 1,200 agency staff through workshops, online resources, and an active community of practice. And NASA’s Center of Excellence for Collaborative Innovation (COECI) provides a full suite of prize implementation services, allowing agencies to experiment with these new methods before standing up their own capabilities.

Sun Microsystems co-founder Bill Joy famously once said that “No matter who you are, most of the smartest people work for someone else.” This rings true in and outside of government. The idea of governments using prizes like this to inspire technological innovation, however, is not reliant on Web services and social media, nor was it born from the fertile mind of a Silicon Valley entrepreneur. As the introduction to the third White House prize report notes:

“One of the most famous scientific achievements in nautical history was spurred by a grand challenge issued in the 18th Century. The issue of safe, long distance sea travel in the Age of Sail was of such great importance that the British government offered a cash award of £20,000 to anyone who could invent a way of precisely determining a ship’s longitude. The Longitude Prize, enacted by the British Parliament in 1714, would be worth some £30 million today, but even by that measure the value of the marine chronometer invented by British clockmaker John Harrison might be considered a bargain.”

Centuries later, the Internet, World Wide Web, mobile devices and social media offer the best platforms in history for this kind of approach to solving grand challenges and catalyzing civic innovation, helping public officials and businesses find new ways to solve old problems. When a new idea, technology or methodology challenges and improves upon existing processes and systems, it can improve the lives of citizens or the function of the society that they live within….”

#Bring back our girls


The Guardian: “The abduction of more than 200 schoolgirls in Nigeria has led to campaigns calling for their rescue, on social media and offline, all around the world.
After Nigerian protestors marched on parliament in the capital Abuja calling for action on April 30, people in cities around the world followed suit and organised their own marches.
A social media campaign under the hashtag #Bringbackourgirls started trending in Nigeria two weeks ago and has now been tweeted more than one million times. It was first used on April 23 at the opening ceremony for a UNESCO event honouring the Nigerian city of Port Harcourt as the 2014 World Book Capital City. A Nigerian lawyer in Abuja, Ibrahim M. Abdullahi, tweeted the call made in a speech by Dr. Oby Ezekwesili, Vice President of the World Bank for Africa, to “Bring Back the Girls!”

Another mass demonstration took place outside the Nigerian Defence Headquarters in Abuja on May 6 and many other protests have been organised in response to a social media campaign asking for people around the world to march and wear red in solidarity. People came out in protest at the Nigerian embassy in London, in Los Angeles and New York.

A global “social media march” has also been organised asking supporters to use their networks to promote the campaign for 200 minutes on May 8.
A petition started on Change.org by a Nigerian woman in solidarity with the schoolgirls has now been signed by more than 300,000 supporters.
Amnesty International and UNICEF have backed the campaign, as well as world leaders and celebrities, including Hillary Clinton, Malala Yousafzai and rappers Wyclef Jean and Chris Brown, whose mention of the campaign was retweeted more than 10,000 times.

After three weeks of silence, the Nigerian President Goodluck Jonathan vowed on May 4 to find the schoolgirls, stating: “wherever these girls are, we’ll get them out”. On the same day, John Kerry pledged assistance from the US.”

Change: 19 Key Essays on How the Internet Is Changing Our Lives


Book by (among others) Manuel Castells, David Gelernter, Juan Vázquez, Evgeny Morozov et al.: “Change: 19 Key Essays on How the Internet Is Changing Our Lives is the sixth issue of BBVA’s annual series devoted to exploring the key issues of our time. This year, our chosen theme is the Internet, the single most powerful vector of change in recent history. In the words of Arthur C. Clarke, “Any sufficiently advanced technology is indistinguishable from magic.” The swiftness and reach of the changes wrought by the Internet indeed have a touch of magic about them.
As a tool available to a reasonably wide public, the Internet is only twenty years old, but it is already the fundamental catalyst of the broadest based and fastest technological revolution in history. It is the broadest based because over the past two decades its effects have touched upon practically every citizen in the world. And it is the fastest because its mass adoption is swifter than that of any earlier technology. To put this into perspective – it was only 70 years after the invention of the aeroplane that 100 million people travelled by air; it took 50 years after the invention of the telephone for 100 million people to use this form of communication. The 100-million user mark was achieved by PCs after 14 years. The Internet reached 100 million users after just 7 years. The cycles of adoption of Internet-related technologies are even shorter – Facebook acquired 100 million users in 2 years. It is impossible today to imagine the world without the Internet: it enables us to do things which only a few years ago would have been unthinkable, and impinges on every sphere of our lives.”