Open Government Data Gains Global Momentum


Wyatt Kash in Information Week: “Governments across the globe are deepening their strategic commitments and working more closely to make government data openly available for public use, according to public and private sector leaders who met this week at the inaugural Open Government Data Forum in Abu Dhabi, hosted by the United Nations and the United Arab Emirates, April 28-29.

Data experts from Europe, the Middle East, the US, Canada, Korea, and the World Bank highlighted how one country after another has set in motion initiatives to expand the release of government data and broaden its use. Those efforts are gaining traction thanks to multinational organizations, such as the Open Government Partnership, the Open Data Institute, the World Bank, and the UN’s e-government division, that are working to share practices and standardize open data tools.
In the latest example, the French government announced April 24 that it is joining the Open Government Partnership, a group of 64 countries working jointly to make their governments more open, accountable, and responsive to citizens. The announcement caps a string of policy shifts, which began with the formal release of France’s Open Data Strategy in May 2011 and which parallel similar moves by the US.
The strategy committed France to providing “free access and reuse of public data… using machine-readable formats and open standards,” said Romain Lacombe, head of innovation for the French prime minister’s open government task force, Etalab. The French government is taking steps to end the practice of selling datasets, such as civil and case-law data, and is making them freely reusable. France launched a public data portal, Data.gouv.fr, in December 2011 and joined a G8 initiative to engage with open data innovators worldwide.
For South Korea, open data is not just about achieving greater transparency and efficiency, but is seen as digital fuel for a nation that by 2020 expects to achieve “ambient intelligence… when all humans and things are connected together,” said Dr. YoungSun Lee, who heads South Korea’s National Information Society Agency.
He foresees open data leading to a shift in the ways government will function: from an era of e-government, where information is delivered to citizens, to one where predictive analysis will foster a “creative government,” in which “government provides customized services for each individual.”
The open data movement is also propelling innovative programs in the United Arab Emirates. “The role of open data in directing economic and social decisions pertaining to investments… is of paramount importance” to the UAE, said Dr. Ali M. Al Khouri, director general of the Emirates Identity Authority. It also plays a key role in building public trust and fighting corruption, he said….”

Findings of the Big Data and Privacy Working Group Review


John Podesta at the White House Blog: “Over the past several days, severe storms have battered Arkansas, Oklahoma, Mississippi and other states. Dozens of people have been killed and entire neighborhoods turned to rubble and debris as tornadoes have touched down across the region. Natural disasters like these present a host of challenges for first responders. How many people are affected, injured, or dead? Where can they find food, shelter, and medical attention? What critical infrastructure might have been damaged?
Drawing on open government data sources, including Census demographics and NOAA weather data, along with its own demographic databases, Esri, a geospatial technology company, has created a real-time map showing where the twisters have been spotted and how the storm systems are moving. It has also used these data to show how many people live in the affected area and to summarize potential impacts from the storms. It’s a powerful tool for emergency services and communities. And it’s driven by big data technology.
In January, President Obama asked me to lead a wide-ranging review of “big data” and privacy—to explore how these technologies are changing our economy, our government, and our society, and to consider their implications for our personal privacy. Together with Secretary of Commerce Penny Pritzker, Secretary of Energy Ernest Moniz, the President’s Science Advisor John Holdren, the President’s Economic Advisor Jeff Zients, and other senior officials, our review sought to understand what is genuinely new and different about big data and to consider how best to encourage the potential of these technologies while minimizing risks to privacy and core American values.
Over the course of 90 days, we met with academic researchers and privacy advocates, with regulators and the technology industry, with advertisers and civil rights groups. The President’s Council of Advisors on Science and Technology conducted a parallel study of the technological trends underpinning big data. The White House Office of Science and Technology Policy jointly organized three university conferences at MIT, NYU, and U.C. Berkeley. We issued a formal Request for Information seeking public comment, and hosted a survey to generate even more public input.
Today, we presented our findings to the President. We knew better than to try to answer every question about big data in three months. But we are able to draw important conclusions and make concrete recommendations for Administration attention and policy development in a few key areas.
There are a few technological trends that bear drawing out. The declining cost of collecting, storing, and processing data, combined with new sources of data like sensors, cameras, and geospatial technologies, means that we live in a world of near-ubiquitous data collection. All this data is being crunched at a speed that is increasingly approaching real time, meaning that big data algorithms could soon have immediate effects on decisions being made about our lives.
The big data revolution presents incredible opportunities in virtually every sector of the economy and every corner of society.
Big data is saving lives. Infections are dangerous—even deadly—for many babies born prematurely. By collecting and analyzing millions of data points from a NICU, one study was able to identify factors, like slight increases in body temperature and heart rate, that serve as early warning signs an infection may be taking root—subtle changes that even the most experienced doctors wouldn’t have noticed on their own.
Big data is making the economy work better. Jet engines and delivery trucks now come outfitted with sensors that continuously monitor hundreds of data points and send automatic alerts when maintenance is needed. Utility companies are starting to use big data to predict periods of peak electric demand, adjusting the grid to be more efficient and potentially averting brown-outs.
Big data is making government work better and saving taxpayer dollars. The Centers for Medicare and Medicaid Services have begun using predictive analytics—a big data technique—to flag likely instances of reimbursement fraud before claims are paid. The Fraud Prevention System helps identify the highest-risk health care providers for waste, fraud, and abuse in real time and has already stopped, prevented, or identified $115 million in fraudulent payments.
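CMS has not published the internals of the Fraud Prevention System, but the general shape of predictive claims screening is easy to sketch. Below is a minimal, hypothetical Python illustration (not CMS’s actual system; the providers, specialties, amounts, and 3-sigma threshold are all invented) that flags providers whose billing deviates sharply from their specialty’s norm before payment:

```python
# A hedged, generic sketch of predictive claims screening -- NOT the
# actual CMS Fraud Prevention System. Providers, specialties, dollar
# amounts, and the 3-sigma threshold are all illustrative assumptions.
from statistics import mean, stdev

# (provider_id, specialty, average billed amount in dollars) -- toy data
claims = [
    ("P001", "cardiology", 410.0), ("P002", "cardiology", 395.0),
    ("P003", "cardiology", 405.0), ("P004", "cardiology", 2250.0),
    ("P005", "podiatry", 150.0), ("P006", "podiatry", 160.0),
    ("P007", "podiatry", 155.0), ("P008", "podiatry", 148.0),
]

# Flag any provider billing far above peers in the same specialty,
# comparing against everyone *else* so an outlier cannot hide itself.
for provider, specialty, amount in claims:
    peers = [a for p, s, a in claims if s == specialty and p != provider]
    if len(peers) >= 2 and amount > mean(peers) + 3 * stdev(peers):
        print(f"review before payment: {provider} ({specialty}, ${amount:,.0f})")
```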
But big data raises serious questions, too, about how we protect our privacy and other values in a world where data collection is increasingly ubiquitous and where analysis is conducted at speeds approaching real time. In particular, our review raised the question of whether the “notice and consent” framework, in which a user grants permission for a service to collect and use information about them, still allows us to meaningfully control our privacy as data about us is increasingly used and reused in ways that could not have been anticipated when it was collected.
Big data raises other concerns, as well. One significant finding of our review was the potential for big data analytics to lead to discriminatory outcomes and to circumvent longstanding civil rights protections in housing, employment, credit, and the consumer marketplace.
No matter how quickly technology advances, it remains within our power to ensure that we both encourage innovation and protect our values through law, policy, and the practices we encourage in the public and private sector. To that end, we make six actionable policy recommendations in our report to the President:
Advance the Consumer Privacy Bill of Rights. Consumers deserve clear, understandable, reasonable standards for how their personal information is used in the big data era. We recommend the Department of Commerce take appropriate consultative steps to seek stakeholder and public comment on what changes, if any, are needed to the Consumer Privacy Bill of Rights, first proposed by the President in 2012, and to prepare draft legislative text for consideration by stakeholders and submission by the President to Congress.
Pass National Data Breach Legislation. Big data technologies make it possible to store significantly more data, and further derive intimate insights into a person’s character, habits, preferences, and activities. That makes the potential impacts of data breaches at businesses or other organizations even more serious. A patchwork of state laws currently governs requirements for reporting data breaches. Congress should pass legislation that provides for a single national data breach standard, along the lines of the Administration’s 2011 Cybersecurity legislative proposal.
Extend Privacy Protections to non-U.S. Persons. Privacy is a worldwide value that should be reflected in how the federal government handles personally identifiable information about non-U.S. citizens. The Office of Management and Budget should work with departments and agencies to apply the Privacy Act of 1974 to non-U.S. persons where practicable, or to establish alternative privacy policies that apply appropriate and meaningful protections to personal information regardless of a person’s nationality.
Ensure Data Collected on Students in School is used for Educational Purposes. Big data and other technological innovations, including new online course platforms that provide students real time feedback, promise to transform education by personalizing learning. At the same time, the federal government must ensure educational data linked to individual students gathered in school is used for educational purposes, and protect students against their data being shared or used inappropriately.
Expand Technical Expertise to Stop Discrimination. The detailed personal profiles held about many consumers, combined with automated, algorithm-driven decision-making, could lead—intentionally or inadvertently—to discriminatory outcomes, or what some are already calling “digital redlining.” The federal government’s lead civil rights and consumer protection agencies should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law.
Amend the Electronic Communications Privacy Act. The laws that govern protections afforded to our communications were written before email, the internet, and cloud computing came into wide use. Congress should amend ECPA to ensure the standard of protection for online, digital content is consistent with that afforded in the physical world—including by removing archaic distinctions between email left unread or over a certain age.
We also identify several broader areas ripe for further study, debate, and public engagement that, collectively, we hope will spark a national conversation about how to harness big data for the public good. We conclude that we must find a way to preserve our privacy values in both the domestic and international marketplace. We urgently need to build capacity in the federal government to identify and prevent new modes of discrimination that could be enabled by big data. We must ensure that law enforcement agencies using big data technologies do so responsibly, and that our fundamental privacy rights remain protected. Finally, we recognize that data is a valuable public resource, and call for continuing the Administration’s efforts to open more government data sources and make investments in research and technology.
While big data presents new challenges, it also presents immense opportunities to improve lives, and the United States is perhaps better suited to lead this conversation than any other nation on earth. Our innovative spirit, technological know-how, and deep commitment to values of privacy, fairness, non-discrimination, and self-determination will help us harness the benefits of the big data revolution and encourage the free flow of information while working with our international partners to protect personal privacy. This review is but one piece of that effort, and we hope it spurs a conversation about big data across the country and around the world.
Read the Big Data Report.
See the fact sheet from today’s announcement.

Looking for the Needle in a Stack of Needles: Tracking Shadow Economic Activities in the Age of Big Data


Manju Bansal in MIT Technology Review: “The undocumented guys hanging out in the home-improvement-store parking lot looking for day labor, the neighborhood kids running a lemonade stand, and Al Qaeda terrorists plotting to do harm all have one thing in common: They operate in the underground economy, a shadowy zone where businesses, both legitimate and less so, transact in the currency of opportunity, away from traditional institutions and their watchful eyes.
One might think that this alternative economy is limited to markets that rank low on Transparency International’s index (sub-Saharan Africa and South Asia, for instance). However, a recent University of Wisconsin report estimates the value of the underground economy in the United States at about $2 trillion, roughly 15% of total U.S. GDP. And a 2013 study coauthored by Friedrich Schneider, a noted authority on global shadow economies, estimated the European Union’s underground economy at more than 18% of GDP, or a whopping 2.1 trillion euros. More than two-thirds of the underground activity came from the most developed countries, including Germany, France, Italy, Spain, and the United Kingdom.
Underground economic activity is a multifaceted phenomenon, with implications across the board for national security, tax collections, public-sector services, and more. It includes the activity of any business that relies primarily on old-fashioned cash for most transactions — ranging from legitimate businesses (including lemonade stands) to drug cartels and organized crime.
Though it’s often soiled, heavy to lug around, and easy to lose to theft, cash is still king simply because it is so easy to hide from the authorities. With the help of the right bank or financial institution, “dirty” money can easily be laundered and come out looking fresh and clean, or at least legitimate. Case in point is the global bank HSBC, which agreed to pay U.S. regulators $1.9 billion in fines to settle charges of money laundering on behalf of Mexican drug cartels. According to a U.S. Senate subcommittee report, that process involved transferring $7 billion in cash from the bank’s branches in Mexico to those in the United States. Just for reference, each $100 bill weighs one gram, so to transfer $7 billion, HSBC had to physically transport 70 metric tons of cash across the U.S.-Mexican border.
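The arithmetic behind that figure is worth making explicit; a few lines of Python reproduce it from the two facts given above:

```python
# Back-of-the-envelope check of the 70-metric-ton figure, using only
# the two facts from the Senate subcommittee report cited above.
total_dollars = 7_000_000_000   # cash moved from Mexican to U.S. branches
bill_value = 100                # denomination used in the example
bill_weight_g = 1               # each $100 bill weighs about one gram

bills = total_dollars / bill_value            # 70,000,000 bills
weight_tonnes = bills * bill_weight_g / 1e6   # grams -> metric tons
print(f"{bills:,.0f} bills weighing {weight_tonnes:,.0f} metric tons")
# -> 70,000,000 bills weighing 70 metric tons
```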
The Financial Action Task Force, an intergovernmental body established in 1989, has estimated the total amount of money laundered worldwide to be around 2% to 5% of global GDP. Many of these transactions seem, at first glance, to be perfectly legitimate. Therein lies the conundrum for a banker or a government official: How do you identify, track, control, and, one hopes, prosecute money launderers, when they are hiding in plain sight and their business is couched in networked layers of perfectly defensible legitimacy?
Enter big-data tools, such as those provided by SynerScope, a Netherlands-based startup that is a member of the SAP Startup Focus program. This company’s solutions help unravel the complex networks hidden behind the layers of transactions and interactions.
Networks, good or bad, are nearly omnipresent in almost any form of organized human activity, and particularly in banking and insurance. SynerScope takes both structured and unstructured data fields and transforms them into interactive visuals that display graphic patterns humans can use to make sense of information quickly. The ability to spot deviations in complex networked processes can readily be put to use in fraud detection for insurance, banking, e-commerce, and forensic accounting.
SynerScope’s approach to big-data business intelligence centers on data-intensive computation and visualization that extend the human “sense-making” capacity in much the same way that a telescope or microscope extends human vision.
To understand how SynerScope helps authorities track and halt money laundering, it’s important to understand how the networked laundering process works. It typically involves three stages.
1. In the initial, or placement, stage, launderers introduce their illegal profits into the financial system. This might be done by breaking up large amounts of cash into less-conspicuous smaller sums that are then deposited directly into a bank account, or by purchasing a series of monetary instruments (checks, money orders) that are then collected and deposited into accounts at other locations.
2. After the funds have entered the financial system, the launderer commences the second stage, called layering, which uses a series of conversions or transfers to distance the funds from their sources. The funds might be channeled through the purchase and sales of investment instruments, or the launderer might simply wire the funds through a series of accounts at various banks worldwide. 
Such use of widely scattered accounts for laundering is especially prevalent in those jurisdictions that do not cooperate in anti-money-laundering investigations. Sometimes the launderer disguises the transfers as payments for goods or services.
3. Having successfully processed the criminal profits through the first two phases, the launderer then proceeds to the third stage, integration, in which the funds re-enter the legitimate economy. The launderer might invest the funds in real estate, luxury assets, or business ventures.
Current detection tools compare individual transactions against preset profiles and rules. Sophisticated criminals quickly learn how to make their illicit transactions look normal for such systems. As a result, rules and profiles need constant and costly updating.
But SynerScope’s flexible visual analysis takes a network angle to detect money laundering. It shows the structure of the entire network, built from data on millions of transactions, a structure that launderers cannot control. With just a few mouse clicks, SynerScope’s relation and sequence views reveal structural interrelationships and interdependencies. When those patterns are mapped on a time scale, it becomes virtually impossible to hide abnormal money flows.”
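SynerScope’s product itself is proprietary and visual, but the network angle the article describes can be sketched with an open-source graph library. The toy Python example below, using networkx, is a generic illustration rather than SynerScope’s algorithm; every account and amount is invented. It flags two structural patterns from the laundering stages above: funds fanning into a pass-through account during placement, and funds cycling back toward their source during layering:

```python
# Generic illustration of network-based laundering detection; this is
# not SynerScope's algorithm, and all accounts/amounts are invented.
import networkx as nx

G = nx.DiGraph()
transfers = [
    ("acct_A", "shell_1", 9500), ("acct_B", "shell_1", 9400),
    ("acct_C", "shell_1", 9300), ("shell_1", "shell_2", 28000),
    ("shell_2", "acct_A", 27500),     # funds cycle back toward the source
    ("payroll", "employee_1", 3000),  # ordinary traffic, for contrast
]
for src, dst, amount in transfers:
    G.add_edge(src, dst, amount=amount)

# Pattern 1: pass-through hubs -- many counterparties in, money out again.
for node in G.nodes:
    if G.in_degree(node) >= 3 and G.out_degree(node) >= 1:
        print(f"possible placement hub: {node}")

# Pattern 2: cycles, i.e. money that eventually returns to its source.
for cycle in nx.simple_cycles(G):
    print(f"possible layering loop: {' -> '.join(cycle)}")
```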

Feds see innovation decline within government


Federal Times: “Support for innovation is declining across the government, according to a report by the Partnership for Public Service released April 23.
Federal employees’ average score on three innovation-related questions in the annual Federal Employee Viewpoint Survey dropped from 61.5 out of 100 in 2012 to 59.4 in 2013, according to the report, produced in partnership with Deloitte.

This chart, extracted from the Partnership for Public Service report, shows the slow but steady decline of innovation measures. (Partnership for Public Service)

While 90 percent of employees surveyed report they are always looking for better ways to do their jobs, only 54.7 percent feel encouraged to do so, and only 33.4 percent believe their agency rewards creativity and innovation.
“The bottom line is that federal workers are motivated to improve the way they do their work, but they do not feel supported by their organizations,” the report said.
Dave Dye, a director of human capital at Deloitte LLP, said the report is a message to agency leaders to pay attention, hold discussions on innovation, and make concerted efforts to enhance innovation in their areas.
“It’s not that leaders have to be innovative in their own right; it means they need to set up environments where people feel that innovation is encouraged, rewarded and respected,” Dye said.
Most agencies saw a decline in their “innovation score,” according to the report, including:
■ The Army saw one of the largest drops in its innovation score – from 64.2 out of 100 in 2012 to 60.1 out of 100 in 2013.
■ NASA – which had the highest score at 76.0 out of 100 in 2013 – also dropped from 76.5 in 2012.
■ The Financial Crimes Enforcement Network at the Treasury Department saw one of the largest drops among component agencies, from 63.8 out of 100 in 2012 to 52.0 in 2013.
Some agencies that have shown improvement are the National Science Foundation and the Peace Corps. Some NASA facilities also saw improvement, including the John C. Stennis Space Center in Mississippi and the George C. Marshall Space Flight Center in Alabama.
The ongoing effects of sequestration, budget cuts and threat of furloughs may also have had a dampening effect on federal employees, Dye said.
“When people feel safer or more sure about what’s going on, they are going to better focus on the mission,” he said.
Agency managers should also work to improve their work environments to build trust and confidence in their workforce by showing concern for people’s careers and supporting development opportunities while recognizing good work, according to Dye.
The report recommends that agencies recognize employees at team meetings or with more formal awards to highlight innovation and creativity and reward success. Managers should make sure to share specific goals, provide a forum for open discussion and work to build trust among the workforce that is needed to help spur innovation.”

Can Government Play Moneyball?


David Bornstein in the New York Times: “…For all the attention it’s getting inside the administration, evidence-based policy-making seems unlikely to become a headline grabber; it lacks emotional appeal. But it does have intellectual heft. And one group that has been doing creative work to give the message broader appeal is Results for America, which has produced useful teaching aids under the banner “Moneyball for Government,” building on the popularity of the book and movie about Billy Beane’s Oakland A’s, and the rise of data-driven decision making in major league baseball. (Watch their video explainers here and here.)
Results for America works closely with leaders across political parties and social sectors to build awareness about evidence-based policy making — drawing attention to key areas where government could dramatically improve people’s lives by augmenting well-tested models. They are also chronicling efforts by local governments around the country to show how an emerging group of “Geek Cities,” including Baltimore, Denver, Miami, New York, Providence and San Antonio, are using data and evidence to drive improvements in various areas of social policy like education, youth development and employment.
“It seems like common sense to use evidence about what works to get better results,” said Michele Jolin, Results for America’s managing partner. “How could anyone be against it? But the way our system is set up, there are so many loud voices pushing to have dollars spent and policy shaped in the way that works for them. There has been no organized constituency for things that work.”
“The debate in Washington is usually about the quantity of resources,” said David Medina, a partner in Results for America. “We’re trying to bring it back to talking about quality.”
Not everyone will find this change appealing. “When you have a longstanding social service policy, there’s going to be a network of [people and groups] who are organized to keep that money flowing regardless of whether evidence suggests it’s warranted,” said Daniel Stid. “People in social services don’t like to think they’re behaving like other organized interests — like dairy farmers or mortgage brokers — but it leads to tremendous inertia in public policy.”
Beyond the politics, there are practical obstacles to overcome, too. Federal agencies lack sufficient budgets for evaluation and a common definition of what constitutes rigorous evidence. (Any lobbyist can walk into a legislator’s office and claim to have solid data to support an argument.) Up-to-date evidence also needs to be packaged in accessible ways and made available on a timely basis, so it can be used to improve programs rather than to threaten them. Governments need to build regular evaluations into everything they do — not just conduct big, expensive studies every 10 years or so.
That means developing new ways to conduct quick and inexpensive randomized studies using data that is readily available, said Haskins, who is investigating this approach. “We should be running 10,000 evaluations a year, like they do in medicine.” That’s the only way to produce the rapid trial-and-error learning needed to drive iterative program improvements, he added. (I reported on a similar effort being undertaken by the Coalition for Evidence-Based Policy.)
Results for America has developed a scorecard that ranks federal departments on how prepared they are to produce or incorporate evidence in their programs. It looks at whether a department has an office and a leader with the authority and budget to evaluate its programs. It asks: Does it make its data accessible to the public? Does it compile standards about what works and share them widely? Does it spend at least 1 percent of its budget evaluating its programs? And — most important — does it incorporate evidence in its big grant programs? For now, the Department of Education gets the top score.
The stakes are high. In 2011, for example, the Obama administration launched a process to reform Head Start, doing things like spreading best practices and forcing the worst programs to improve or lose their funding. This February, for the third time, the government released a list of Head Start providers (103 out of about 1,600) who will have to recompete for federal funding because of performance problems. That list represents tens of thousands of preschoolers, many of whom are missing out on the education they need to succeed in kindergarten — and life.
Improving flagship programs like Head Start, and others, is not just vital for the families they serve; it’s vital to restore trust in government. “I am a card-carrying member of the Republican Party and I want us to be governed well,” said Robert Shea, who pushed for better program evaluations as associate director of the Office of Management and Budget during the Bush administration, and continues to focus on this issue as chairman of the National Academy of Public Administration. “This is the most promising thing I know of to get us closer to that goal.”
“This idea has the prospect of uniting Democrats and Republicans,” said Haskins. “But it will involve a broad cultural change. It has to get down to the program administrators, board members and local staff throughout the country — so they know that evaluation is crucial to their operations.”
“There’s a deep mistrust of government and a belief that problems can’t be solved,” said Michele Jolin. “This movement will lead to better outcomes — and it will help people regain confidence in their public officials by creating a more effective, more credible way for policy choices to be made.”

Historic release of data delivers unprecedented transparency on the medical services physicians provide and how much they are paid


Jonathan Blum, Principal Deputy Administrator, Centers for Medicare & Medicaid Services: “Today the Centers for Medicare & Medicaid Services (CMS) took a major step forward in making Medicare data more transparent and accessible, while maintaining the privacy of beneficiaries, by announcing the release of new data on medical services and procedures furnished to Medicare fee-for-service beneficiaries by physicians and other healthcare professionals (http://www.cms.gov/newsroom/newsroom-center.html). For too long, the only information on physicians readily available to consumers was physician name, address and phone number. This data will, for the first time, provide a better picture of how physicians practice in the Medicare program.
This new data set includes over nine million rows of data on more than 880,000 physicians and other healthcare professionals in all 50 states, DC and Puerto Rico who provided care to Medicare beneficiaries in 2012. The data set presents key information on the provision of services by physicians and how much they are paid for those services, and is organized by provider (National Provider Identifier, or NPI), type of service (Healthcare Common Procedure Coding System, or HCPCS, code), and whether the service was performed in a facility or office setting. This public data set includes the number of services, average submitted charges, average allowed amount, average Medicare payment, and a count of unique beneficiaries treated. CMS takes beneficiary privacy very seriously, and we will protect patient-identifiable information by redacting any data in cases involving fewer than 11 beneficiaries.
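For a sense of how a researcher might work with a file organized this way, here is a brief pandas sketch. The filename and column names are assumptions standing in for the real CMS file layout, but the 11-beneficiary privacy rule and the NPI/HCPCS organization come straight from the announcement above:

```python
# Hypothetical sketch of loading the physician utilization file.
# Filename and column names are assumed for illustration; consult the
# actual CMS file documentation for the real layout.
import pandas as pd

df = pd.read_csv("medicare_physician_utilization_2012.csv")  # assumed name

# CMS redacts rows with fewer than 11 unique beneficiaries; a consumer
# can assert the same rule defensively before analysis.
df = df[df["bene_unique_cnt"] >= 11]

# The file is organized by provider (NPI) and service (HCPCS code):
# aggregate average Medicare payment per provider across services.
per_provider = (
    df.groupby("npi")["average_medicare_payment_amt"]
      .mean()
      .sort_values(ascending=False)
)
print(per_provider.head(10))  # ten highest average payments
```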
Previously, CMS could not release this information because of a permanent injunction issued by a court in 1979. However, in May 2013, the court vacated the injunction, setting off a series of events that has enabled CMS to make this information available for the first time.
Data to Fuel Research and Innovation
In addition to the public data release, CMS is making slight modifications to the process to request CMS data for research purposes. This will allow researchers to conduct important research at the physician level. As with the public release of information described above, CMS will continue to prohibit the release of patient-identifiable information. For more information about CMS’s disclosures to researchers, please contact the Research Data Assistance Center (ResDAC) at http://www.resdac.org/.
Unprecedented Data Access
This data release follows other CMS efforts to make more data available to the public. Since 2010, the agency has released an unprecedented amount of aggregated data in machine-readable form, with much of it available at http://www.healthdata.gov. These data range from previously unpublished statistics on Medicare spending, utilization, and quality at the state, hospital referral region, and county level, to detailed information on the quality performance of hospitals, nursing homes, and other providers.
In May 2013, CMS released information on the average charges for the 100 most common inpatient services at more than 3,000 hospitals nationwide (http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Medicare-Provider-Charge-Data/Inpatient.html).
In June 2013, CMS released average charges for 30 selected outpatient procedures (http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Medicare-Provider-Charge-Data/Outpatient.html).
We will continue to work toward harnessing the power of data to promote quality and value, and improve the health of our seniors and persons with disabilities.”

Book Review: 'The Rule of Nobody' by Philip K. Howard


Stuart Taylor Jr in the Wall Street Journal: “Amid the liberal-conservative ideological clash that paralyzes our government, it’s always refreshing to encounter the views of Philip K. Howard, whose ideology is common sense spiked with a sense of urgency. In “The Rule of Nobody,” Mr. Howard shows how federal, state and local laws and regulations have programmed officials of both parties to follow rules so detailed, rigid and, often, obsolete as to leave little room for human judgment. He argues passionately that we will never solve our social problems until we abandon what he calls a misguided legal philosophy of seeking to put government on regulatory autopilot. He also predicts that our legal-governmental structure is “headed toward a stall and then a frightening plummet toward insolvency and political chaos.”
Mr. Howard, a big-firm lawyer who heads the nonpartisan government-reform coalition Common Good, is no conventional deregulator. But he warns that the “cumulative complexity” of the dense rulebooks that prescribe “every nuance of how law is implemented” leaves good officials without the freedom to do what makes sense on the ground. Stripped of the authority that they should have, he adds, officials have little accountability for bad results. More broadly, he argues that the very structure of our democracy is so clogged by deep thickets of dysfunctional law that it will only get worse unless conservatives and liberals alike cast off their distrust of human discretion.
The rulebooks should be “radically simplified,” Mr. Howard says, on matters ranging from enforcing school discipline to protecting nursing-home residents, from operating safe soup kitchens to building the nation’s infrastructure: Projects now often require multi-year, 5,000-page environmental impact statements before anything can begin to be constructed. Unduly detailed rules should be replaced by general principles, he says, that take their meaning from society’s norms and values and embrace the need for official discretion and responsibility.
Mr. Howard serves up a rich menu of anecdotes, including both the small-scale activities of a neighborhood and the vast administrative structures that govern national life. After a tree fell into a stream and caused flooding during a winter storm, Franklin Township, N.J., was barred from pulling the tree out until it had spent 12 days and $12,000 for the permits and engineering work that a state environmental rule required for altering any natural condition in a “C-1 stream.” The “Volcker Rule,” designed to prevent banks from using federally insured deposits to speculate in securities, was shaped by five federal agencies and countless banking lobbyists into 963 “almost unintelligible” pages. In New York City, “disciplining a student potentially requires 66 separate steps, including several levels of potential appeals”; meanwhile, civil-service rules make it virtually impossible to terminate thousands of incompetent employees. Children’s lemonade stands in several states have been closed down for lack of a vendor’s license.
Conservatives as well as liberals like detailed rules—complete with tedious forms, endless studies and wasteful legal hearings—because they don’t trust each other with discretion. Corporations like them because they provide not only certainty but also “a barrier to entry for potential competitors,” by raising the cost of doing business to prohibitive levels for small businesses with fresh ideas and other new entrants to markets. Public employees like them because detailed rules “absolve them of responsibility.” And, adds Mr. Howard, “lawsuits [have] exploded in this rules-based regime,” shifting legal power to “self-interested plaintiffs’ lawyers,” who have learned that they “could sue for the moon and extract settlements even in cases (as with some asbestos claims) that were fraudulent.”
So habituated have we become to such stuff, Mr. Howard says, that government’s “self-inflicted ineptitude is accepted as a state of nature, as if spending an average of eight years on environmental reviews—which should be a national scandal—were an unavoidable mountain range.” Common-sensical laws would place outer boundaries on acceptable conduct based on reasonable norms that are “far better at preventing abuse of power than today’s regulatory minefield.”
As Mr. Howard notes, his book is part of a centuries-old rules-versus-principles debate. The philosophers and writers whom he quotes approvingly include Aristotle, James Madison, Isaiah Berlin and Roscoe Pound, a prominent Harvard law professor and dean who condemned “mechanical jurisprudence” and championed broad official discretion. Berlin, for his part, warned against “monstrous bureaucratic machines, built in accordance with the rules that ignore the teeming variety of the living world, the untidy and asymmetrical inner lives of men, and crush them into conformity.” Mr. Howard juxtaposes today’s roughly 100 million words of federal law and regulations with Madison’s warning that laws should not be “so voluminous that they cannot be read, or so incoherent that they cannot be understood.”…

Facebook’s Connectivity Lab will develop advanced technology to provide internet across the world


From GigaOm: “The Internet.org initiative will rely on a new team at Facebook called the Connectivity Lab, based at the company’s Menlo Park campus, to develop technology on the ground, in the air and in space, CEO Mark Zuckerberg announced Thursday. The team will develop technology like drones and satellites to expand access to the internet across the world.
“The team’s approach is based on the principle that different sized communities need different solutions and they are already working on new delivery platforms—including planes and satellites—to provide connectivity for communities with different population densities,” a post on Internet.org says.
Internet.org, which is backed by companies like Facebook, Samsung and Qualcomm, wants to provide internet to the two-thirds of the world that remains disconnected due to cost, lack of infrastructure or remoteness. While many companies are developing business models and partnerships in areas that lack internet, the Connectivity Lab will focus on sustainable technology that will transmit the signals. Facebook envisions using drones that could fly for months to connect suburban areas, while more rural areas would rely on satellites. Both would use infrared lasers to blanket whole areas with connectivity.
Members of the Connectivity Lab have backgrounds at NASA’s Jet Propulsion Laboratory, NASA’s Ames Research Center and the National Optical Astronomy Observatory. Facebook also confirmed today that it acquired five employees from Ascenta, a U.K.-based company that worked on the Zephyr, a solar-powered drone capable of flying for two weeks straight.
The lab’s work will build on work the company has already done in the Philippines and Paraguay, Zuckerberg said in a Facebook post. And, like the company’s Open Compute project, there is a possibility that the lab will seek partnerships with outside companies once the bulk of the technology has been developed.”

Public interest labs to test open governance solutions


Kathleen Hickey in GCN: “The Governance Lab at New York University (GovLab) and the MacArthur Foundation Research Network have formed a new network, Open Governance, to study how to enhance collaboration and decision-making in the public interest.
The MacArthur Foundation provided a three-year grant of $5 million for the project; Google’s philanthropic arm, Google.org, also contributed. Google.org’s technology will be used to develop platforms to solve problems more openly and to run agile, real-world experiments with governments and NGOs to discover ways to enhance decision-making in the public interest, according to the GovLab announcement.
Network members include 12 experts in computer science, political science, policy informatics, social psychology and philosophy, law, and communications. This group is supported by an advisory network of academics, technologists, and current and former government officials. The network will assess existing government programs and experiment with ways to improve decision-making at the local, national and international government levels.
The Network’s efforts focus on three areas that members say have the potential to make governance more effective and legitimate: getting expertise in, pushing data out and distributing responsibility.
Through smarter governance, they say, institutions can seek input from lay and expert citizens via expert networking, crowdsourcing or challenges.  With open data governance, institutions can publish machine-readable data so that citizens can easily analyze and use this information to detect and solve problems. And by shared governance, institutions can help citizens develop solutions through participatory budgeting, peer production or digital commons.
“Recognizing that we cannot solve today’s challenges with yesterday’s tools, this interdisciplinary group will bring fresh thinking to questions about how our governing institutions operate and how they can develop better ways to help address seemingly intractable social problems for the common good,” said MacArthur Foundation President Robert Gallucci.
GovLab’s mission is to study and launch “experimental, technology-enabled solutions that advance a collaborative, networked approach to re-invent existing institutions and processes of governance to improve people’s lives.” Earlier this year GovLab released a preview of its Open Data 500 study of 500 companies using open government data as a key business resource.”

Open Data: What Is It and Why Should You Care?


Jason Shueh at Government Technology: “Though the debate about open data in government is an evolving one, it is indisputably here to stay — it can be heard in both houses of Congress, in state legislatures, and in city halls around the nation.
Already, 39 states and 46 localities provide data sets to data.gov, the federal government’s online open data repository. And 30 jurisdictions, including the federal government, have taken the additional step of institutionalizing their practices in formal open data policies.
Though the term “open data” is spoken of frequently — and has been since President Obama took office in 2009 — what it is and why it’s important isn’t always clear. That’s understandable, perhaps, given that open data lacks a unified definition.
“People tend to conflate it with big data,” said Emily Shaw, the national policy manager at the Sunlight Foundation, “and I think it’s useful to think about how it’s different from big data in the sense that open data is the idea that public information should be accessible to the public online.”
Shaw said the foundation, a Washington, D.C., non-profit advocacy group promoting open and transparent government, believes the term open data can be applied to a variety of information created or collected by public entities. Among the benefits of open data are improved measurement of policies, better government efficiency, deeper analytical insights, greater citizen participation, and a boost to local companies by way of products and services that use government data (think civic apps and software programs).
“The way I personally think of open data,” Shaw said, “is that it is a manifestation of the idea of open government.”

What Makes Data Open

For governments hoping to adopt open data in policy and in practice, simply making data available to the public isn’t enough to make that data useful. Open data, though straightforward in principle, requires a specific approach based on the agency or organization releasing it, the kind of data being released and, perhaps most importantly, its targeted audience.
According to the foundation’s California Open Data Handbook, published in collaboration with Stewards of Change Institute, a national group supporting innovation in human services, data must first be both “technically open” and “legally open.” The guide defines the terms in this way:
Technically open: [data] available in a machine-readable standard format, which means it can be retrieved and meaningfully processed by a computer application
Legally open: [data] explicitly licensed in a way that permits commercial and non-commercial use and re-use without restrictions.
Technically open means that data is easily accessible to its intended audience. If the intended users are developers and programmers, Shaw said, the data should be presented within an application programming interface (API); if it’s intended for researchers in academia, data might be structured in a bulk download; and if it’s aimed at the average citizen, data should be available without requiring software purchases.
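In practice, “technically open” means a program can fetch and parse the data with no human in the loop. The snippet below sketches both delivery styles Shaw describes; the URLs are placeholders rather than real services:

```python
# Illustrative only: both URLs are placeholders standing in for whatever
# a real jurisdiction publishes; the point is that standard tooling suffices.
import csv
import io
import requests

# Audience 1: developers, served through a JSON API.
api_url = "https://data.example.gov/api/permits?year=2013"
records = requests.get(api_url, timeout=30).json()

# Audience 2: researchers, served as a machine-readable bulk CSV download.
bulk_url = "https://data.example.gov/downloads/permits_2013.csv"
text = requests.get(bulk_url, timeout=30).text
rows = list(csv.DictReader(io.StringIO(text)))

# Either path yields structured records a program can process directly --
# the "technically open" test: no PDFs, no scraping, no manual steps.
print(len(records), len(rows))
```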
….

4 Steps to Open Data

Creating open data isn’t without its complexities. There are many tasks that need to happen before an open data project ever begins. A full endorsement from leadership is paramount. Adding the project into the workflow is another. And allaying fears and misunderstandings is expected with any government project.
Once those basic table stakes are in place, the handbook prescribes four steps: choosing a set of data, attaching an open license, making it available in a proper format and ensuring the data is discoverable.
1. Choose a Data Set
Choosing a data set can appear daunting, but it doesn’t have to be. Shaw said ample resources are available from the foundation and others on how to get started with this — see our list of open data resources for more information. In the case of selecting a data set, or sets, she referred to the foundation’s recently updated guidelines that urge identifying data sets based on goals and the demand from citizen feedback.
2. Attach an Open License
Open licenses dispel ambiguity and encourage use. However, they need to be proactive, and this means users should not be forced to request the information in order to use it — a common symptom of data accessed through the Freedom of Information Act. Tips for reference can be found at Opendefinition.org, a site that has a list of examples and links to open licenses that meet the definition of open use.
3. Format the Data to Your Audience
As previously stated, Shaw recommends tailoring the format of data to the audience, with the ideal being that data is packaged in formats that can be digested by all users: developers, civic hackers, department staff, researchers and citizens. This could mean it’s put into APIs, spreadsheet docs, text and zip files, FTP servers and torrent networking systems (a way to download files from different sources). The file type and the system for download all depend on the audience; a minimal publishing sketch follows these steps.
“Part of learning about what formats government should offer data in is to engage with the prospective users,” Shaw said.
4. Make it Discoverable
If open data is strewn across multiple download links and wedged into various nooks and crannies of a website, it probably won’t be found. Shaw recommends a centralized hub that acts as a one-stop shop for all open data downloads. In many jurisdictions, these Web pages and websites have been called “portals”; they are the online repositories for a jurisdiction’s open data publishing.
“It is important to think about how people can become aware of what their governments hold. If the government doesn’t make it easy for people to know what kinds of data are publicly available on the website, it doesn’t matter what format it’s in,” Shaw said. She pointed to public participation — a recurring theme in open data development — as something to incorporate into the process to improve accessibility.
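To make steps 3 and 4 concrete, here is the small publishing sketch promised above. It writes one invented dataset in two audience formats and registers it in a data.json-style catalog file, loosely modeled on the convention federal portals use for discovery; the dataset, URLs, and license link are all placeholders:

```python
# Hypothetical publishing pipeline: the dataset, paths, and URLs are invented.
import csv
import json

rows = [
    {"permit_id": "1001", "type": "electrical", "issued": "2013-06-02"},
    {"permit_id": "1002", "type": "plumbing", "issued": "2013-06-03"},
]

# Step 3: format for different audiences -- CSV for analysts and
# spreadsheet users, JSON for developers.
with open("permits.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
with open("permits.json", "w") as f:
    json.dump(rows, f, indent=2)

# Step 4: make it discoverable. A data.json-style catalog entry lets a
# central portal index the dataset and its download links.
catalog = {
    "dataset": [{
        "title": "Building Permits 2013",
        "description": "Permits issued by the Example City building department.",
        "keyword": ["permits", "construction"],
        "accessLevel": "public",
        # Placeholder; see Opendefinition.org for real open licenses (step 2).
        "license": "http://example.org/open-license",
        "distribution": [
            {"downloadURL": "https://data.example.gov/permits.csv",
             "mediaType": "text/csv"},
            {"downloadURL": "https://data.example.gov/permits.json",
             "mediaType": "application/json"},
        ],
    }]
}
with open("data.json", "w") as f:
    json.dump(catalog, f, indent=2)
```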
 
Examples of portals can be found in numerous cities across the U.S., such as San Francisco, New York, Los Angeles, Chicago and Sacramento, Calif.”