An Air-Quality Monitor You Take with You


MIT Technology Review: “A startup is building a wearable air-quality monitor using a sensing technology that can cheaply detect the presence of chemicals around you in real time. By reporting the information its sensors gather to an app on your smartphone, the technology could help people with respiratory conditions and those who live in highly polluted areas keep tabs on exposure.
Berkeley, California-based Chemisense also plans to crowdsource data from users to show places around town where certain compounds are identified.
Initially, the company plans to sell a $150 wristband geared toward kids with asthma—of which there are nearly 7 million in the U.S., according to data from the Centers for Disease Control and Prevention—to help them identify places and pollutants that tend to provoke attacks, and track their exposure to air pollution over time. The company hopes people with other respiratory conditions, and those who are just concerned about air pollution, will be interested, too.
In the U.S., air quality is monitored at thousands of stations across the country; maps and forecasts can be viewed online. But these monitors offer accurate readings only for their immediate surroundings.
Chemisense has not yet made its initial product, but it expects it will be a wristband using polymers treated with charged nanoparticles of carbon such that the polymers swell in the presence of certain chemical vapors, changing the resistance of a circuit.”
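Chemisense has not published a circuit design, but the chemiresistive principle the article describes is easy to illustrate. Below is a minimal, hypothetical Python sketch that assumes a simple voltage-divider readout; the supply voltage, reference resistor, and 15% detection threshold are invented for illustration, not drawn from the company.

```python
# Hypothetical chemiresistive readout. The article describes polymers that
# swell in the presence of certain vapors, changing a circuit's resistance;
# here we infer that resistance from a voltage divider and flag large drifts
# from a clean-air baseline. All constants are invented for illustration.

V_SUPPLY = 3.3        # supply voltage in volts (assumed)
R_REFERENCE = 10_000  # fixed reference resistor in ohms (assumed)

def sensor_resistance(v_measured: float) -> float:
    """Infer the sensing element's resistance from the divider voltage."""
    # Divider equation: v_measured = V_SUPPLY * R_sensor / (R_sensor + R_REFERENCE)
    return R_REFERENCE * v_measured / (V_SUPPLY - v_measured)

def vapor_detected(v_measured: float, r_baseline: float, threshold: float = 0.15) -> bool:
    """Flag readings whose inferred resistance drifts more than 15% from baseline."""
    r_now = sensor_resistance(v_measured)
    return abs(r_now - r_baseline) / r_baseline > threshold

baseline = sensor_resistance(1.65)    # ~10 kOhm in clean air
print(vapor_detected(2.10, baseline)) # True: swelling raised the resistance
```

In a real wearable, the baseline would drift with temperature and humidity, so production firmware would presumably recalibrate continuously rather than compare against a single clean-air reading.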

Opening Health Data: What Do Researchers Want? Early Experiences With New York's Open Health Data Platform.


Paper by Martin, Erika G. PhD, MPH; Helbig, Natalie PhD, MPA; and Birkhead, Guthrie S. MD, MPH in the Journal of Public Health Management & Practice: “Governments are rapidly developing open data platforms to improve transparency and make information more accessible. New York is a leader, with currently the only state platform devoted to health. Although these platforms could build public health departments’ capabilities to serve more researchers, agencies have little guidance on releasing meaningful and usable data.

Objective: Structured focus groups with researchers and practitioners collected stakeholder feedback on potential uses of open health data and New York’s open data strategy….

Results: There was low awareness of open data, with 67% of researchers reporting never using open data portals prior to the workshop. Participants were interested in data sets that were geocoded, longitudinal, or aggregated to small area granularity and capabilities to link multiple data sets. Multiple environmental conditions and barriers hinder their capacity to use health data for research. Although open data platforms cannot address all barriers, they provide multiple opportunities for public health research and practice, and participants were overall positive about the state’s efforts to release open data.

Conclusions: Open data are not ideal for some researchers because they do not contain individually identifiable data, indicating a need for tiered data release strategies. However, they do provide important new opportunities to facilitate research and foster collaborations among agencies, researchers, and practitioners.”

Delivering a Customer-Focused Government Through Smarter IT


Beth Cobert, Steve VanRoekel, and Todd Park at the White House: “As technology changes, government must change with it to address new challenges and take advantage of new opportunities. This Administration has made important strides in modernizing government so that it serves its constituents more effectively and efficiently, but we know there is much more to do.
Last year, a group of digital and technology experts from the private sector helped us fix HealthCare.gov – a turnaround that enabled millions of Americans to sign up for quality health insurance. This effort also reminded us why the President’s commitment to bringing more of the nation’s top information technology (IT) talent into government is so critical to delivering the best possible results for our customers – the American people.
A core part of the President’s Management Agenda is improving the value we deliver to citizens through Federal IT. That’s why, today, the Administration is formally launching the U.S. Digital Service. The Digital Service will be a small team made up of our country’s brightest digital talent that will work with agencies to remove barriers to exceptional service delivery and help remake the digital experience that people and businesses have with their government…
The Digital Service will also collaborate closely with 18F, an exciting new unit of the U.S. General Services Administration (GSA). GSA’s 18F houses a growing group of talented developers and digital professionals who are designing and building the actual digital platforms and providing services across the government….
Leveraging Best Practices with the Digital Services Playbook
To help the Digital Service achieve its mission, today the Administration is releasing the initial version of a Digital Services Playbook that lays out best practices for building effective digital services like web and mobile applications and will serve as a guide for agencies across government. To increase the success of government digital service projects, this playbook outlines 13 key “plays” drawn from private and public-sector best practices that, if followed together, will help federal agencies deliver services that work well for users and require less time and money to develop and operate.
The technologies used to create digital services are changing rapidly. The Playbook is designed to encourage the government to adopt the best of these advances into our own work. To further strengthen this important tool, we encourage folks across the public and private sectors to provide feedback on the Playbook.
Using Agile Processes to Procure Digital Services with the TechFAR Handbook
To ensure government has the right tech tools to do its job, the Administration is also today launching the TechFAR Handbook, a guide that explains how agencies can execute key plays in the Playbook in ways consistent with the Federal Acquisition Regulation (FAR), which governs how the government must buy services from the private sector….
Stay informed — sign up here to monitor the latest news from the U.S. Digital Service.

How you can help build a more agile government


From GovFresh: “Earlier this year, I began doing research work with CivicActions on agile development in government — who was doing it, how, and what was needed to deploy it successfully.
After the Healthcare.gov launch mishaps, calls for agile practices as the panacea for all of government’s IT woes reached a high point. While agile as the ultimate solution oversimplifies the issue, there is growing consensus across the profession — both software development and public service — that moving toward an iterative approach to operations is the way of the future.
My own formal introduction to agile began with my work with CivicActions, so the research coincided with an introductory immersion into how government is using it. Having been involved with startups for the past 15 years, I consider iterative development the norm; the added layer of project management processes, however, has forced me to become a better professional overall.
What I’ve found through many discussions and interviews is that you can’t just snap your fingers and execute agile within the framework of government bureaucracy. There are a number of issues — from procurement to project management training to executive-level commitment to organization-wide culture change — that hinder its adoption. For IT, launching a new website or app is the easy part. Changing IT operational processes and culture is often overlooked or avoided, especially by a short-term executive, because these reach into the granular organizational challenges most people don’t want to bother with.
After talking with a number of agile practitioners in government and the private sector, it was clear there was enthusiasm about how agile could be applied to fundamentally change the way government works. Beyond execution by professional project managers, everyone I spoke with talked about how deploying agile gives them a stronger sense of public service.
What came out of these discussions was a desire for a stronger community of practitioners — and of those interested in deploying agile — to better support one another.
To meet that need, a group of federal, state, local government and private sector professionals have formed Agile for Gov, a “community-powered network of agile government professionals.”…

Monitoring Arms Control Compliance With Web Intelligence


Chris Holden and Maynard Holliday at Commons Lab: “Traditional monitoring of arms control treaties, agreements, and commitments has required the use of National Technical Means (NTM)—large satellites, phased array radars, and other technological solutions. NTM was a good solution when the treaties focused on large items for observation, such as missile silos or nuclear test facilities. As the targets of interest have shrunk by orders of magnitude, the need for other, more ubiquitous, sensor capabilities has increased. The rise in web-based, or cloud-based, analytic capabilities will have a significant influence on the future of arms control monitoring and the role of citizen involvement.
Since 1999, the U.S. Department of State has had at its disposal the Key Verification Assets Fund (V Fund), which was established by Congress. The Fund helps preserve critical verification assets and promotes the development of new technologies that support the verification of and compliance with arms control, nonproliferation, and disarmament requirements.
Sponsored by the V Fund to advance web-based analytic capabilities, Sandia National Laboratories, in collaboration with Recorded Future (RF), synthesized open-source data streams from a wide variety of traditional and nontraditional web sources in multiple languages along with topical texts and articles on national security policy to determine the efficacy of monitoring chemical and biological arms control agreements and compliance. The team used novel technology involving linguistic algorithms to extract temporal signals from unstructured text and organize that unstructured text into a multidimensional structure for analysis. In doing so, the algorithm identifies the underlying associations between entities and events across documents and sources over time. Using this capability, the team analyzed several events that could serve as analogs to treaty noncompliance, technical breakout, or an intentional attack. These events included the H7N9 bird flu outbreak in China, the Shanghai pig die-off and the fungal meningitis outbreak in the United States last year.
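The Sandia/Recorded Future pipeline relies on proprietary linguistic algorithms, but the core move it describes — pulling temporal signals out of unstructured text and organizing them into a time-indexed structure — can be suggested with a toy Python sketch. The date pattern, keyword list, and documents below are invented; a real system would use far richer entity and event extraction.

```python
import re
from collections import defaultdict

# Toy stand-in for the "temporal signal" extraction described above: pull
# (date, event-keyword) pairs out of raw text and bucket them into a
# time-indexed structure. Pattern, keywords, and documents are invented.

DATE_PATTERN = re.compile(r"\b(\d{4}-\d{2}-\d{2})\b")
EVENT_KEYWORDS = ("outbreak", "hospitalized", "fatality", "die-off")

def extract_temporal_signals(documents):
    """Map each date mentioned in a document to the event keywords it contains."""
    timeline = defaultdict(set)
    for doc in documents:
        dates = DATE_PATTERN.findall(doc)
        for keyword in EVENT_KEYWORDS:
            if keyword in doc.lower():
                for date in dates:
                    timeline[date].add(keyword)
    return dict(timeline)

docs = [
    "2013-04-02: Shanghai officials report an outbreak; 14 hospitalized.",
    "On 2013-04-05 a fatality was confirmed; the pig die-off was ruled unrelated.",
]
print(extract_temporal_signals(docs))
# e.g. {'2013-04-02': {'outbreak', 'hospitalized'}, '2013-04-05': {'fatality', 'die-off'}}
```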
For H7N9, we found that open source social media were the first to report the outbreak and to give ongoing updates. The Sandia RF system was able to roughly estimate lethality based on temporal hospitalization and fatality reporting. For the Shanghai pig die-off, the analysis tracked the rapid assessment by Chinese authorities that H7N9 was not the cause of the die-off, as had originally been speculated. Open source reporting highlighted a reduced market for pork in China due to the very public dead pig display in Shanghai. Possible downstream health effects were predicted (e.g., contaminated water supply and other food ecosystem concerns). In addition, legitimate U.S. food security concerns were raised over the Chinese purchase of the largest U.S. pork producer (Smithfield), given fears that tainted pork could be imported into the United States….
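The excerpt does not say how the lethality estimate was computed. One naive way to derive it from temporal hospitalization and fatality reporting is a lagged case-fatality ratio, sketched below in Python with synthetic counts (not the paper's data):

```python
# One naive way to "roughly estimate lethality" from temporal reporting:
# divide cumulative fatalities by the cumulative hospitalizations reported
# a few periods earlier, so deaths are compared against a cohort whose
# outcomes have had time to resolve. Counts are synthetic, not the paper's.

hospitalized = [14, 21, 33, 49, 63]  # cumulative cases, by reporting period
fatalities   = [2, 4, 6, 9, 13]      # cumulative deaths, same periods

def lagged_cfr(hosp, deaths, lag=2):
    """Crude case-fatality ratio: deaths now over hospitalizations `lag` periods ago."""
    return [deaths[t] / hosp[t - lag] for t in range(lag, len(deaths))]

print([f"{cfr:.0%}" for cfr in lagged_cfr(hospitalized, fatalities)])
# ['43%', '43%', '39%'] -- crude, and sensitive to the choice of lag
```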
To read the full paper, please click here.”

The Emergence of Government Innovation Teams


Hollie Russon Gilman at TechTank: “A new global currency is emerging. Governments understand that people at home and abroad evaluate them based on how they use technology and innovative approaches in their service delivery and citizen engagement. This raises opportunities, and critical questions, about the role of innovation in 21st century governance.
Bloomberg Philanthropies and Nesta, the UK’s innovation foundation, recently released a global report highlighting 20 government innovation teams. Importantly, the study included teams established and funded by all levels of government (city, regional, and national) that aim to find creative solutions to seemingly intractable problems. The report features 20 teams across six continents and identifies some basic principles and commonalities that are instructive for all types of innovators, inside and outside of government.
Using Government to Locally Engage
One of the challenges of representative democracy is that elected officials and government staff spend time in bureaucracies, isolated from the very people they aim to serve. Perhaps there can be different models. For example, Seoul’s Innovation Bureau is engaging citizens to re-design and re-imagine public services. Seoul is dedicated to becoming a Sharing City, including Tool Kit Centers where citizens can borrow machinery they would rarely use on their own but that benefits the whole community. This approach puts citizens at the center of their communities and leverages government to work for the people…
As I’ve outlined in an earlier TechTank post, there are institutional constraints on governments trying the unknown. There are potential electoral costs, greater disillusionment, and gaps in vital service delivery. Yet despite all of these barriers, there are a variety of promising tools. For example, Finland has Sitra, an innovation fund whose mission is to foster experimentation to transform a diverse set of policy issues, including sustainable energy and healthcare. Sitra invests both in practical research and experiments that advance public-sector issues and in early-stage companies.
We need a deeper understanding of the opportunities, and challenges, of innovation in government. Luckily, many researchers, think tanks, and organizations are beginning that analysis. For example, Professor and Associate Dean Anita McGahan of the Rotman School of Management at the University of Toronto calls for a more strategic approach toward understanding the use of innovation, including big data, in the public sector…”

App enables citizens to report water waste in drought regions


Springwise: “Rallying citizens to take part in looking after the community they live in has become easier thanks to smartphones. In the past, the Creek Watch app has enabled anyone to help monitor local water quality by sending data back to the state water board. Now Everydrop LA wants to use similar techniques to fight drought in California, encouraging residents to report incidents of water wastage.
According to the team behind the app — which also created the CitySourced platform for engaging users in civic issues — even the smallest amount of water wastage can lead to meaningful losses over time. A faucet that drips just once a second will lose over 2,000 gallons of drinkable water each year. Using Everydrop LA, citizens can report the location of leaking faucets and fire hydrants, as well as occurrences of blatant water wastage. They can also see how much water is being wasted in their local area and learn what they can do to cut their own water usage. In times when drought is a risk, the app notifies users to conserve. Cities and counties can use the data in their reports and learn more about how water wastage is affecting their jurisdiction.”
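The faucet figure is easy to sanity-check. Assuming the EPA's commonly cited rate of roughly 15,140 drips per gallon (an approximation; drop size varies), one drip per second works out to just over 2,000 gallons a year:

```python
# Sanity-checking the faucet claim. Assumption: the EPA's commonly cited
# figure of about 15,140 drips per gallon (actual drop size varies).

DRIPS_PER_GALLON = 15_140
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

def gallons_per_year(drips_per_second: float) -> float:
    """Annual water loss for a faucet dripping at the given rate."""
    return drips_per_second * SECONDS_PER_YEAR / DRIPS_PER_GALLON

print(f"{gallons_per_year(1):,.0f} gallons")  # 2,083 gallons -- "over 2,000"
```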

This Exercise App Tracks Trends on How We Move In Different Cities


Mark Byrnes at CityLab: “An app designed to encourage exercise can also tell us a lot about the way different cities get from point A to B.
The app, called Human, runs in the background of your iPhone, automatically detecting activities like walking, cycling, running, and motorized transport. The point is to encourage you to exercise for at least 30 minutes a day.
Almost a year after Human launched (last August), its developers have released a stunning visualization of all that movement: 7.5 million miles traveled by their app users so far.
On their site, you can look into the mobility data for 30 different cities. Once you click on one, you’ll be greeted with a pie chart that shows the distribution of activity within that city, lined up against a pie chart that shows the international average.
In the case of Amsterdam, its transportation clichés are verified. App users in the bike-loving city use two wheels way more than they use four. And they walk about as much as anywhere else.

Human then shows the paths traveled by its users. When it comes to Amsterdam, the results look almost exactly like the city’s entire street grid, no matter what physical activity is being shown.
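Human has not documented its aggregation pipeline in this excerpt, but the city-versus-global pie charts amount to normalizing time spent per detected activity. A minimal Python sketch with invented trip data:

```python
from collections import Counter

# Sketch of the aggregation behind a city pie chart: tally minutes per
# detected activity, then normalize to a distribution comparable against
# the average across all cities. Trip data are invented.

def activity_share(trips):
    """trips: iterable of (activity, minutes) -> {activity: share of total minutes}"""
    totals = Counter()
    for activity, minutes in trips:
        totals[activity] += minutes
    grand_total = sum(totals.values())
    return {activity: minutes / grand_total for activity, minutes in totals.items()}

amsterdam = [("cycling", 120), ("walking", 90), ("motorized", 60), ("running", 30)]
print(activity_share(amsterdam))
# {'cycling': 0.4, 'walking': 0.3, 'motorized': 0.2, 'running': 0.1}
```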

Selected Readings on Economic Impact of Open Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of open data was originally published in 2014.

Open data is publicly available data – often released by governments, scientists, and occasionally private companies – that is made available for anyone to use, in a machine-readable format, free of charge. Considerable attention has been devoted to the economic potential of open data for businesses and other organizations, and it is now widely accepted that open data plays an important role in spurring innovation, growth, and job creation. From new business models to innovation in local governance, open data is being quickly adopted as a valuable resource at many levels.

Measuring and analyzing the economic impact of open data in a systematic way is challenging, and governments as well as other providers of open data seek to provide access to the data in a standardized way. As governmental transparency increases and open data changes business models and activities in many economic sectors, it is important to understand best practices for releasing and using non-proprietary, public information. Costs, social challenges, and technical barriers also influence the economic impact of open data.

These selected readings are intended as a first step toward answering the question of whether, and how, opening data spurs economic impact.


Annotated Selected Reading List (in alphabetical order)

Bonina, Carla. New Business Models and the Values of Open Data: Definitions, Challenges, and Opportunities. NEMODE 3K – Small Grants Call 2013. http://bit.ly/1xGf9oe

  • In this paper, Dr. Carla Bonina provides an introduction to open data and open data business models, evaluating their potential economic value and identifying future challenges for the effectiveness of open data, such as personal data and privacy, the emerging data divide, and the costs of collecting, producing and releasing open (government) data.

Carpenter, John and Phil Watts. Assessing the Value of OS OpenData™ to the Economy of Great Britain – Synopsis. June 2013. Accessed July 25, 2014. http://bit.ly/1rTLVUE

  • John Carpenter and Phil Watts of Ordnance Survey undertook a study to examine the economic impact of open data on the economy of Great Britain. Using a variety of methods, such as case studies, interviews, download analysis, adoption rates, impact calculation, and CGE modeling, the authors estimate that the OS OpenData initiative will deliver a net increase in GDP of £13–28.5 million for Great Britain in 2013.

Capgemini Consulting. The Open Data Economy: Unlocking Economic Value by Opening Government and Public Data. Capgemini Consulting. Accessed July 24, 2014. http://bit.ly/1n7MR02

  • This report explores how governments are leveraging open data for economic benefits. Using a comparative approach, the authors study important open data initiatives from organizational, technological, social, and political perspectives. The study highlights the potential of open data to drive profit by increasing the effectiveness of benchmarking and other data-driven business strategies.

Deloitte. Open Growth: Stimulating Demand for Open Data in the UK. Deloitte Analytics. December 2012. Accessed July 24, 2014. http://bit.ly/1oeFhks

  • This early paper on open data by Deloitte uses case studies and statistical analysis of open government data to create models of businesses using open data. The authors also review the market supply of, and demand for, open government data in emerging sectors of the economy.

Gruen, Nicholas, John Houghton and Richard Tooth. Open for Business: How Open Data Can Help Achieve the G20 Growth Target. Accessed July 24, 2014. http://bit.ly/UOmBRe

  • This report highlights the potential economic value of the open data agenda in Australia and the G20. It provides an initial literature review on the economic value of open data, a set of case studies on that value, and a set of recommendations for how open data can help the G20 and Australia achieve target objectives in the areas of trade, finance, fiscal and monetary policy, anti-corruption, employment, energy, and infrastructure.

Heusser, Felipe I. Understanding Open Government Data and Addressing Its Impact (draft version). World Wide Web Foundation. http://bit.ly/1o9Egym

  • The World Wide Web Foundation, in collaboration with IDRC, has begun a research network to explore the impacts of open data in developing countries. In addition to the Web Foundation and IDRC, the network includes the Berkman Center for Internet and Society at Harvard, the Open Development Technology Alliance, and Practical Participation.

Howard, Alex. San Francisco Looks to Tap Into the Open Data Economy. O’Reilly Radar: Insight, Analysis, and Research about Emerging Technologies. October 19, 2012. Accessed July 24, 2014. http://oreil.ly/1qNRt3h

  • Alex Howard points to San Francisco as one of the first municipalities in the United States to embrace an open data platform.  He outlines how open data has driven innovation in local governance.  Moreover, he discusses the potential impact of open data on job creation and government technology infrastructure in the City and County of San Francisco.

Huijboom, Noor and Tijs Van den Broek. Open Data: An International Comparison of Strategies. European Journal of ePractice. March 2011. Accessed July 24, 2014.  http://bit.ly/1AE24jq

  • This article examines five countries and their open data strategies, identifying key features, main barriers, and drivers of progress for open data programs. The authors outline the key challenges facing European and other national open data policies, highlighting the emerging role open data initiatives are playing in political and administrative agendas around the world.

Manyika, J., Michael Chui, Diana Farrell, Steve Van Kuiken, Peter Groves, and Elizabeth Almasi Doshi. Open Data: Unlocking Innovation and Performance with Liquid Information. McKinsey Global Institute. October 2013. Accessed July 24, 2014. http://bit.ly/1lgDX0v

  • This research focuses on quantifying the potential value of open data in seven “domains” in the global economy: education, transportation, consumer products, electricity, oil and gas, health care, and consumer finance.

Moore, Alida. Congressional Transparency Caucus: How Open Data Creates Jobs. Socrata. April 2, 2014. Accessed July 30, 2014. http://bit.ly/1n7OJpp

  • Socrata provides a summary of the March 24th briefing of the Congressional Transparency Caucus on the need to increase government transparency by adopting open data initiatives. The summary includes key takeaways from the panel discussion, as well as Socrata’s role in making open data available to businesses.

Stott, Andrew. Open Data for Economic Growth. The World Bank. June 25, 2014. Accessed July 24, 2014. http://bit.ly/1n7PRJF

  • In this report, the World Bank examines the evidence for the economic potential of open data, holding that this potential is quite large despite variation in the published estimates and the methodological difficulties of assessing it. The report provides five archetypes of businesses using open data and offers recommendations for governments trying to maximize economic growth from open data.

Request for Proposals: Exploring the Implications of Government Release of Large Datasets


“The Berkeley Center for Law & Technology and Microsoft are issuing this request for proposals (RFP) to fund scholarly inquiry to examine the civil rights, human rights, security and privacy issues that arise from recent initiatives to release large datasets of government information to the public for analysis and reuse.  This research may help ground public policy discussions and drive the development of a framework to avoid potential abuses of this data while encouraging greater engagement and innovation.
This RFP seeks to:

    • Gain knowledge of the impact of the online release of large amounts of data generated by citizens’ interactions with government
    • Imagine new possibilities for technical, legal, and regulatory interventions that avoid abuse
    • Begin building a body of research that addresses these issues

– BACKGROUND –

 
Governments at all levels are releasing large datasets for analysis by anyone for any purpose—“Open Data.” Using Open Data, entrepreneurs may create new products and services, and citizens may use it to gain insight into the government. A plethora of time-saving and otherwise useful applications have emerged from Open Data feeds, including more accurate traffic information, real-time arrival times for public transportation, and information about crimes in neighborhoods. Sometimes governments release large datasets in order to encourage the development of unimagined new applications. For instance, New York City has made over 1,100 databases available, some of which contain information that can be linked to individuals, such as a parking violation database containing license plate numbers and car descriptions.
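The parking-violation example illustrates the linkage risk this RFP asks researchers to examine: a release keyed only by license plate becomes personal data the moment it is joined against an outside source that maps plates to people. A toy Python sketch with fabricated records:

```python
# Fabricated illustration of linkage risk: an "impersonal" violations table
# becomes personal once joined with an outside dataset mapping plates to
# owners. No record here is real.

violations = [
    {"plate": "ABC1234", "location": "W 54th St", "time": "08:45"},
    {"plate": "XYZ9876", "location": "Canal St", "time": "22:10"},
]
plate_owners = {  # hypothetical outside dataset
    "ABC1234": "J. Doe",
    "XYZ9876": "R. Roe",
}

reidentified = [
    {**violation, "owner": plate_owners[violation["plate"]]}
    for violation in violations
    if violation["plate"] in plate_owners
]
print(reidentified)  # each "anonymous" violation now names a person
```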
Data held by the government is often implicitly or explicitly about individuals—acting in roles that have recognized constitutional protection, such as lobbyist, signatory to a petition, or donor to a political cause; in roles that require special protection, such as victim of, witness to, or suspect in a crime; in the role of businessperson submitting proprietary information to a regulator or obtaining a business license; and in the role of ordinary citizen. While open government is often presented as an unqualified good, sometimes Open Data can identify individuals or groups, leading to a more transparent citizenry. The citizen who foresees this growing transparency may be less willing to engage with government, as these transactions may be documented and released in a dataset to anyone to use for any imaginable purpose—including to deanonymize the database—forever. Moreover, some groups of citizens may have few options or no choice as to whether to engage in governmental activities. Hence, open data sets may have a disparate impact on certain groups. The potential impact of large-scale data and analysis on civil rights is an area of growing concern. A number of civil rights and media justice groups banded together in February 2014 to endorse the “Civil Rights Principles for the Era of Big Data,” and the potential of new data systems to undermine longstanding civil rights protections was flagged as a “central finding” of a recent policy review by White House adviser John Podesta.
The Berkeley Center for Law & Technology (BCLT) and Microsoft are issuing this request for proposals in an effort to better understand the implications and potential impact of the release of data related to U.S. citizens’ interactions with their local, state, and federal governments. BCLT and Microsoft will fund up to six grants, with a combined total of $300,000. Grantees will be required to participate in a workshop to present and discuss their research at the Berkeley Technology Law Journal (BTLJ) Spring Symposium. All grantees’ papers will be published in a dedicated monograph. Grantees’ papers that approach the issues from a legal perspective may also be published in the BTLJ. We may also hold a follow-up workshop in New York City or Washington, DC.
While we are primarily interested in funding proposals that address issues related to the policy impacts of Open Data, many of these issues are intertwined with the general societal implications of “big data.” As a result, proposals that explore Open Data from a big data perspective are welcome; however, proposals solely focused on big data are not. We are open to proposals that address the following difficult questions. We are also open to a range of methods and disciplines, and are particularly interested in proposals from cross-disciplinary teams.

    • To what extent does existing Open Data made available by city and state governments affect individual profiling?  Do the effects change depending on the level of aggregation (neighborhood vs. cities)?  What releases of information could foreseeably cause discrimination in the future? Will different groups in society be disproportionately impacted by Open Data?
    • Should the use of Open Data be governed by a code of conduct or subject to a review process before being released? In order to enhance citizen privacy, should governments develop guidelines to release sampled or perturbed data instead of entire datasets (one minimal perturbation approach is sketched after this list)? When datasets contain potentially identifiable information, should there be a notice-and-comment proceeding that includes proposed technological solutions to anonymize, de-identify, or otherwise perturb the data?
    • Is there something fundamentally different about government services and the government’s collection of citizens’ data for basic needs in modern society, such as power and water, that requires governments to exercise greater due care than commercial entities?
    • Companies have legal and practical mechanisms to shield data submitted to government from public release. What mechanisms do individuals have, or should they have, to address misuse of Open Data? Could developments in the constitutional right to informational privacy, as articulated in Whalen v. Roe and United States v. Westinghouse Electric Corp., address Open Data privacy issues?
    • Collecting data costs money, and its release could affect civil liberties.  Yet it is being given away freely, sometimes to immensely profitable firms.  Should governments license data for a fee and/or impose limits on its use, given its value?
    • The privacy principle of “collection limitation” is under siege, with many arguing that use restrictions will be more efficacious for protecting privacy and more workable for big data analysis.  Does the potential of Open Data justify eroding state and federal privacy act collection limitation principles?   What are the ethical dimensions of a government system that deprives the data subject of the ability to obscure or prevent the collection of data about a sensitive issue?  A move from collection restrictions to use regulation raises a number of related issues, detailed below.
    • Are use restrictions efficacious in creating accountability?  Consumer reporting agencies are regulated by use restrictions, yet they are not known for their accountability.  How could use regulations be implemented in the context of Open Data efficaciously?  Can a self-learning algorithm honor data use restrictions?
    • If an Open Dataset were regulated by a use restriction, how could individuals police wrongful uses? How would plaintiffs overcome the likely defenses or proof of facts in a use regulation system, such as a burden to prove that data were analyzed and the product of that analysis was used in a certain way to harm the plaintiff? Will plaintiffs ever be able to beat First Amendment defenses?
    • The President’s Council of Advisors on Science and Technology big data report emphasizes that analysis is not a “use” of data.  Such an interpretation suggests that NSA metadata analysis and large-scale scanning of communications do not raise privacy issues.  What are the ethical and legal implications of the “analysis is not use” argument in the context of Open Data?
    • Open Data celebrates the idea that information collected by the government can be used by another person for various kinds of analysis.  When analysts are not involved in the collection of data, they are less likely to understand its context and limitations.  How do we ensure that this knowledge is maintained in a use regulation system?
    • Former President William Clinton was admitted under a pseudonym for a procedure at a New York Hospital in 2004.  The hospital detected 1,500 attempts by its own employees to access the President’s records.  With snooping such a tempting activity, how could incentives be crafted to cause self-policing of government data and the self-disclosure of inappropriate uses of Open Data?
    • It is clear that data privacy regulation could hamper some big data efforts.  However, many examples of big data successes hail from highly regulated environments, such as health care and financial services—areas with statutory, common law, and IRB protections.  What are the contours of privacy law that are compatible with big data and Open Data success and which are inherently inimical to it?
    • In recent years, the problem of “too much money in politics” has been addressed with increasing disclosure requirements.  Yet, distrust in government remains high, and individuals identified in donor databases have been subjected to harassment.  Is the answer to problems of distrust in government even more Open Data?
    • What are the ethical and epistemological implications of encouraging government decision-making based upon correlation analysis, without a rigorous understanding of cause and effect?  Are there decisions that should not be left to just correlational proof? While enthusiasm for data science has increased, scientific journals are elevating their standards, with special scrutiny focused on hypothesis-free, multiple comparison analysis. What could legal and policy experts learn from experts in statistics about the nature and limits of open data?…
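One of the questions above asks whether governments should release sampled or perturbed data rather than entire datasets. A minimal sketch of perturbed release, in the spirit of differential privacy, adds Laplace noise to published counts; the privacy parameter below is illustrative only, not a recommendation:

```python
import numpy as np

# Minimal "perturbed release" sketch in the spirit of differential privacy:
# add Laplace noise, scaled by sensitivity / epsilon, to a count before
# publication. Epsilon here is illustrative, not a policy recommendation.

def perturbed_count(true_count: int, epsilon: float = 0.5, sensitivity: float = 1.0) -> int:
    """Publish a count with Laplace(sensitivity / epsilon) noise added."""
    noisy = true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return max(0, round(noisy))

# A neighborhood-level count of 42 might be released as, say, 40, 44, or 43.
print([perturbed_count(42) for _ in range(5)])
```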
      To submit a proposal, visit the Conference Management Toolkit (CMT) here.
      Once you have created a profile, the site will allow you to submit your proposal.
      If you have questions, please contact Chris Hoofnagle, principal investigator on this project.”