Doctors’ Individual Opioid Prescription ‘Report Cards’ Show Impact


Scott Calvert at the Wall Street Journal: “Several states, including Arizona, Kentucky and Ohio, are using their state prescription monitoring databases to send doctors individualized ‘report cards’ that show how their prescribing of addictive opioids and other drugs compares with their peers.

“Arizona probably has the most complete one out there right now—it’s pretty impressive,” said Patrick Knue, director of the Prescription Drug Monitoring Program Training and Technical Assistance Center at Brandeis University, which helps states improve their databases.

Arizona’s quarterly reports rate a doctor’s prescribing of oxycodone and certain other drugs as normal, high, severe or extreme compared with the state’s other doctors in his medical specialty.

During a two-year pilot program, the number of opiate prescriptions fell 10% in five counties while rising in other counties, said Dean Wright, former head of the state’s prescription-monitoring program. The report cards also contributed to a 4% drop in overdose deaths in the pilot counties, he said.

The state now issues the report cards statewide, and in June it sent notices to more than 13,000 doctors. Mr. Wright said the message is clear: “Stop and think about what you’re prescribing and the impact it can have.”

The report cards list statistics such as how many of a doctor’s patients received controlled substances from five or more doctors. Elizabeth Dodge, Mr. Wright’s successor, said some doctors ask for the patients’ names—information they might have gleaned from the database….(More)”

Open data, transparency and accountability


Topic guide by Liz Carolan: “…introduces evidence and lessons learned about open data, transparency and accountability in the international development context. It discusses the definitions, theories, challenges and debates presented by the relationship between these concepts, summarises the current state of open data implementation in international development, and highlights lessons and resources for designing and implementing open data programmes.

Open data involves the release of data so that anyone can access, use and share it. The Open Data Charter (2015) describes six principles that aim to make data easier to find, use and combine:

  • open by default
  • timely and comprehensive
  • accessible and usable
  • comparable and interoperable
  • for improved governance and citizen engagement
  • for inclusive development and innovation

One of the main objectives of making data open is to promote transparency.

Transparency is a characteristic of government, companies, organisations and individuals that are open in the clear disclosure of information, rules, plans, processes and actions. Transparency of information is a crucial part of this. Within a development context, transparency and accountability initiatives have emerged over the last decade as a way to address developmental failures and democratic deficits.

There is a strong intersection between open data and transparency as concepts, yet as fields of study and practice, they have remained somewhat separate. This guide draws extensively on analysis and evidence from both sets of literature, beginning by outlining the main concepts and the theories behind the relationships between them.

Data release and transparency are parts of the chain of events leading to accountability. For open data and transparency initiatives to lead to accountability, the required conditions include:

  • getting the right data published, which requires an understanding of the politics of data publication
  • enabling actors to find, process and use information, and to act on any outputs, which requires an accountability ecosystem that includes equipped and empowered intermediaries
  • enabling institutional or social forms of enforceability or citizens’ ability to choose better services, which requires infrastructure that can impose sanctions, or sufficient choice or official support for citizens

Programmes intended to increase access to information can be impacted by and can affect inequality. They can also pose risks to privacy and may enable the misuse of data for the exploitation of individuals and markets.

Despite a range of international open data initiatives and pressures, developing countries are lagging behind in the implementation of reforms at government level, in the overall availability of data, and in the use of open data for transparency and accountability. What is more, there are signs that ‘open-washing’ – superficial efforts to publish data without full integration with transparency commitments – may be obscuring backsliding in other aspects of accountability.

The topic guide pulls together lessons and guidance from open data, transparency and accountability work, including an outline of technical and non-technical aspects of implementing a government open data initiative. It also lists further resources, tools and guidance….(More)”

Data Driven Governments: Creating Value Through Open Government Data


Chapter by Judie Attard, Fabrizio Orlandi and Sören Auer in Transactions on Large-Scale Data- and Knowledge-Centered Systems XXVII: “Governments are one of the largest producers and collectors of data in many different domains and one major aim of open government data initiatives is the release of social and commercial value. Hence, we here explore existing processes of value creation on government data. We identify the dimensions that impact, or are impacted by value creation, and distinguish between the different value creating roles and participating stakeholders. We propose the use of Linked Data as an approach to enhance the value creation process, and provide a Value Creation Assessment Framework to analyse the resulting impact. We also implement the assessment framework to evaluate two government data portals….(More)”

Big Data and Public Policy: Can It Succeed Where E-Participation Has Failed?


Jonathan Bright and Helen Margetts at Policy & Internet: “This editorial introduces a special issue resulting from a panel on Internet and policy organized by the Oxford Internet Institute (University of Oxford) at the 2015 International Conference on Public Policy (ICPP) held in Milan. Two main themes emerged from the panel: the challenges of high cost and low participation which many e-participation initiatives have faced; and the potential Big Data seems to hold for remedying these problems. This introduction briefly presents these themes and links them to the papers in the issue. It argues that Big Data can fix some of the problems typically encountered by e-participation initiatives: it can offer a solution to the problem of low turnout which is furthermore accessible to government bodies even if they have low levels of financial resources. However, the use of Big Data in this way is also a radically different approach to the problem of involving citizens in policymaking; and the editorial concludes by reflecting on the significance of this for the policymaking process….(More)”

“Big Data Europe” addresses societal challenges with data technologies


Press Release: “Across society, from health to agriculture and transport, from energy to climate change and security, practitioners in every discipline recognise the potential of the enormous amounts of data being created every day. The challenge is to capture, manage and process that information to derive meaningful results and make a difference to people’s lives. The Big Data Europe project has just released the first public version of its open source platform designed to do just that. In 7 pilot studies, it is helping to solve societal challenges by putting cutting edge technology in the hands of experts in fields other than IT.

Although many crucial big data technologies are freely available as open source software, they are often difficult for non-experts to integrate and deploy. Big Data Europe solves that problem by providing a package that can readily be installed locally or at any scale in a cloud infrastructure by a systems administrator, and configured via a simple user interface. Tools like Apache Hadoop, Apache Spark, Apache Flink and many others can be instantiated easily….

The tools included in the platform were selected after a process of requirements-gathering across the seven societal challenges identified by the European Commission (Health, Food, Energy, Transport, Climate, Social Sciences and Security). Tasks like message passing are handled using Kafka and Flume, storage by Hive and Cassandra, and publishing through GeoTriples. The platform uses the Docker system to make it easy to add new tools and, again, for them to operate at a scale limited only by the computing infrastructure….
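To make this Docker-based packaging concrete, a minimal compose file might pair a message broker with a column store, each pulled as a ready-made image. The sketch below is illustrative only: the service list, image names and settings are generic assumptions, not taken from the project’s actual repositories.

```yaml
# Illustrative docker-compose.yml in the style of the platform's packaging:
# a Kafka broker (with its ZooKeeper dependency) alongside a Cassandra store.
version: "2"
services:
  zookeeper:
    image: zookeeper:3.4          # coordination service Kafka depends on
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka     # community Kafka image (an assumption)
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
  cassandra:
    image: cassandra:3            # column store for persistent data
    ports:
      - "9042:9042"
```

A systems administrator would then bring the whole stack up with a single `docker-compose up -d`, which is what the press release means by a package that “can readily be installed locally or at any scale in a cloud infrastructure”.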

The platform can be downloaded from GitHub.
See also the installation instructions, Getting Started and video.”

What is being done with open government data?


An exploratory analysis of public uses of New York City open data by Karen Okamoto in Webology: “In 2012, New York City Council passed legislation to make government data open and freely available to the public. By approving this legislation, City Council was attempting to make local government more transparent, accountable, and streamlined in its operations. It was also attempting to create economic opportunities and to encourage the public to identify ways in which to improve government and local communities. The purpose of this study is to explore public uses of New York City open data. Currently, more than 1300 datasets covering broad areas such as health, education, transportation, public safety, housing and business are available on the City’s Open Data Portal. This study found a plethora of maps, visualizations, tools, apps and analyses made by the public using New York City open data. Indeed, open data is inspiring a productive range of creative reuses, yet questions remain concerning how usable the data is for users without technical skills and resources….(More)”
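For context on what such reuse looks like in practice: the City’s portal serves every dataset as JSON at a stable resource URL through the Socrata Open Data (SODA) API. The sketch below builds such a query URL; the dataset id `erm2-nwe9` (311 Service Requests) and the field names are assumptions that may change with the catalogue.

```python
# Sketch of querying NYC's Open Data Portal via its Socrata (SODA) API.
# "erm2-nwe9" and the "borough" field are assumptions for illustration.
from urllib.parse import urlencode

BASE = "https://data.cityofnewyork.us/resource"

def soda_query(dataset_id, **params):
    """Build a SODA query URL; $limit, $where etc. are standard SODA parameters."""
    return f"{BASE}/{dataset_id}.json?{urlencode(params)}"

url = soda_query("erm2-nwe9", **{"$limit": "5", "$where": "borough='BROOKLYN'"})
print(url)  # fetch this with any HTTP client to get five Brooklyn 311 records
```

This low barrier to entry (one URL, no API key for basic reads) is part of why the study found such a wide range of maps and apps, while the usability gap remains for residents without these skills.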

Data for Policy: Data Science and Big Data in the Public Sector


Innar Liiv at OXPOL: “How can big data and data science help policy-making? This question has recently gained increasing attention. Both the European Commission and the White House have endorsed the use of data for evidence-based policy making.

Still, a gap remains between theory and practice. In this blog post, I make a number of recommendations for systematic development paths.

RESEARCH TRENDS SHAPING DATA FOR POLICY

‘Data for policy’ as an academic field is still in its infancy. A typology of the field’s foci and research areas is summarised in the figure below.

 

[Figure: typology of ‘data for policy’ foci and research areas]

 

Besides the ‘data for policy’ community, there are two important research trends shaping the field: 1) computational social science; and 2) the emergence of politicised social bots.

Computational social science (CSS) is a new interdisciplinary research trend in social science, which tries to transform advances in big data and data science into research methodologies for understanding, explaining and predicting underlying social phenomena.

Social science has a long tradition of using computational and agent-based modelling approaches (e.g. Schelling’s Model of Segregation), but the new challenge is to feed real-life, and sometimes even real-time, information into those systems to gain rapid insights into the validity of research hypotheses.

For example, one could use mobile phone call records to assess the acculturation processes of different communities. Such a project would involve translating different acculturation theories into computational models, researching the ethical and legal issues inherent in using mobile phone data and developing a vision for generating policy recommendations and new research hypotheses from the analysis.

Politicised social bots are also beginning to make their mark. In 2011, DARPA solicited research proposals dealing with social media in strategic communication. The term ‘political bot’ was not used, but the expected results left no doubt about the goals…

The next wave of e-government innovation will be about analytics and predictive models. Taking advantage of their potential for social impact will require a solid foundation of e-government infrastructure.

The most important questions going forward are as follows:

  • What are the relevant new data sources?
  • How can we use them?
  • What should we do with the information? Who cares? Which political decisions need faster information from novel sources? Do we need faster information? Does it come with unanticipated risks?

These questions barely scratch the surface, because the complex interplay between general advancements of computational social science and hovering satellite topics like political bots will have an enormous impact on research and using data for policy. But, it’s an important start….(More)”

Can Direct Democracy Be Revived Through New Voting Apps?


Adele Peters at FastCo-Exist: “…a new app and proposed political party called MiVote—aims to rethink how citizens participate in governance. Instead of voting only in elections, people using the app can share their views on every issue the government considers. The idea is that parliamentary representatives of the “MiVote party” would commit to support legislation only when it’s in line with the will of the app’s members—regardless of the representative’s own opinion….

Like Democracy Earth, a nonprofit that started in Argentina, MiVote uses the blockchain to make digital voting and identity fully secure. Democracy Earth also plans to use a similar model of representation, running candidates who promise to adhere to the results of online votes rather than a particular ideology.

But MiVote takes a somewhat different approach to gathering opinions. The app will give users a notification when a new issue is addressed in the Australian parliament. Then, voters get access to a digital “information packet,” compiled by independent researchers, that lets them dive into four different approaches.

“We don’t talk about the bill or the legislation at all,” says Jacoby. “If you put it into a business context, the bill or the legislation is the contract. In no business would you write the contract before you know what the deal looks like. If we’re looking for genuine democracy, the bill has to be determined by the people . . . Once we know where the people want to go, then we focus on making sure the bill gets us there.”

If the parliament is going to vote about immigration, for example, you might get details about a humanitarian approach, a border security approach, a financially pragmatic approach, and an approach that focuses on international relations. For each frame of reference, the app lets you dive into as much information as you need to decide. If you don’t read anything, it won’t let you cast a vote.

“We’re much more interested in a solutions-oriented approach rather than an ideological approach,” he says. “Ideology basically says I have the answer for you before you’ve even asked the question. There is no ideology, no worldview, that has the solution to everything that ails us.”

Representatives of this hypothetical new party won’t have to worry about staying on message, because there is no message; the only goal is to vote after the people speak. That might free politicians to focus on solutions rather than their image…(More)”

Against transparency


At Vox: “…Digital storage is pretty cheap and easy, so maybe the next step in open government is ubiquitous surveillance of public servants paired with open access to the recordings.

As a journalist and an all-around curious person, I can’t deny there’s something appealing about this.

Historians, too, would surely love to know everything that President Obama and his top aides said to one another regarding budget negotiations with John Boehner rather than needing to rely on secondhand news accounts influenced by the inevitable demands of spin. By the same token, historians surely would wish that there were a complete and accurate record of what was said at the Constitutional Convention in 1787 that, instead, famously operated under a policy of anonymous discussions.

But we should be cautioned by James Madison’s opinion that “no Constitution would ever have been adopted by the convention if the debates had been public.”

His view, which seems sensible, is that public or recorded debates would have been simply exercises in position-taking rather than deliberation, with each delegate playing to his base back home rather than working toward a deal.

“Had the members committed themselves publicly at first, they would have afterwards supposed consistency required them to maintain their ground,” Madison wrote, “whereas by secret discussion no man felt himself obliged to retain his opinions any longer than he was satisfied of their propriety and truth, and was open to the force of argument.”

The example comes to me by way of Cass Sunstein, who formerly held a position as a top regulatory czar in Obama’s White House, and who delivered a fascinating talk on the subject of government transparency at a June 2016 Columbia symposium on the occasion of the anniversary of the Freedom of Information Act.

Sunstein asks us to distinguish between disclosure of the government’s outputs and disclosure of the government’s inputs. Output disclosure is something like the text of the Constitution or when the Obama administration had Medicare change decades of practice and begin publishing information about what Medicare pays to hospitals and other health providers.

Input disclosure would be something like the transcript of the debates at the Constitutional Convention or a detailed record of the arguments inside the Obama administration over whether to release the Medicare data. Sunstein’s argument is that it is a mistake to simply conflate the two ideas of disclosure under one broad heading of “transparency” when considerations around the two are very different.

Public officials need to have frank discussions

The fundamental problem with input disclosure is that in addition to serving as a deterrent to misconduct, it serves as a deterrent to frankness and honesty.

There are a lot of things that colleagues might have good reason to say to one another in private that would nonetheless be very damaging if they went viral on Facebook:

  • Healthy brainstorming processes often involve tossing out bad or half-baked ideas in order to stimulate thought and elevate better ones.
  • A realistic survey of options may require a blunt assessment of the strengths and weaknesses of different members of the team or of outside groups that would be insulting if publicized.
  • Policy decisions need to be made with political sustainability in mind, but part of making a politically sustainable policy decision is that you don’t come out and say you made the decision with politics in mind.
  • Someone may want to describe an actual or potential problem in vivid terms to spur action, without wanting to provoke public panic or hysteria through public discussion.
  • If a previously embarked-upon course of action isn’t working, you may want to quietly change course rather than publicly admit failure.

Journalists are, of course, interested in learning about all such matters. But it’s precisely because such things are genuinely interesting that making disclosure inevitable is risky.

Ex post facto disclosure of discussions whose participants didn’t realize they would be disclosed would be fascinating and useful. But after a round or two of disclosure, the atmosphere would change. Instead of peeking in on a real decision-making process, you would have every meeting dominated by the question “what will this look like on the home page of Politico?”…(More)”

Encouraging and Sustaining Innovation in Government: Technology and Innovation in the Next Administration


New report by Beth Simone Noveck and Stefaan Verhulst: “…With rates of trust in government at an all-time low, technology and innovation will be essential to achieve the next administration’s goals and to deliver services more effectively and efficiently. The next administration must prioritize using technology to improve governing and must develop plans to do so in the transition… This paper provides analysis and a set of concrete recommendations, both for the period of transition before the inauguration, and for the start of the next presidency, to encourage and sustain innovation in government. Leveraging the insights from the experts who participated in a day-long discussion, we endeavor to explain how government can improve its use of digital technologies to create more effective policies, solve problems faster and deliver services more effectively at the federal, state and local levels….

The broad recommendations are:

  • Scale Data Driven Governance: Platforms such as data.gov represent initial steps in the direction of enabling data-driven governance. Much more can be done, however, to open up data and for the agencies to become better consumers of data, to improve decision-making and scale up evidence-based governance. This includes better use of predictive analytics, more public engagement, and greater use of cutting-edge methods like machine learning.
  • Scale Collaborative Innovation: Collaborative innovation takes place when government and the public work together, thus widening the pool of expertise and knowledge brought to bear on public problems. The next administration can reach out more effectively, not just to the public at large, but through targeted outreach to public officials and citizens who possess the most relevant skills or expertise for the problems at hand.
  • Promote a Culture of Innovation: Institutionalizing a culture of technology-enabled innovation will require embedding and institutionalizing innovation and technology skills more widely across the federal enterprise. For example, contracting, grants and personnel officials need to have a deeper understanding of how technology can help them do their jobs more efficiently, and more people need to be trained in human-centered design, gamification, data science, data visualization, crowdsourcing and other new ways of working.
  • Utilize Evidence-Based Innovation: In order to better direct government investments, leaders need a much better sense of what works and what doesn’t. The government spends billions on research in the private and university sectors, but spends very little on experimenting with, testing, and evaluating its own programs. The next administration should continue developing an evidence-based approach to governance, including a greater use of methods like A/B testing (a method of comparing two versions of a webpage or app against each other to determine which one performs better); establishing a clearinghouse for success and failure stories and best practices; and encouraging overseers to be more open to innovation.
  • Make Innovation a Priority in the Transition: The transition period represents a unique opportunity to seed the foundations for long-lasting change. By explicitly incorporating innovation into the structure, goals and activities of the transition teams, the next administration can get a fast start in implementing policy goals and improving government operations through innovation approaches….(More)”
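The A/B testing mentioned in the recommendations above reduces, statistically, to comparing two observed rates. A minimal standard-library sketch of that comparison as a two-proportion z-test; the visitor and conversion counts are invented for illustration.

```python
# Two-proportion z-test: the statistic behind a basic A/B test.
# All counts below are invented for illustration.
from math import sqrt

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for H0: versions A and B convert at the same rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    return (p_b - p_a) / se

# Version A converts 120 of 2,000 visitors; version B converts 165 of 2,000.
z = ab_test_z(120, 2000, 165, 2000)
print(round(z, 2))  # 2.77 — |z| > 1.96, so significant at the 5% level
```

In a government context the two “versions” might be two wordings of a benefits-renewal letter, with take-up as the conversion rate being compared.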