Health Citizenship: A New Social Contract To Improve The Clinical Trial Process


Essay by Cynthia Grossman and Tanisha Carino: “…We call this new social contract health citizenship, which includes a set of implied rights and responsibilities for all parties.

Three fundamental truths underpin our efforts:

  1. The path to better health and the advancement of science begin and end with engaged patients.
  2. The biomedical research enterprise lives all around us — in clinical trials, the data in our wearables, electronic health records, and data used for payment.
  3. The stakeholders that fuel advancement — clinicians, academia, government, the private sector, and investors — must create a system focused on speeding medical research and ensuring that patients have appropriate access to treatments.

To find tomorrow’s cures, treatments, and prevention measures, every aspect of society needs to get involved. Health citizenship recognizes that the future of innovative research and development depends on both patients and the formal healthcare system stepping up to the plate.

Moving Toward A Culture Of Transparency  

Increasing clinical trial registration and the posting of research results are steps in the direction of transparency. Access to information about clinical trials — enrollment criteria, endpoints, locations, and results — is critical to empowering patients, their families, and primary care physicians. Transparency also has a cascading impact on the cost and speed of scientific discovery by ensuring the validation and reproducibility of results…

Encouraging Data Sharing

Data is the currency of biomedical research, and now patients are poised to contribute more of it than ever. In fact, many patients who participate in clinical research expect that their data will be shared and want to be partners, not just participants, in how data is used to advance the science and clinical practice that impact their disease or condition.

Engaging more patients in data sharing is only one part of what is needed to advance a data-sharing ecosystem. The National Academies of Sciences, Engineering, and Medicine (which absorbed the former Institute of Medicine) conducted a consensus study that details the challenges to clinical trial data sharing. That study spun off a new data-sharing platform, Vivli, which will launch publicly this year. The New England Journal of Medicine took an important step toward demonstrating the value of sharing clinical trial data through its SPRINT Data Challenge, in which it opened up a data set and supported projects that sought to derive new insights from the existing data. Examples like these will go a long way toward demonstrating the value of data sharing for advancing science, academic careers, and, most importantly, patient health.

As the technology for sharing clinical trial data improves, it will become less of an impediment than the misalignment of incentives. The academic environment rewards researchers with first-author and top-tier journal publications, which contributes to investigators holding on to clinical trial data. A recent publication suggests a way to ensure academic credit for sharing data sets, in the form of publication credit, by allowing investigators to tag data sets with unique IDs.

While this effort could assist in incentivizing data sharing, we see tagging data sets primarily as a way to rapidly gather evidence of the value of data sharing, including what types of data sets are taken up for analysis and what types of analyses or actions are most valuable. This type of information is currently missing, and, without the value proposition, it is difficult to encourage data-sharing behavior.

The value of clinical trial data will need to be reexamined collectively, embracing the sharing of data both across clinical trials and in combination with other types of data. Just as the airline and car manufacturing industries share data in support of public safety, as more evidence is gathered on the impact of clinical trial data sharing and as the technology to do this safely and securely matures, the incentives, resources, and equity issues will need to be addressed collectively…(More)”.

Data sharing in PLOS ONE: An analysis of Data Availability Statements


Lisa M. Federer et al. at PLOS ONE: “A number of publishers and funders, including PLOS, have recently adopted policies requiring researchers to share the data underlying their results and publications. Such policies help increase the reproducibility of the published literature, as well as make a larger body of data available for reuse and re-analysis. In this study, we evaluate the extent to which authors have complied with this policy by analyzing Data Availability Statements from 47,593 papers published in PLOS ONE between March 2014 (when the policy went into effect) and May 2016. Our analysis shows that compliance with the policy has increased, with a significant decline over time in papers that did not include a Data Availability Statement. However, only about 20% of statements indicate that data are deposited in a repository, which the PLOS policy states is the preferred method. More commonly, authors state that their data are in the paper itself or in the supplemental information, though it is unclear whether these data meet the level of sharing required in the PLOS policy. These findings suggest that additional review of Data Availability Statements or more stringent policies may be needed to increase data sharing….(More)”.
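As a rough illustration of the kind of categorization such an analysis involves (a keyword-heuristic sketch, not the authors' published coding scheme; the category labels and keyword lists below are assumptions), a short script might bucket statements into the broad groups the study reports:

```python
import re

# Illustrative sketch only: sort Data Availability Statements into coarse
# categories using keyword heuristics. The categories and keywords are
# assumptions for demonstration, not the PLOS ONE study's actual method.
CATEGORIES = [
    ("repository", r"dryad|figshare|genbank|zenodo|dataverse|repository|doi\.org"),
    ("in_paper_or_si", r"within the paper|supporting information|supplementary"),
    ("upon_request", r"upon request|on request|contact the (corresponding )?author"),
]

def classify_statement(statement: str) -> str:
    """Return the first matching category, or 'other' if nothing matches."""
    text = statement.lower()
    for label, pattern in CATEGORIES:
        if re.search(pattern, text):
            return label
    return "other"

examples = [
    "All data files are available from the Dryad Digital Repository.",
    "All relevant data are within the paper and its Supporting Information files.",
    "Data are available upon request from the corresponding author.",
]
for s in examples:
    print(f"{classify_statement(s):>15}  {s}")
```

In the published analysis, the roughly 20% of statements falling into the repository category is the figure the authors flag as falling short of PLOS's stated preference.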

Creating a Machine Learning Commons for Global Development


Blog by Hamed Alemohammad: “Advances in sensor technology, cloud computing, and machine learning (ML) continue to converge to accelerate innovation in the field of remote sensing. However, fundamental tools and technologies still need to be developed to drive further breakthroughs and to ensure that the Global Development Community (GDC) reaps the same benefits that the commercial marketplace is experiencing. This process requires us to take a collaborative approach.

Data collaborative innovation — that is, a group of actors from different data domains working together toward common goals — might hold the key to finding solutions for some of the global challenges that the world faces. That is why Radiant.Earth is investing in new technologies such as Cloud Optimized GeoTIFFs, SpatioTemporal Asset Catalogs (STAC), and ML. Our approach to advancing ML for global development begins with creating open libraries of labeled images and algorithms. This initiative and others require — and, in fact, will thrive as a result of — a data collaborative approach.
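To make the idea of open libraries of labeled imagery concrete, here is a minimal sketch of what a single catalog entry could look like as a STAC Item: a GeoJSON-style record pointing to a Cloud Optimized GeoTIFF and an accompanying label file. The field names follow the public STAC specification, but the id, URLs, coordinates, and timestamp are hypothetical placeholders, not anything from Radiant.Earth's actual catalog.

```python
import json

# Hypothetical STAC Item: one labeled scene stored as a Cloud Optimized GeoTIFF.
# Field names follow the STAC spec; all values below are made-up placeholders.
stac_item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "example-labeled-scene-001",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[36.8, -1.4], [36.9, -1.4], [36.9, -1.3],
                         [36.8, -1.3], [36.8, -1.4]]],
    },
    "bbox": [36.8, -1.4, 36.9, -1.3],
    "properties": {"datetime": "2018-01-15T10:00:00Z"},
    "assets": {
        "image": {
            "href": "https://example.org/scenes/scene-001.tif",
            "type": "image/tiff; application=geotiff; profile=cloud-optimized",
            "roles": ["data"],
        },
        "labels": {
            "href": "https://example.org/scenes/scene-001-labels.geojson",
            "type": "application/geo+json",
            "roles": ["labels"],
        },
    },
    "links": [],
}

print(json.dumps(stac_item, indent=2))
```

Because the imagery is a Cloud Optimized GeoTIFF served over HTTP, a training pipeline can read only the tiles it needs via range requests instead of downloading whole scenes, which is what makes catalogs like this practical for ML at scale.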

“Data is only as valuable as the decisions it enables.”

This quote by Ion Stoica, professor of computer science at the University of California, Berkeley, may best describe the challenge facing those of us who work with geospatial information:

How can we extract greater insights and value from the unending tsunami of data that is before us, allowing for more informed and timely decision making?…(More)”.

Redefining ‘impact’ so research can help real people right away, even before becoming a journal article


Perhaps nowhere is impact of greater importance than in my own fields of ecology and conservation science. Researchers often conduct this work with the explicit goal of contributing to the restoration and long-term survival of the species or ecosystem in question. For instance, research on an endangered plant can help to address the threats facing it.

But scientific impact is a very tricky concept. Science is a process of inquiry; it’s often impossible to know what the outcomes will be at the start. Researchers are asked to imagine potential impacts of their work. And people who live and work in the places where the research is conducted may have different ideas about what impact means.

In collaboration with several Bolivian colleagues, I studied perceptions of research and its impact in a highly biodiverse area in the Bolivian Amazon. We found that researchers – both foreign-based and Bolivian – and people living and working in the area had different hopes and expectations about what ecological research could help them accomplish…

Eighty-three percent of researchers queried told us their work had implications for management at community, regional and national levels rather than at the international level. For example, knowing the approximate populations of local primate species can be important for communities who rely on the animals for food and ecotourism.

But the scale of relevance didn’t necessarily dictate how researchers actually disseminated the results of their work. Rather, we found that the strongest predictor of how and with whom a researcher shared their work was whether they were based at a foreign or national institution. Foreign-based researchers had extremely low levels of local, regional or even national dissemination. However, they were more likely than national researchers to publish their findings in the international literature….

Rather than impact being addressed at the end of research, societal impacts can be part of the first stages of a study. For example, people living in the region where data is to be collected might have insight into the research questions being investigated; scientists need to build in time and plan ways to ask them. Ecological fieldwork presents many opportunities for knowledge exchange, new ideas and even friendships between different groups. Researchers can take steps to engage more directly with community life, such as by taking a few hours to teach local school kids about their research….(More)”.

Transforming the Future: Anticipation in the 21st Century


Open Access book by Riel Miller: “People are using the future to search for better ways to achieve sustainability, inclusiveness, prosperity, well-being and peace. In addition, the way the future is understood and used is changing in almost all domains, from social science to daily life.

This book presents the results of significant research undertaken by UNESCO with a number of partners to detect and define the theory and practice of anticipation around the world today. It uses the concept of ‘Futures Literacy’ as a tool to define the understanding of anticipatory systems and processes – also known as the Discipline of Anticipation. This innovative title explores:

  • new topics such as Futures Literacy and the Discipline of Anticipation;
  • the evidence collected from over 30 Futures Literacy Laboratories and presented in 14 full case studies;
  • the need and opportunity for significant innovation in human decision-making systems.

This book will be of great interest to scholars, researchers, policy-makers and students, as well as activists working on sustainability issues and innovation, future studies and anticipation studies….(More)”.

China asserts firm grip on research data


ScienceMag: “In a move few scientists anticipated, the Chinese government has decreed that all scientific data generated in China must be submitted to government-sanctioned data centers before appearing in publications. At the same time, the regulations, posted last week, call for open access and data sharing.

The possibly conflicting directives puzzle researchers, who note that the yet-to-be-established data centers will have latitude in interpreting the rules. Scientists in China can still share results with overseas collaborators, says Xie Xuemei, who specializes in innovation economics at Shanghai University. Xie also believes that the new requirements to register data with authorities before submitting papers to journals will not affect most research areas. Gaining approval could mean publishing delays, Xie says, but “it will not have a serious impact on scientific research.”

The new rules, issued by the powerful State Council, apply to all groups and individuals generating research data in China. The creation of a national data center will apparently fall to the science ministry, though other ministries and local governments are expected to create their own centers as well. Exempted from the call for open access and sharing are data involving state and business secrets, national security, “public interest,” and individual privacy… (More)”

5 Tips for Launching (and Sustaining) a City Behavioral Design Team


Playbook by ideas42: “…To pave the way for other municipalities to start a Behavioral Design Team (BDT), we distilled years of rigorously tested results and real-world best practices into an open-source playbook for public servants at all levels of government. The playbook introduces readers to core concepts of behavioral design, indicates why and where a BDT can be effective, lays out the fundamental competencies and structures governments will need to set up a BDT, and provides guidance on how to successfully run one. It also includes several applicable examples from our New York and Chicago teams to illustrate the tangible impact behavioral science can have on citizens and outcomes.

Thinking about starting a BDT? Here are five tips for launching (and sustaining) a city behavioral design team. For more insights, read the full playbook.

Compose your team with care

While there is no exact formula, a well-staffed BDT needs expertise in three key areas: behavioral science, research and evaluation, and public policies and programs. You’ll rarely find all three in one person—hence the need to gather a team of people with complementary skills. Some key things to look for as you assemble your team: background in behavioral economics or social psychology, formal training in impact evaluation and statistics, and experience working in government positions or nonprofits that implement government programs.

Choose an anchor agency

To more quickly build momentum, consider identifying an “anchor” agency. A high-profile partner can help you establish credibility and can facilitate interactions with different departments across your government. Having an anchor agency legitimizes the BDT and helps reduce any apprehension among other agencies. The initial projects with the anchor agency will help others understand both what it means to work with the BDT and what kinds of outcomes to expect.

Establish your criteria for selecting projects

Once you get people bought in and excited about innovating with behavioral science, the possible problems to tackle can seem limitless. Before selecting projects, set up clear criteria for prioritizing which problems need attention most and which are best suited to behavioral solutions. While the exact criteria will naturally vary from place to place, in the playbook we share the criteria the New York and Chicago BDTs use to prioritize potential undertakings and determine their viability, which other teams can use as a starting place.

Build buy-in with a mix of project types

If you run only randomized controlled trials (RCTs), which require implementation and data collection, it may be challenging to generate the buy-in and enthusiasm a BDT needs to thrive in its early days. That’s why incorporating some shorter engagements, including design-only projects or pre-post evaluations, can help sustain momentum by quickly generating evidence—and demonstrate that your BDT gets results.
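As a hedged sketch of what the quick evidence from a pre-post evaluation might look like (the program, counts, and threshold below are made-up assumptions, not results from any actual BDT project), a pre-post comparison can often be reduced to a simple two-proportion z-test:

```python
from math import sqrt

# Illustrative pre-post evaluation: compare a completion rate before and after
# a behaviorally informed redesign. All numbers are hypothetical placeholders.
def two_proportion_ztest(successes_pre: int, n_pre: int,
                         successes_post: int, n_post: int) -> float:
    """z-statistic for H0: the pre and post completion rates are equal."""
    p_pre, p_post = successes_pre / n_pre, successes_post / n_post
    p_pool = (successes_pre + successes_post) / (n_pre + n_post)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_pre + 1 / n_post))
    return (p_post - p_pre) / se

# Hypothetical example: benefit-renewal forms completed before vs. after
# a simplified reminder letter went out.
z = two_proportion_ztest(successes_pre=420, n_pre=3000,
                         successes_post=510, n_post=3000)
print(f"pre rate 14.0%, post rate 17.0%, z = {z:.2f}")  # |z| > 1.96 ~ p < 0.05
```

The trade-off is that, unlike an RCT, a pre-post design has no concurrent control group, so seasonal shifts or other simultaneous changes can masquerade as impact, which is presumably why these shorter engagements are framed as momentum-builders rather than substitutes for rigorous evaluation.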

Keep learning and growing

Applying behavioral design within government programs is still relatively novel. This open-source playbook provides guidance for starting a BDT, but constant learning and iterating should be expected! As BDTs mature and evolve, they must also become more ambitious in their scope, particularly once the low-hanging fruit and other more obvious problems, which are helpful for building buy-in and establishing proof of concept, have been addressed. The long-term goal of any successful BDT is to tackle the most challenging and impactful problems in government programs and policies head-on and use the solutions to help the people who need them most…(More)”

Can government stop losing its mind?


Report by Gavin Starks: “Can government remember? Is it condemned to repeat mistakes? Or does it remember too much and so see too many reasons why anything new is bound to fail?

While we are at the beginning of a data revolution, we are also at a point where the deluge of data is creating the potential for an ‘information collapse’ in complex administrations: structured information and knowledge are lost in the noise or, worse, misinformation rises as fact.

There are many reasons for this: the technical design of systems, turnover of people, and contracting out. Information is stored in silos and often guarded jealously. Cultural and process issues lead to poor use of technologies. Knowledge is both formal (codified) and informal (held in people’s brains). The greatest value will be unlocked by combining these with existing and emerging tools.

This report sets out how the public sector could benefit from a federated, data-driven approach: one that provides greater power to its leaders, benefits its participants and users, and improves performance through better use of, and structured access to, data.

The report explores examples from the Open Data Institute, Open Banking Standard, BBC Archives, Ministry of Justice, NHS Blood and Transplant, Defence Science and Technology Laboratory and Ministry of Defence.

Recommendations:

  1. Design for open; build for search
  2. Build reciprocity into data supply chains
  3. Develop data ethics standards that can evolve at pace
  4. Create a Digital Audit Office
  5. Develop and value a culture of network thinking

To shorten the path between innovation and policy in a way that is repeatable and scalable, the report proposes that six areas of focus be considered in any implementation design.

  1. Policy: Providing strategic leadership and governance; framing and analysing economic, legal and regulatory impacts (e.g. GDPR, data ethics, security) and highlighting opportunities and threats.
  2. Culture: Creating compelling peer, press and public communication and engagement that both address concerns and inspire people to engage in the solutions.
  3. Making: Commissioning startups, running innovation competitions and programmes to create practice-based evidence that illustrates the challenges and business opportunities.
  4. Learning: Creating training materials that aid implementation and defining evidence-based sustainable business models that are anchored around user needs.
  5. Standards: Defining common human and machine processes that enable both repeatability and scale within commercial and non-commercial environments.
  6. Infrastructure: Defining and framing how people and machines will use data, algorithms and open APIs to create sustainable impact….(More)”.

Data in the EU: Commission steps up efforts to increase availability and boost healthcare data sharing


Press Release: “Today, the European Commission is putting forward a set of measures to increase the availability of data in the EU, building on previous initiatives to boost the free flow of non-personal data in the Digital Single Market.

Data-driven innovation is a key enabler of market growth and job creation, particularly for SMEs and startups, and of the development of new technologies. It allows citizens to easily access and manage their health data, and allows public authorities to use data better in research, prevention and health system reforms….

Today’s proposals build on the General Data Protection Regulation (GDPR), which will enter into application as of 25 May 2018. They will ensure:

  • Better access to and reusability of public sector data: A revised law on Public Sector Information covers data held by public undertakings in the transport and utilities sectors. The new rules limit the exceptions that allow public bodies to charge more than the marginal costs of data dissemination for the reuse of their data. They also facilitate the reusability of open research data resulting from public funding, and oblige Member States to develop open access policies. Finally, the new rules require – where applicable – technical solutions like Application Programming Interfaces (APIs) to provide real-time access to data (see the sketch after this list).
  • Scientific data sharing in 2018: A new set of recommendations addresses the policy and technological changes since the last Commission proposal on access to and preservation of scientific information. They offer guidance on implementing open access policies in line with open science objectives, research data and data management, the creation of a European Open Science Cloud, and text and data mining. They also highlight the importance of incentives, rewards, skills and metrics appropriate for the new era of networked research.
  • Private sector data sharing in business-to-business and business-to-government contexts: A new Communication entitled “Towards a common European data space” provides guidance for businesses operating in the EU on the legal and technical principles that should govern data sharing collaboration in the private sector.
  • Securing citizens’ healthcare data while fostering European cooperation: The Commission is today setting out a plan of action that puts citizens first when it comes to data on citizens’ health: by securing citizens’ access to their health data and introducing the possibility to share their data across borders; by using larger data sets to enable more personalised diagnoses and medical treatment, and better anticipate epidemics; and by promoting appropriate digital tools, allowing public authorities to better use health data for research and for health system reforms. Today’s proposal also covers the interoperability of electronic health records as well as a mechanism for voluntary coordination in sharing data – including genomic data – for disease prevention and research….(More)”.
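As a hedged illustration of the kind of “real-time access” API the first bullet refers to (the endpoint, dataset, and field names are hypothetical assumptions, not part of any EU or Member State system), a public undertaking's open-data feed might be as simple as:

```python
from datetime import datetime, timezone
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a live feed from a public undertaking (e.g., a transport operator).
# In a real system this would be read from the operator's scheduling database.
DEPARTURES = [
    {"line": "12", "stop": "Central Station", "due_in_minutes": 3},
    {"line": "7", "stop": "Harbour", "due_in_minutes": 9},
]

@app.route("/api/v1/departures")
def departures():
    """Return current departures plus a timestamp as open, machine-readable JSON."""
    return jsonify({
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "departures": DEPARTURES,
    })

if __name__ == "__main__":
    app.run(port=8000)
```

Serving marginal-cost (or free) access through an open endpoint like this, rather than through bespoke bulk contracts, is the kind of reuse the revised Public Sector Information rules aim to enable.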

The Efficiency Paradox: What Big Data Can’t Do


Book by Edward Tenner: “A bold challenge to our obsession with efficiency–and a new understanding of how to benefit from the powerful potential of serendipity

Algorithms, multitasking, the sharing economy, life hacks: our culture can’t get enough of efficiency. One of the great promises of the Internet and big data revolutions is the idea that we can improve the processes and routines of our work and personal lives to get more done in less time than we ever have before. There is no doubt that we’re performing at higher levels and moving at unprecedented speed, but what if we’re headed in the wrong direction?

Melding the long-term history of technology with the latest headlines and findings of computer science and social science, The Efficiency Paradox questions our ingrained assumptions about efficiency, persuasively showing how relying on the algorithms of digital platforms can in fact lead to wasted efforts, missed opportunities, and above all an inability to break out of established patterns. Edward Tenner offers a smarter way of thinking about efficiency, revealing what we and our institutions, when equipped with an astute combination of artificial intelligence and trained intuition, can learn from the random and unexpected….(More)”