The Art of Managing Complex Collaborations


Eric Knight, Joel Cutcher-Gershenfeld, and Barbara Mittleman at MIT Sloan Management Review: “It’s not easy for stakeholders with widely varying interests to collaborate effectively in a consortium. The experience of the Biomarkers Consortium offers five lessons on how to successfully navigate the challenges that arise….

Society’s biggest challenges are also its most complex. From shared economic growth to personalized medicine to global climate change, few of our most pressing problems are likely to have simple solutions. Perhaps the only way to make progress on these and other challenges is by bringing together the important stakeholders on a given issue to pursue common interests and resolve points of conflict.

However, it is not easy to assemble such groups or to keep them together. Many initiatives have stumbled and disbanded. The Biomarkers Consortium might have been one of them, but this consortium beat the odds, in large part due to the founding parties’ determination to make it work. Nine years after it was founded, this public-private partnership, which is managed by the Foundation for the National Institutes of Health and based in Bethesda, Maryland, is still working to advance the availability of biomarkers (biological indicators for disease states) as tools for drug development, including applications at the frontiers of personalized medicine.

The Biomarkers Consortium’s mandate — to bring together, in the group’s words, “the expertise and resources of various partners to rapidly identify, develop, and qualify potential high-impact biomarkers particularly to enable improvements in drug development, clinical care, and regulatory decision-making” — may look simple. However, the reality has been quite complex. The negotiations that led to the consortium’s formation in 2006 were complicated, and the subsequent balancing of common and competing interests remains challenging….

Many in the biomedical sector had seen the need to tackle drug discovery costs for a long time, with multiple companies concurrently spending millions, sometimes billions, of dollars only to hit common dead ends in the drug development process. In 2004 and 2005, then National Institutes of Health director Elias Zerhouni convened key people from the U.S. Food and Drug Administration, the NIH, and the Pharmaceutical Research and Manufacturers of America to create a multistakeholder forum.

Every member knew from the outset that their fellow stakeholders represented many divergent and sometimes opposing interests: large pharmaceutical companies, smaller entrepreneurial biotechnology companies, FDA regulators, NIH science and policy experts, university researchers and nonprofit patient advocacy organizations….(More)”

On the morals of network research and beyond


Conspicuous Chatter: “…Discussions on ethics have become very popular in computer science lately — and to some extent I am glad about this. However, I think we should dispel three key fallacies.

The first is that things we do not like (some may brand them “immoral”) happen because others do not think of the moral implications of their actions. In fact it is entirely possible that they do, and decide to act in a manner we do not like nonetheless. This could be out of conviction: those who built the surveillance equipment, who argue against strong encryption, and those who do the torture and the killing (the harm) may have entirely self-righteous ways of justifying their actions to themselves and others. Others may simply be making a good buck — and there are plenty of examples of this in the links above.

The second fallacy is that ethics, and research ethics more specifically, comes down to a “common sense” variant of “do no harm” — and that is that. In fact ethics, as a philosophical discipline, is extremely deep, and there are plenty of entirely legitimate ways to argue that doing harm is perfectly fine. If the authors of the paper were a bit more sophisticated in their philosophy, they could, for example, have made reference to the “doctrine of double effect” or to the free will of those who will bring actual harm to users, and therefore their moral responsibility. It seems that a key immoral aspect of this work was that the authors forgot to write that confusing section.

Finally, we should dispel, in conversations about research ethics, the myth that morality equals legality. The public review mentions “informed consent”, but in fact this is an extremely difficult notion — and legalistically it has been used to justify terrible things. The data protection variant of informed consent allows large internet companies, and telcos, to basically scoop up most users’ data because of some small print in lengthy terms and conditions. In fact it should probably be our responsibility to highlight the immorality of this state of affairs, before writing public reviews about the immorality of a hypothetical censorship detection system.

Thus, I would argue, if one is to make an ethical point relating to the values and risks of technology, they have to make it in the larger context of how technology is fielded and used, the politics around it, who has power, who makes the money, who does the torturing and the killing, and why. Technology lives within a big moral picture that a research community has a responsibility to comment on. Focusing moral attention on the microcosm of a specific hypothetical use case — just because it is the closest to our research community — misses the point, silently perpetuating a terrible state of moral affairs….(More)”

Innovative Study Supports Asteroid Initiative, Journey To Mars


David Steitz at NASA: “Innovation is a primary tool for problem solving at NASA. Whether creating new robotic spacecraft to explore asteroids or developing space habitats for our journey to Mars, innovative thinking is key to our success. NASA leads the federal government in cutting-edge methods for conceptualizing and then executing America’s space exploration goals.

One example of NASA innovation is the agency’s work with the Expert and Citizen Assessment of Science and Technology (ECAST) Network. The ECAST group provided a citizen-focused, participatory technology assessment of NASA’s Asteroid Initiative, increasing public understanding of and engagement in the initiative while also providing the agency with new knowledge for use in planning our future missions.

“Participatory Exploration includes public engagement as we chart the course for future NASA activities, ranging from planetary defense to boots on Mars,” said Jason Kessler, program executive for NASA’s Asteroid Grand Challenge within the Office of the Chief Technologist at NASA Headquarters in Washington. “The innovative methodology for public engagement that the ECAST has given us opens new avenues for dialog directly with stakeholders across the nation, Americans who have and want to share their ideas with NASA on activities the agency is executing, now and in the future.”

In addition to formal “requests for information” or forums with industry for ideas, NASA employed ECAST to engage in a “participatory technology assessment,” an engagement model that seeks to improve the outcomes of science and technology decision-making through dialog with informed citizens. Participatory technology assessment involves engaging a group of non-experts who are representative of the general population but who—unlike political, academic, and industry stakeholders—are often underrepresented in technology-related policymaking….(More)”

Inside the Nudge Unit: How small changes can make a big difference


Book by David Halpern: “Every day we make countless decisions, from the small, mundane things to tackling life’s big questions, but we don’t always make the right choices.

Behavioural scientist Dr David Halpern heads up Number 10’s ‘Nudge Unit’, the world’s first government institution that uses behavioural economics to examine and influence human behaviour, to ‘nudge’ us into making better decisions. Seemingly small and subtle solutions have led to huge improvements across tax, healthcare, pensions, employment, crime reduction, energy conservation and economic growth.

Adding a crucial line to a tax reminder brought forward millions in extra revenue; refocusing the questions asked at the job centre helped an extra 10 per cent of people come off their benefits and back into work; prompting people to become organ donors while paying for their car tax added an extra 100,000 donors to the register in a single year.

After two years and dozens of experiments in behavioural science, the results are undeniable. And now David Halpern and the Nudge Unit will help you to make better choices and improve your life…(More)”

Beyond the Jailhouse Cell: How Data Can Inform Fairer Justice Policies


Alexis Farmer at DataDrivenDetroit: “Government-provided open data is a value-added approach to providing transparency, analytic insights for government efficiency, innovative solutions for products and services, and increased civic participation. Two of the least transparent public institutions are jails and prisons. The majority of the population has limited knowledge about jail and prison operations and the demographics of the jail and prison population, even though the costs of incarceration are substantial. The absence of public knowledge about one of the many establishments public tax dollars support can be resolved with an open data approach to criminal justice. Increasing access to administrative jail information enables communities to collectively and effectively find solutions to the challenges the system faces….

The data analysis that complements open data practices is a part of the formula for creating transformational policies. There are numerous ways that recording and publishing data about jail operations can inform better policies and practices:

1. Better budgeting and allocation of funds. By monitoring the rate at which dollars are expended for a specific function, data allows administrations to make accurate estimates of future expenditures.

2. More effective deployment of staff. Knowing the average daily population and annual average bookings can help inform staffing decisions, including the total number of officers needed, shift responsibilities, and room arrangements. The population information also helps with facility planning, reducing overcrowding, controlling violence within the facility, staffing, determining appropriate programs and services, and policy and procedure development.

3. Program participation and effectiveness. Gauging the number of inmates involved in jail work programs, educational training services, rehabilitation/detox programs, and the like is critical to evaluating methods to improve and expand such services. Quantifying the participation in and effectiveness of these programs can potentially lead to a shift in jail rehabilitation services.

4. Jail suicides. “The rate of jail suicides is about three times the rate of prison suicides.” Jails are isolating spaces that separate inmates from social support networks, diminish personal control, and often lack mental health resources. Most people in jail face minor charges and spend less time incarcerated due to shorter sentences. Reviewing the previous jail suicide statistics aids in pinpointing suicide risk, identifying high-risk groups, and ultimately, prescribing intervention procedures and best practices to end jail suicides.

5. Gender and race inequities. It is well known that Black men are disproportionately incarcerated, and the number of Black women in jails and prisons has rapidly increased. It is important to view this disparity as it relates to the demographics of the total population of an area. Providing data that show trends in particular crimes committed by race and gender might lead to further analysis of, and policy changes addressing, the root causes of these crimes (poverty, employment, education, housing, etc.).

6. Prior interaction with the juvenile justice system. The school-to-prison pipeline describes the systematic school discipline policies that increase a student’s interaction with the juvenile justice system. Knowing how many incarcerated persons were suspended, expelled, or incarcerated as juveniles can encourage schools to examine their discipline policies and institute more restorative justice programs for students. It would also encourage transitional programs for formerly incarcerated youth in order to decrease the recidivism rate among young people.

7. Sentencing reforms. Evaluating the charges on which a person is arrested, the length of stay, the average length of sentences, the charges for which sentences are given, and the length of time from first appearance to arraignment and trial disposition can inform more just and balanced sentencing laws enforced by the judicial branch (as sketched below)….(More)”
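To make the kinds of metrics listed above concrete, here is a minimal, hypothetical Python/pandas sketch of how a published booking log could be turned into a few of the figures the article mentions (average length of stay, average daily population). The file name and column names are assumptions for illustration only, not drawn from the article or any real jail data set:

```python
import pandas as pd

# Hypothetical booking log export; the file and column names are
# illustrative only, not taken from any real jail data set.
bookings = pd.read_csv(
    "jail_bookings.csv",
    parse_dates=["booking_date", "release_date"],
)

# Length of stay (in days) for each completed booking.
completed = bookings.dropna(subset=["release_date"]).copy()
completed["length_of_stay"] = (
    completed["release_date"] - completed["booking_date"]
).dt.days

# Average length of stay overall and by charge category (cf. item 7).
print(completed["length_of_stay"].mean())
print(completed.groupby("charge_category")["length_of_stay"].mean())

# Rough average daily population for one year (cf. item 2):
# total inmate-days served divided by the number of days in the year.
year_2014 = completed[completed["booking_date"].dt.year == 2014]
print(year_2014["length_of_stay"].sum() / 365)
```

Even a sketch this simple shows why consistent, machine-readable publication of booking records matters: the policy questions in the list reduce to straightforward aggregations once the underlying data are open.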

Customer-Driven Government


Jane Wiseman at DataSmart City Solutions: “Public trust in government is low — of the 43 industries tracked in the American Customer Satisfaction Index, only one ranks lower than the federal government in satisfaction levels.  Local government ranks a bit higher than the federal government, but for most of the public, that makes little difference. It’s time for government to change that perception by listening to its customers and improving service delivery.

What can the cup holder in your car teach government about customer engagement? A cup holder would be hard to live without — it keeps a latte from spilling and has room for keys and a phone. But the cup holder was not always such a multi-tasker. The first ones were shallow indentations in the plastic on the inside of the glove box. Accelerate and the drinks went flying. Did a brilliant automotive engineer decide that was a design flaw and fix it? No. It was only when Chrysler received more complaints about the cup holder than about anything else in their cars that they were forced to innovate. Don Clark, a DaimlerChrysler engineer known as the “Cup Holder King,” designed the first of the modern cup holders, debuting in the company’s 1984 minivans. The engineers thought they knew what their customers wanted (more powerful engines, better fuel economy, safety features), but it wasn’t until they listened to customers’ comments that they put in the cup holder. And sales took off.

Today, we’re awash in customer feedback, seemingly everywhere but government.  Over the past decade, customer feedback ratings for products and services have shown up everywhere — whether in a review on Yelp, a “like” on Facebook, or a Tweet about the virtues or shortcomings of a product or service.  Ratings help draw attention to poor quality and allow companies to address these gaps.  Many companies routinely follow up a customer interaction with a satisfaction survey.  This data drives improvement efforts aimed at keeping customers happy.  Some companies aggressively manage their online reviews, seeking to increase their NPS, or net promoter score.  Many people really like to provide feedback — there are 77 million reviews on Yelp to date, according to the company.  Imagine the power of that many reviews of government service.
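For readers unfamiliar with the metric, a net promoter score is conventionally computed from a 0–10 “how likely are you to recommend us” survey question. The short sketch below is an editorial illustration of that standard arithmetic, not something taken from the article; the sample ratings are hypothetical:

```python
def net_promoter_score(ratings):
    """Standard NPS arithmetic: promoters score 9-10, detractors 0-6.

    The score is the percentage of promoters minus the percentage of
    detractors, so it ranges from -100 to +100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical follow-up survey after a 311 service request:
print(net_promoter_score([10, 9, 8, 7, 3, 10, 6, 9]))  # -> 25.0
```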

If customer input can influence the automotive industry, and can help consumers make better decisions, what if we turned this energy toward government?  After all, the government is run “by the people” and “for the people” — what if citizens gave government real-time guidance on improving services?  And could leaders in government ask customers what they want, instead of presuming to know?  This paper explores these questions and suggests a way forward.

….

If I were a mayor, how would I begin harnessing customer feedback to improve service delivery?  I would build a foundation for improving core city operations (trash pickup, pothole fixing, etc.) by using the same three questions Kansas City uses for follow-up surveys to all who contact 311.  Upon that foundation I would layer additional outreach on a tactical, ad hoc basis.  I would experiment with the growing body of tools for engaging the public in shaping tactical decisions, such as how to allocate capital projects and where to locate bike share hubs.

To get an even deeper insight into the customer experience, I might copy what Somerville, MA has done with its Secret Resident program.  Trained volunteers assess the efficiency, courtesy, and ease of use of selected city departments.  The volunteers transact typical city services by phone or in person, and then document their customer experience.  They rate the agencies, and the 311 call center, and provide assessments that can help improve customer service.

By listening to and leveraging data on constituent calls for service, government can move from a culture of reaction to a proactive culture of listening and learning from the data provided by the public.  Engaging the public, and following through on the suggestions they give, can increase not only the quality of government service, but the faith of the public that government can listen and respond.

Every leader in government should commit to getting feedback from customers — it’s the only way to know how to increase their satisfaction with the services.  There is no more urgent time to improve the customer experience…(More)

Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots


Book description: “Robots are poised to transform today’s society as completely as the Internet did twenty years ago. Pulitzer Prize-winning New York Times science writer John Markoff argues that we must decide to design ourselves into our future, or risk being excluded from it altogether.

In the past decade, Google introduced us to driverless cars; Apple debuted Siri, a personal assistant that we keep in our pockets; and an Internet of Things connected the smaller tasks of everyday life to the farthest reaches of the Web. Robots have become an integral part of society on the battlefield and the road; in business, education, and health care. Cheap sensors and powerful computers will ensure that in the coming years, these robots will act on their own. This new era offers the promise of immensely powerful machines, but it also reframes a question first raised more than half a century ago, when the intelligent machine was born. Will we control these systems, or will they control us?

In Machines of Loving Grace, John Markoff offers a sweeping history of the complicated and evolving relationship between humans and computers. In recent years, the pace of technological change has accelerated dramatically, posing an ethical quandary. If humans delegate decisions to machines, who will be responsible for the consequences? As Markoff chronicles the history of automation, from the birth of the artificial intelligence and intelligence augmentation communities in the 1950s and 1960s, to the modern-day brain trusts at Google and Apple in Silicon Valley, and on to the expanding robotics economy around Boston, he traces the different ways developers have addressed this fundamental problem and urges them to carefully consider the consequences of their work. We are on the brink of the next stage of the computer revolution, Markoff argues, and robots will profoundly transform modern life. Yet it remains for us to determine whether this new world will be a utopia. Moreover, it is now incumbent upon the designers of these robots to draw a bright line between what is human and what is machine.

After nearly forty years covering the tech industry, Markoff offers an unmatched perspective on the most drastic technology-driven societal shifts since the introduction of the Internet. Machines of Loving Grace draws on an extensive array of research and interviews to present an eye-opening history of one of the most pressing questions of our time, and urges us to remember that we still have the opportunity to design ourselves into the future—before it’s too late….(More)”

2015 Hype Cycle for Emerging Technologies


Gartner: “The journey to digital business continues as the key theme of Gartner, Inc.’s “Hype Cycle for Emerging Technologies, 2015.” New to the Hype Cycle this year is the emergence of technologies that support what Gartner defines as digital humanism — the notion that people are the central focus in the manifestation of digital businesses and digital workplaces.

The Hype Cycle for Emerging Technologies report is the longest-running annual Hype Cycle, providing a cross-industry perspective on the technologies and trends that business strategists, chief innovation officers, R&D leaders, entrepreneurs, global market developers and emerging-technology teams should consider in developing emerging-technology portfolios.

“The Hype Cycle for Emerging Technologies is the broadest aggregate Gartner Hype Cycle, featuring technologies that are the focus of attention because of particularly high levels of interest, and those that Gartner believes have the potential for significant impact,” said Betsy Burton, vice president and distinguished analyst at Gartner. “This year, we encourage CIOs and other IT leaders to dedicate time and energy focused on innovation, rather than just incremental business advancement, while also gaining inspiration by scanning beyond the bounds of their industry.”

Major changes in the 2015 Hype Cycle for Emerging Technologies (see Figure 1) include the placement of autonomous vehicles, which have shifted from pre-peak to peak of the Hype Cycle. While autonomous vehicles are still embryonic, this movement still represents a significant advancement, with all major automotive companies putting autonomous vehicles on their near-term roadmaps. Similarly, the growing momentum (from post-trigger to pre-peak) in connected-home solutions has introduced entirely new solutions and platforms enabled by new technology providers and existing manufacturers.

Figure 1. Hype Cycle for Emerging Technologies, 2015 (Source: Gartner, August 2015)

“As enterprises continue the journey to becoming digital businesses, identifying and employing the right technologies at the right time will be critical,” said Ms. Burton. “As we have set out on the Gartner roadmap to digital business, there are six progressive business era models that enterprises can identify with today and to which they can aspire in the future….(More)”

Anonymization and Risk


Paper by Ira Rubinstein and Woodrow Hartzog: “Perfect anonymization of data sets has failed. But the process of protecting data subjects in shared information remains integral to privacy practice and policy. While the deidentification debate has been vigorous and productive, there is no clear direction for policy. As a result, the law has been slow to adopt a holistic approach to protecting data subjects when data sets are released to others. Currently, the law is focused on whether an individual can be identified within a given set. We argue that the better locus of data release policy is on the process of minimizing the risk of reidentification and sensitive attribute disclosure. Process-based data release policy, which resembles the law of data security, will help us move past the limitations of focusing on whether data sets have been “anonymized.” It draws upon different tactics to protect the privacy of data subjects, including accurate deidentification rhetoric, contracts prohibiting reidentification and sensitive attribute disclosure, data enclaves, and query-based strategies to match required protections with the level of risk. By focusing on process, data release policy can better balance privacy and utility where nearly all data exchanges carry some risk….(More)”
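As one concrete illustration of the “query-based strategies” the abstract mentions, a data holder can answer aggregate questions with calibrated noise instead of releasing raw records. The sketch below uses a differential-privacy-style Laplace mechanism purely as an example of such a strategy; it is an editorial illustration under that assumption, not something drawn from the paper, and the data set shown is hypothetical:

```python
import numpy as np

def noisy_count(records, predicate, epsilon=0.5, rng=None):
    """Answer "how many records satisfy predicate?" with Laplace noise.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the true count by at most 1), so Laplace noise with
    scale 1/epsilon yields an epsilon-differentially-private answer.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for record in records if predicate(record))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical shared data set: (age, diagnosis) pairs.
records = [(34, "flu"), (61, "diabetes"), (45, "flu"), (29, "asthma")]
print(noisy_count(records, lambda r: r[1] == "flu"))  # about 2, plus noise
```

The point of such mechanisms, in the paper’s terms, is that protection attaches to the release process (how queries are answered and with how much noise) rather than to a one-time claim that a data set has been “anonymized.”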

How Africa can benefit from the data revolution


In The Guardian: “….The modern information infrastructure is about movement of data. From data we derive information and knowledge, and that knowledge can be propagated rapidly across the country and throughout the world. Facebook and Google have both made massive investments in machine learning, the mainstay technology for converting data into knowledge. But the potential for these technologies in Africa is much larger: instead of simply advertising products to people, we can imagine modern distributed health systems, distributed markets, knowledge systems for disease intervention. The modern infrastructure should be data-driven and deployed across the mobile network. A single good idea can then be rapidly implemented and distributed via the mobile phone app ecosystems.

The information infrastructure does not require large-scale thinking and investment to deliver. In fact, it requires just the reverse: agility and innovation. Larger companies cannot react quickly enough to exploit technological advances. Small companies with a good idea can grow quickly: from IBM to Microsoft, Google, and now Facebook. All these companies now agree on one thing: data is where the value lies. Modern internet companies are data-driven from the ground up. Could the same thing happen in Africa’s economies? Can entire countries reformulate their infrastructures to be data-driven from the ground up?

Maybe, or maybe not, but it isn’t necessary to have a grand plan to give it a go. It is already natural to use data and communication to solve real-world problems. In Silicon Valley these are the challenges of getting a taxi or reserving a restaurant. In Africa they are often more fundamental. John Quinn has been at Makerere University in Kampala, Uganda, for eight years now, targeting these challenges. In June this year, John and other researchers from across the region came together for Africa’s first workshop on data science at Dedan Kimathi University of Technology. The objective was to spread knowledge of technologies, ideas and solutions. For the modern information infrastructure to be successful, software solutions need to be locally generated: African apps to solve African problems. With this in mind the workshop began with a three-day summer school on data science, which was then followed by two days of talks on challenges in African data science.

The ideas and solutions presented were cutting edge. The Umati project uses social media to understand the use of ethnic hate speech in Kenya (Sidney Ochieng, iHub, Nairobi). The use of social media for monitoring the evolution and effects of Ebola in West Africa (Nuri Pashwani, IBM Research Africa). The Kudu system for market making in Ugandan farm produce distribution via SMS messages (Kenneth Bwire, Makerere University, Kampala). Telecommunications data for inferring the source and spread of a typhoid outbreak in Kampala (UN Pulse Lab, Kampala). The Punya system for prototyping and deployment of mobile phone apps to deal with emerging crises or market opportunities (Julius Adebayor, MIT), and large-scale systems for collating and sharing data resources, such as Open Data Kenya and the UN OCHA Human Data Exchange….(More)”