Proofreading of legal documents


At TechCrunch: “…jEugene… helps the drafters of legal documents catch mistakes that could be fatal to such documents’ validity or enforceability.

Originally the idea of Harry Zhou, a first-year lawyer who was tasked with proofing a 250-page contract and wanted more than his supervising lawyer’s assurance that “you did great,” jEugene scans a legal document and highlights potential drafting mistakes directly in the text.

The product is being used by White & Case LLP and is undergoing trial at Fenwick & West LLP. Tens of smaller law firms are accessing jEugene through Clio, a provider of cloud-based legal management software. Other clients are under NDA.

The errors jEugene currently detects may seem innocuous at times, but they can lead to hefty costs. For example, the company says that the loss of millions of dollars that certain creditors recently failed to recover in a famous bankruptcy case could have been avoided had jEugene been used, and that its analysis of legal documents disclosed on SEC EDGAR routinely reveals similar errors missed by some of the most sophisticated law firms.

Here’s how it works: A user uploads a document, waits a few seconds, and downloads the resulting file, which emulates the handwritten markups lawyers are used to seeing and highlights potential drafting mistakes in different colors. The user then reviews the results to determine whether any revision is necessary….(More)”
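jEugene’s actual analysis is proprietary, so as a rough illustration of the kind of scan described above, here is a minimal sketch that checks a contract for one common class of drafting error: a cross-reference to a section that is never defined. The heading and reference patterns, and the find_broken_cross_references helper, are assumptions for illustration only, not jEugene’s implementation.

```python
import re


def find_broken_cross_references(text: str) -> list[str]:
    """Flag references to sections that never appear as headings in the text.

    A toy heuristic for one class of drafting error that tools like jEugene
    reportedly catch; the patterns below are illustrative assumptions.
    """
    # Section headings, assumed to look like "Section 4.2." at the start of a line
    defined = set(re.findall(r"(?m)^Section\s+(\d+(?:\.\d+)*)\.", text))
    # In-text references such as "as set out in Section 4.3"
    referenced = re.findall(r"\bSection\s+(\d+(?:\.\d+)*)\b", text)
    return [f"Section {ref}" for ref in referenced if ref not in defined]


if __name__ == "__main__":
    contract = (
        "Section 1. Definitions.\n"
        "Section 2. Payment. The Purchase Price is payable as set out in Section 3.\n"
    )
    for issue in find_broken_cross_references(contract):
        print(f"Possible broken cross-reference: {issue}")
```

On this toy contract the script flags Section 3, which is referenced but never drafted; a real tool would layer many such checks (undefined terms, inconsistent numbering, orphaned definitions) over a proper document parser.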

White House debuts open source tool for visualizing government work across the country


Wired: “Data is immensely powerful. The trick lies in organizing the stuff. The good news is that so many organizations are now offering tools that help with this—and so many of these tools are open source.

The White House is among the many who are tapping into this trend. Today, the administration revealed a new tool meant to help anyone visualize government work across the country. Built in partnership with more than 15 Federal agencies, it’s basically a huge map of the country—with data layers you can select or deselect—that lets you see where certain community-based initiatives are gaining ground.

“This new approach focuses on the direction that cities and small towns want to go rather than the laundry list of programs the government has,” a representative from the White House Office of Management and Budget tells WIRED.

The initiatives include My Brother’s Keeper, a program designed to help residents succeed in education, in their careers, and beyond; Climate Action Champions, a program that helps local leaders address climate change; and Promise Zones, which aims to increase economic security and expand educational opportunities within communities. The map also includes demographic information, such as US Census data on counties of persistent poverty and Harvard data on upward economic mobility by county.

“From the start, [the map] has been built in the open, and source code is available on GitHub,” the White House says, inviting data enthusiasts to make use of the map—which the administration also promised would get the benefit of regular data updates.”
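The White House tool runs on its own open source stack (the code is on GitHub), but the basic pattern of a national map with data layers a user can select or deselect can be sketched in a few lines with a mapping library such as folium. The initiative names and locations below are placeholders for illustration, not the administration’s data.

```python
import folium

# Base map roughly centered on the continental United States.
m = folium.Map(location=[39.8, -98.6], zoom_start=4)

# Each community-based initiative becomes a toggleable overlay layer.
# The locations below are placeholders for illustration only.
layers = {
    "Promise Zones": [("San Antonio, TX", 29.42, -98.49), ("Philadelphia, PA", 39.95, -75.17)],
    "Climate Action Champions": [("Boston, MA", 42.36, -71.06), ("Salt Lake City, UT", 40.76, -111.89)],
}
for initiative, places in layers.items():
    group = folium.FeatureGroup(name=initiative, show=False)
    for label, lat, lon in places:
        folium.Marker([lat, lon], popup=f"{initiative}: {label}").add_to(group)
    group.add_to(m)

# The layer control is what lets users select or deselect data layers.
folium.LayerControl(collapsed=False).add_to(m)
m.save("community_map.html")
```

Saving the map produces a self-contained HTML page with checkboxes for each layer, which is essentially the interaction model the White House map offers at much larger scale.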

Peacekeepers in the Sky: The Use of Unmanned Unarmed Aerial Vehicles for Peacekeeping


“The ICT4Peace Foundation is pleased to release ‘Peacekeepers in the Sky: The Use of Unmanned Unarmed Aerial Vehicles for Peacekeeping’, authored by Helena Puig Larrauri and Patrick Meier.

As recently noted by Hervé Ladsous, the Under-Secretary-General for Peacekeeping Operations, the United Nations “cannot continue just using tools of 50 or 100 years ago.” The United Nations Department for Peacekeeping Operations (DPKO) is thus on course to create a “force for the future” by adopting and making increasing use of new technologies like Unmanned Unarmed Aerial Vehicles (UUAVs). These remotely piloted aircraft systems, which are becoming steadily cheaper, smarter and more robust, aim to provide peacekeeping missions with greater surveillance capabilities and thus more timely and enhanced situational awareness. This is expected to render peacekeeping missions more effective and cost-efficient in terms of keeping the peace and protecting civilians. According to DPKO’s vision, UUAVs “represent a new way of ‘seeing and knowing’ in peacekeeping and can dramatically improve peacekeepers’ access to information.” One strong proponent of UUAVs claims that they are a “major step forward towards much more discriminating use of violence in war and self-defense – a step forward in humanitarian technology.”

Yet the use of UUAVs is complicated by a number of issues related to perceptions, politics, ethics and empowerment. The use of surveillance technologies by the UN may at times be politically unpopular among those UN Member States that fear technologies like UUAVs will inevitably compromise their territorial and political sovereignty. In fact, arguments against the use of UUAVs sometimes resemble arguments against the “Responsibility to Protect” norm adopted by the UN General Assembly in 2005 as a framework for justifying military intervention as a last resort to protect civilians from mass atrocities. This admittedly narrow set of concerns is not always at issue; such concerns can vary greatly depending on the context of a given peacekeeping operation.

Download the full report here.”

Enabling the Data Revolution: An International Open Data Roadmap


IODC 2015: “The 3rd International Open Data Conference held in Ottawa, Canada May 28-29, 2015 was a great success. With over 1000 participants, 58 panels and workshops, ten parallel tracks, over 200 speakers, and more than 29 fringe events over 9 days, IODC was truly a global gathering.

We are excited to introduce the 3rd International Open Data Conference Final Report titled “Enabling the Data Revolution: An International Open Data Roadmap”.

This report draws upon the many discussions that took place in Ottawa at IODC, summarizing key topics and debates and offering a shared vision of the road ahead for the IODC community. It is designed not as a single statement on open data, but rather as a curated record of discussions and debates, providing a snapshot of key issues and setting out a path forward based on the visions, ideas, and agreements explored at IODC.”

Dissecting the Spirit of Gezi: Influence vs. Selection in the Occupy Gezi Movement


New study by Ceren Budak and Duncan J. Watts in Sociological Science: “Do social movements actively shape the opinions and attitudes of participants by bringing together diverse groups that subsequently influence one another? Ethnographic studies of the 2013 Gezi uprising seem to answer “yes,” pointing to solidarity among groups that were traditionally indifferent, or even hostile, to one another. We argue that two mechanisms with differing implications may generate this observed outcome: “influence” (change in attitude caused by interacting with other participants); and “selection” (individuals who participated in the movement were generally more supportive of other groups beforehand).

We tease out the relative importance of these mechanisms by constructing a panel of over 30,000 Twitter users and analyzing their support for the main Turkish opposition parties before, during, and after the movement. We find that although individuals changed in significant ways, becoming in general more supportive of the other opposition parties, those who participated in the movement were also significantly more supportive of the other parties all along. These findings suggest that both mechanisms were important, but that selection dominated. In addition to our substantive findings, our paper also makes a methodological contribution that we believe could be useful to studies of social movements and mass opinion change more generally. In contrast with traditional panel studies, which must be designed and implemented prior to the event of interest, our method relies on ex post panel construction, and hence can be used to study unanticipated or otherwise inaccessible events. We conclude that despite the well known limitations of social media, their “always on” nature and their widespread availability offer an important source of public opinion data….(More)”
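As a rough sketch of what ex post panel construction looks like in practice, the snippet below builds a before/during/after panel from timestamped tweets, keeps only users observed in all three periods, and compares a crude support measure across periods. The column names, period boundaries, and keyword-based support proxy are illustrative assumptions, not the authors’ actual procedure.

```python
import pandas as pd

# Toy data standing in for an archive of timestamped tweets (user_id, timestamp, text).
tweets = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 2, 3, 3],
    "timestamp": pd.to_datetime([
        "2013-04-10", "2013-06-05", "2013-08-20",
        "2013-04-15", "2013-06-10", "2013-08-25",
        "2013-04-20", "2013-08-30",
    ]),
    "text": ["...", "#direngezi", "CHP all the way", "...", "CHP", "...", "...", "BDP"],
})


# Periods around the June 2013 Gezi protests (boundaries are illustrative).
def period(ts: pd.Timestamp) -> str:
    if ts < pd.Timestamp("2013-05-28"):
        return "before"
    if ts <= pd.Timestamp("2013-06-30"):
        return "during"
    return "after"


tweets["period"] = tweets["timestamp"].map(period)

# Ex post panel construction: keep only users observed in all three periods.
periods_per_user = tweets.groupby("user_id")["period"].nunique()
panel_users = periods_per_user[periods_per_user == 3].index
panel = tweets[tweets["user_id"].isin(panel_users)].copy()

# Crude keyword proxy for support of opposition parties (illustrative only).
panel["supports_opposition"] = panel["text"].str.contains("CHP|BDP|MHP", case=False)
print(panel.groupby("period")["supports_opposition"].mean())
```

The key point is that the panel is assembled after the event from the “always on” archive, rather than being recruited in advance as in a traditional panel survey; the actual study then measures party support far more carefully than a keyword match.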

The Merit Principle in Crisis


Commentary in Governance: “In the United States, the presidential race is heating up, and one result is an increasing number of assaults on century-old ideas about the merit-based civil service.  “The merit principle is under fierce attack,” says Donald Kettl, in a new commentary for Governance.  Kettl outlines five “tough questions” that are raised by attacks on the civil service system — and says that the US research community “has been largely asleep at the switch” on all of them.  Within major public policy schools, courses on the public service have been “pushed to the side.”  A century ago, American academics helped to build the American state.  Kettl warns that “scholarly neglect in the 2000s could undermine it.”  Read the commentary.

The Art of Managing Complex Collaborations


Eric Knight, Joel Cutcher-Gershenfeld, and Barbara Mittleman at MIT Sloan Management Review: “It’s not easy for stakeholders with widely varying interests to collaborate effectively in a consortium. The experience of the Biomarkers Consortium offers five lessons on how to successfully navigate the challenges that arise….

Society’s biggest challenges are also its most complex. From shared economic growth to personalized medicine to global climate change, few of our most pressing problems are likely to have simple solutions. Perhaps the only way to make progress on these and other challenges is by bringing together the important stakeholders on a given issue to pursue common interests and resolve points of conflict.

However, it is not easy to assemble such groups or to keep them together. Many initiatives have stumbled and disbanded. The Biomarkers Consortium might have been one of them, but this consortium beat the odds, in large part due to the founding parties’ determination to make it work. Nine years after it was founded, this public-private partnership, which is managed by the Foundation for the National Institutes of Health and based in Bethesda, Maryland, is still working to advance the availability of biomarkers (biological indicators for disease states) as tools for drug development, including applications at the frontiers of personalized medicine.

The Biomarkers Consortium’s mandate — to bring together, in the group’s words, “the expertise and resources of various partners to rapidly identify, develop, and qualify potential high-impact biomarkers particularly to enable improvements in drug development, clinical care, and regulatory decision-making” — may look simple. However, the reality has been quite complex. The negotiations that led to the consortium’s formation in 2006 were complicated, and the subsequent balancing of common and competing interests remains challenging….

Many in the biomedical sector had seen the need to tackle drug discovery costs for a long time, with multiple companies concurrently spending millions, sometimes billions, of dollars only to hit common dead ends in the drug development process. In 2004 and 2005, then National Institutes of Health director Elias Zerhouni convened key people from the U.S. Food and Drug Administration, the NIH, and the Pharmaceutical Research and Manufacturers of America to create a multistakeholder forum.

Every member knew from the outset that their fellow stakeholders represented many divergent and sometimes opposing interests: large pharmaceutical companies, smaller entrepreneurial biotechnology companies, FDA regulators, NIH science and policy experts, university researchers and nonprofit patient advocacy organizations….(More)”

On the morals of network research and beyond


Conspicuous Chatter: “…Discussions of ethics have become very popular in computer science lately — and to some extent I am glad about this. However, I think we should dispel three key fallacies.

The first one is that things we do not like (some may brand them “immoral”) happen because others do not think of the moral implications of their actions. In fact it is entirely possible that they do, and decide to act in a manner we do not like nonetheless. This could be out of conviction: those who built the surveillance equipment, those who argue against strong encryption, and those who do the torture and the killing (harm) may have entirely self-righteous ways of justifying their actions to themselves and others. Others may simply be making a good buck — and there are plenty of examples of this in the links above.

The second fallacy is that ethics, and research ethics more specifically, comes down to a “common sense” variant of “do no harm” — and that is that. In fact ethics, as a philosophical discipline, is extremely deep, and there are plenty of entirely legitimate ways to argue that doing harm is perfectly fine. If the authors of the paper were a bit more sophisticated in their philosophy they could, for example, have made reference to the “doctrine of double effect” or to the nature of the free will of those who would bring actual harm to users, and therefore to their moral responsibility. It seems that a key immoral aspect of this work was that the authors forgot to write that confusing section.

Finally, in conversations about research ethics we should dispel the myth that morality equals legality. The public review mentions “informed consent”, but in fact this is an extremely difficult notion — and legalistically it has been used to justify terrible things. The data protection variant of informed consent allows large internet companies and telcos to scoop up most users’ data because of some small print in lengthy terms and conditions. In fact it should probably be our responsibility to highlight the immorality of this state of affairs before writing public reviews about the immorality of a hypothetical censorship detection system.

Thus, I would argue, if one is to make an ethical point about the values and risks of technology, one has to make it in the larger context of how technology is fielded and used, the politics around it, who has power, who makes the money, who does the torturing and the killing, and why. Technology lives within a big moral picture that a research community has a responsibility to comment on. Focusing moral attention on the microcosm of a specific hypothetical use case — just because it is the closest to our research community — misses the point, silently perpetuating a terrible state of moral affairs….(More)”

Innovative Study Supports Asteroid Initiative, Journey To Mars


David Steitz at NASA: “Innovation is a primary tool for problem solving at NASA. Whether creating new robotic spacecraft to explore asteroids or developing space habitats for our journey to Mars, innovative thinking is key to our success. NASA leads the federal government in cutting edge methods for conceptualizing and then executing America’s space exploration goals.

One example of NASA innovation is the agency’s work with the Expert and Citizen Assessment of Science and Technology (ECAST) Network. The ECAST group provided a citizen-focused, participatory technology assessment of NASA’s Asteroid Initiative, increasing public understanding of and engagement in the initiative while also providing the agency with new knowledge for use in planning our future missions.

“Participatory Exploration includes public engagement as we chart the course for future NASA activities, ranging from planetary defense to boots on Mars,” said Jason Kessler, program executive for NASA’s Asteroid Grand Challenge within the Office of the Chief Technologist at NASA Headquarters in Washington. “The innovative methodology for public engagement that the ECAST has given us opens new avenues for dialog directly with stakeholders across the nation, Americans who have and want to share their ideas with NASA on activities the agency is executing, now and in the future.”

In addition to formal “requests for information” or forums with industry for ideas, NASA employed ECAST to engage in a “participatory technology assessment,” an engagement model that seeks to improve the outcomes of science and technology decision-making through dialog with informed citizens. Participatory technology assessment involves engaging a group of non-experts who are representative of the general population but who—unlike political, academic, and industry stakeholders—are often underrepresented in technology-related policymaking….(More)”

Inside the Nudge Unit: How small changes can make a big difference


Book by David Halpern: “Every day we make countless decisions, from the small, mundane things to tackling life’s big questions, but we don’t always make the right choices.

Behavioural scientist Dr David Halpern heads up Number 10’s ‘Nudge Unit’, the world’s first government institution that uses behavioural economics to examine and influence human behaviour, to ‘nudge’ us into making better decisions. Seemingly small and subtle solutions have led to huge improvements across tax, healthcare, pensions, employment, crime reduction, energy conservation and economic growth.

Adding a crucial line to a tax reminder brought forward millions in extra revenue; refocusing the questions asked at the job centre helped an extra 10 per cent of people come off their benefits and back into work; prompting people to become organ donors while paying for their car tax added an extra 100,000 donors to the register in a single year.

After two years and dozens of experiments in behavioural science, the results are undeniable. And now David Halpern and the Nudge Unit will help you to make better choices and improve your life…(More)”
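Behind results like these sit randomized trials comparing a treated group with a control group. As a rough illustration (with made-up numbers, not the Nudge Unit’s data), the effect of a revised tax-reminder letter can be checked with a simple two-proportion z-test:

```python
from math import sqrt
from statistics import NormalDist

# Made-up counts for illustration: recipients who paid within 23 days of the letter.
paid_standard, n_standard = 3380, 10_000   # standard reminder letter
paid_nudged, n_nudged = 3890, 10_000       # letter with an added social-norm line

p1, p2 = paid_standard / n_standard, paid_nudged / n_nudged
pooled = (paid_standard + paid_nudged) / (n_standard + n_nudged)
se = sqrt(pooled * (1 - pooled) * (1 / n_standard + 1 / n_nudged))
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Payment rate: {p1:.1%} (standard) vs {p2:.1%} (nudged)")
print(f"Difference: {p2 - p1:+.1%}, z = {z:.2f}, p = {p_value:.2g}")
```

A few percentage points of extra response, multiplied across millions of letters, is how seemingly small wording changes translate into the large revenue and registration gains the book describes.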