New book by James Boyle & Jennifer Jenkins: “…This book, the first in a series of Duke Open Coursebooks, is available for free download under a Creative Commons license. It can also be purchased in a glossy paperback print edition for $29.99, $130 cheaper than other intellectual property casebooks.
This book is an introduction to intellectual property law, the set of private legal rights that allows individuals and corporations to control intangible creations and marks—from logos to novels to drug formulae—and the exceptions and limitations that define those rights. It focuses on the three main forms of US federal intellectual property—trademark, copyright and patent—but many of the ideas discussed here apply far beyond those legal areas and far beyond the law of the United States.
The book is intended to be a textbook for the basic Intellectual Property class, but because it is an open coursebook, which can be freely edited and customized, it is also suitable for an undergraduate class, or for a business, library studies, communications or other graduate school class. Each chapter contains cases and secondary readings and a set of problems or role-playing exercises involving the material. The problems range from a video of the Napster oral argument to counseling clients about search engines and trademarks, applying the First Amendment to digital rights management and copyright, and commenting on the Supreme Court’s new rulings on gene patents.
Intellectual Property: Law & the Information Society is current as of August 2014. It includes discussions of such issues as the Redskins trademark cancellations, the Google Books case and the America Invents Act. Its illustrations range from graphs showing the growth in patent litigation to comic book images about copyright. The best way to get some sense of its coverage is to download it. In the coming weeks, we will provide a separate fuller webpage with a table of contents and individual downloadable chapters.
The Center has also published an accompanying supplement of statutory and treaty materials that is available for free download and low cost print purchase.”
Google's fact-checking bots build vast knowledge bank
Hal Hodson in the New Scientist: “The search giant is automatically building Knowledge Vault, a massive database that could give us unprecedented access to the world’s facts
GOOGLE is building the largest store of knowledge in human history – and it’s doing so without any human help. Instead, Knowledge Vault autonomously gathers and merges information from across the web into a single base of facts about the world, and the people and objects in it.
The breadth and accuracy of this gathered knowledge is already becoming the foundation of systems that allow robots and smartphones to understand what people ask them. It promises to let Google answer questions like an oracle rather than a search engine, and even to turn a new lens on human history.
Knowledge Vault is a type of “knowledge base” – a system that stores information so that machines as well as people can read it. Where a database deals with numbers, a knowledge base deals with facts. When you type “Where was Madonna born” into Google, for example, the place given is pulled from Google’s existing knowledge base.
This existing base, called Knowledge Graph, relies on crowdsourcing to expand its information. But the firm noticed that growth was stalling; humans could only take it so far. So Google decided it needed to automate the process. It started building the Vault by using an algorithm to automatically pull in information from all over the web, using machine learning to turn the raw data into usable pieces of knowledge.
Knowledge Vault has pulled in 1.6 billion facts to date. Of these, 271 million are rated as “confident facts”, to which Google’s model ascribes a more than 90 per cent chance of being true. It does this by cross-referencing new facts with what it already knows.
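The “confident facts” idea can be illustrated with a toy fusion model (a sketch only; Google's actual probabilistic model is far more elaborate, and the triples, sources, and confidence values below are invented): candidate facts extracted from different pages are grouped as subject-predicate-object triples, per-source confidences are fused, and only triples clearing a 0.9 bar are kept.

```python
# Toy illustration (not Google's actual model): score candidate facts
# extracted from multiple web sources, then keep the "confident facts".
# Each fact is a (subject, predicate, object) triple; each source that
# asserts it contributes an extraction confidence in [0, 1].

from collections import defaultdict

def fuse(confidences):
    """Combine independent source confidences with a noisy-OR:
    the fact is false only if every source is wrong."""
    p_false = 1.0
    for c in confidences:
        p_false *= (1.0 - c)
    return 1.0 - p_false

def confident_facts(extractions, threshold=0.9):
    """extractions: list of ((subj, pred, obj), confidence) pairs."""
    by_triple = defaultdict(list)
    for triple, conf in extractions:
        by_triple[triple].append(conf)
    return {t: fuse(cs) for t, cs in by_triple.items() if fuse(cs) >= threshold}

extractions = [
    (("Madonna", "born_in", "Bay City"), 0.7),
    (("Madonna", "born_in", "Bay City"), 0.8),   # a second source agrees
    (("Madonna", "born_in", "Detroit"), 0.4),    # conflicting, low-confidence
]
print(confident_facts(extractions))
# The two agreeing sources fuse to 1 - 0.3*0.2 = 0.94, above the 0.9 bar;
# the conflicting 0.4 extraction falls below it and is dropped.
```

Cross-referencing against existing knowledge, as the article describes, would then adjust these fused scores using a prior over what the system already believes.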
“It’s a hugely impressive thing that they are pulling off,” says Fabian Suchanek, a data scientist at Télécom ParisTech in France.
Google’s Knowledge Graph is currently bigger than the Knowledge Vault, but it only includes manually integrated sources such as the CIA Factbook.
Knowledge Vault offers Google fast, automatic expansion of its knowledge – and it’s only going to get bigger. As well as the ability to analyse text on a webpage for facts to feed its knowledge base, Google can also peer under the surface of the web, hunting for hidden sources of data such as the figures that feed Amazon product pages, for example.
Tom Austin, a technology analyst at Gartner in Boston, says that the world’s biggest technology companies are racing to build similar vaults. “Google, Microsoft, Facebook, Amazon and IBM are all building them, and they’re tackling these enormous problems that we would never even have thought of trying 10 years ago,” he says.
The potential of a machine system that has the whole of human knowledge at its fingertips is huge. One of the first applications will be virtual personal assistants that go way beyond what Siri and Google Now are capable of, says Austin…”
Our future government will work more like Amazon
Michael Case in The Verge: “There is a lot of government in the United States. Several hundred federal agencies, 535 voting members in two houses of Congress, more than 90,000 state and local governments, and over 20 million Americans involved in public service.
But if the government is ever going to completely retool itself to provide sensible services to a growing, aging, diversifying American population, it will have to do more than bring in a couple innovators and throw data at the public. At the federal level, these kinds of adjustments will require new laws to change the way money is allocated to executive branch agencies so they can coordinate the purchase and development of a standard set of tools. State and local governments will have to agree on standard tools and data formats as well so that the mayor of Anchorage can collaborate with the governor of Delaware.
Technology is the answer to a lot of American government’s current operational shortcomings. Not only are the tools and systems most public servants use outdated and suboptimal, but the organizations and processes themselves have also calcified around similarly out-of-date thinking. So the real challenge won’t be designing cutting edge software or high tech government facilities — it’s going to be conjuring the will to overcome decades of old thinking. It’s going to be convincing over 90,000 employees to learn new skills, coaxing a bitterly divided Congress to collaborate on something scary, and finding a way to convince a timid and distracted White House to put its name on risky investments that won’t show benefits for many years.
But! If we can figure out a way for governments across the country to perform their basic functions and provide often life-saving services, maybe we can move on to chase even more elusive government tech unicorns. Imagine voting from your smartphone, having your taxes calculated and filed automatically with a few online confirmations, or filing for your retirement at a friendly tablet kiosk at your local government outpost. Government could — feasibly — be not only more effective, but also a pleasure to interact with someday. Someday.”
America in Decay
Francis Fukuyama in Foreign Affairs: “…Institutions are “stable, valued, recurring patterns of behaviour”, as Huntington put it, the most important function of which is to facilitate collective action. Without some set of clear and relatively stable rules, human beings would have to renegotiate their interactions at every turn. Such rules are often culturally determined and vary across different societies and eras, but the capacity to create and adhere to them is genetically hard-wired into the human brain. A natural tendency to conformism helps give institutions inertia and is what has allowed human societies to achieve levels of social cooperation unmatched by any other animal species.
The very stability of institutions, however, is also the source of political decay. Institutions are created to meet the demands of specific circumstances, but then circumstances change and institutions fail to adapt. One reason is cognitive: people develop mental models of how the world works and tend to stick to them, even in the face of contradictory evidence. Another reason is group interest: institutions create favored classes of insiders who develop a stake in the status quo and resist pressures to reform.
In theory, democracy, and particularly the Madisonian version of democracy that was enshrined in the US Constitution, should mitigate the problem of such insider capture by preventing the emergence of a dominant faction or elite that can use its political power to tyrannize over the country. It does so by spreading power among a series of competing branches of government and allowing for competition among different interests across a large and diverse country.
But Madisonian democracy frequently fails to perform as advertised. Elite insiders typically have superior access to power and information, which they use to protect their interests. Ordinary voters will not get angry at a corrupt politician if they don’t know that money is being stolen in the first place. Cognitive rigidities or beliefs may also prevent social groups from mobilizing in their own interests. For example, in the United States, many working-class voters support candidates promising to lower taxes on the wealthy, despite the fact that such tax cuts will arguably deprive them of important government services.
Furthermore, different groups have different abilities to organize to defend their interests. Sugar producers and corn growers are geographically concentrated and focused on the prices of their products, unlike ordinary consumers or taxpayers, who are dispersed and for whom the prices of these commodities are only a small part of their budgets. Given institutional rules that often favor special interests (such as the fact that Florida and Iowa, where sugar and corn are grown, are electoral swing states), those groups develop an outsized influence over agricultural and trade policy. Similarly, middle-class groups are usually much more willing and able to defend their interests, such as the preservation of the home mortgage tax deduction, than are the poor. This makes such universal entitlements as Social Security or health insurance much easier to defend politically than programs targeting the poor only.
Finally, liberal democracy is almost universally associated with market economies, which tend to produce winners and losers and amplify what James Madison termed the “different and unequal faculties of acquiring property.” This type of economic inequality is not in itself a bad thing, insofar as it stimulates innovation and growth and occurs under conditions of equal access to the economic system. It becomes highly problematic, however, when the economic winners seek to convert their wealth into unequal political influence. They can do so by bribing a legislator or a bureaucrat, that is, on a transactional basis, or, what is more damaging, by changing the institutional rules to favor themselves — for example, by closing off competition in markets they already dominate, tilting the playing field ever more steeply in their favor.
Political decay thus occurs when institutions fail to adapt to changing external circumstances, either out of intellectual rigidities or because of the power of incumbent elites to protect their positions and block change. Decay can afflict any type of political system, authoritarian or democratic. And while democratic political systems theoretically have self-correcting mechanisms that allow them to reform, they also open themselves up to decay by legitimating the activities of powerful interest groups that can block needed change.
This is precisely what has been happening in the United States in recent decades, as many of its political institutions have become increasingly dysfunctional. A combination of intellectual rigidity and the power of entrenched political actors is preventing those institutions from being reformed. And there is no guarantee that the situation will change much without a major shock to the political order….”
Twitter Analytics Project HealthMap Outperforming WHO in Ebola Tracking
HIStalk: “HealthMap, a collaborative data analytics project launched in 2006 between Harvard Medical School and Boston Children’s Hospital, has been quietly tracking the recent Ebola outbreak in Western Africa with notable accuracy, beating the World Health Organization’s own tracking efforts by two weeks in some instances.
HealthMap aggregates information from a variety of online sources to plot real-time disease outbreaks. Currently, the platform analyzes data from the World Health Organization, Google News, and GeoSentinel, a global disease tracking platform that tracks major geography changes in diseases carried through travelers, foreign visitors, and immigrants. The analytics project also got a new source of feeder-data this February when Twitter announced that the HealthMap project had been selected as a Twitter Data Grant recipient, which gives the 45 epidemiologists working on the project access to the “fire hose” of unfiltered data generated from Twitter’s 500 million daily tweets….”
RegData
“RegData, developed by Patrick A. McLaughlin, Omar Al-Ubaydli, and the Mercatus Center at George Mason University, improves dramatically on the existing methods used to quantify regulations. Previous efforts to assess the extent of regulation in the United States have used imprecise variables such as the number of pages published in the Federal Register or the number of new rules created annually. However, not all regulations are equal in their effects on the economy or on different sectors of the economy. One page of regulatory text is often quite different from another page in content and consequence.
RegData improves upon existing metrics of regulation in three principal ways:
- RegData provides a novel measure that quantifies regulations based on the actual content of regulatory text. In other words, RegData examines the regulatory text itself, counting the number of binding constraints or “restrictions”—words that indicate an obligation to comply, such as “shall” or “must.” This is important because some regulatory programs can be hundreds of pages long with relatively few restrictions, while others only have a few paragraphs with a relatively high number of restrictions.
- RegData quantifies regulation by industry. It uses the same industry classes as the North American Industrial Classification System (NAICS), which categorizes and describes each industry in the US economy. Using industry-specific quantifications of regulation, users can examine the growth of regulation relevant to a particular industry over time or compare growth rates across industries.
There are several potential uses of a tool that measures regulation relevant to specific industries. Both the causes and consequences of regulation are likely to differ from one industry to the next, and by quantifying regulations for all industries, individuals can test whether industry characteristics, such as dynamism, unionization, or a penchant for lobbying, are correlated with industry-specific regulation levels.
For example, if someone wanted to know whether high unionization rates are correlated with heavy regulation, the person could compare RegData’s measure of industry-specific regulation for highly unionized industries to industries with little to no unionization.
- *NEW* RegData 2.0 provides the user with the ability to quantify the regulation that specific federal regulators (including agencies, offices, bureaus, commissions, or administrations) have produced. For example, a user can now see how many restrictions a specific administration of the Department of Transportation (e.g., the National Highway Traffic Safety Administration) has produced in each year.”
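The restriction-counting approach described above can be sketched in a few lines. This is a simplified illustration only, not the Mercatus methodology; the term list and the sample passage are assumptions for the example.

```python
import re

# A minimal sketch of RegData-style restriction counting: tally words
# in regulatory text that signal a binding obligation to comply.
# The term list below is an assumption for illustration.

RESTRICTION_TERMS = ["shall", "must", "may not", "prohibited", "required"]

def count_restrictions(text):
    """Count occurrences of restriction terms, matched as whole words."""
    text = text.lower()
    return sum(len(re.findall(r"\b" + re.escape(term) + r"\b", text))
               for term in RESTRICTION_TERMS)

sample = ("The operator shall file an annual report. Disposal of waste "
          "in wetlands is prohibited. Permits may not be transferred.")
print(count_restrictions(sample))  # 3
```

Applied page by page, a count like this distinguishes a long but permissive regulatory program from a short one dense with binding constraints.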
Bloomberg Philanthropies Announces Major New Investment In City Halls' Capacity To Innovate
Press Release: “Bloomberg Philanthropies today announced a new $45 million investment to boost the capacity of city halls to use innovation to tackle major challenges and improve urban life. The foundation will direct significant funding and other assistance to help dozens of cities adopt the Innovation Delivery model, an approach to generating and implementing new ideas that has been tested and refined over the past three years in partnership with city leaders in Atlanta, Chicago, Louisville, Memphis, and New Orleans. …
Innovation Delivery Teams use best-in-class idea generation techniques with a structured, data-driven approach to delivering results. Operating as an in-house innovation consultancy, they have enabled mayors in the original five cities to produce clear results, such as:
- New Orleans reduced murder in 2013 by 19% compared to the previous year, resulting in the lowest number of murders in New Orleans since 1985.
- Memphis reduced retail vacancy rates by 30% along key commercial corridors.
- Louisville redirected 26% of low-severity 911 medical calls to a doctor’s office or immediate care center instead of requiring an ambulance trip to the emergency room.
- Chicago cut the licensing time for new restaurants by 33%; more than 1,000 new restaurants have opened since the Team began its work.
- Atlanta moved 1,022 chronically homeless individuals into permanent housing, quickly establishing itself as a national leader.
“Innovation Delivery has been an essential part of our effort to bring innovation, efficiency and improved services to our customers,” said Louisville Mayor Greg Fischer. “Philanthropy can play an important role in expanding the capacity of cities to deliver better, bolder results. Bloomberg Philanthropies is one of few foundations investing in this area, and it has truly been a game changer for our city.”
In addition to direct investments in cities, Bloomberg Philanthropies will fund technical assistance, research and evaluation, and partnerships with organizations to further spread the Innovation Delivery approach. The Innovation Delivery Playbook, which details the approach and some experiences of the original cities with which Bloomberg Philanthropies partnered, is available at: www.bloomberg.org …”
Technology’s Crucial Role in the Fight Against Hunger
Crowdsourcing, predictive analytics and other new tools could go far toward finding innovative solutions for America’s food insecurity.
National Geographic recently sent three photographers to explore hunger in the United States. It was an effort to give a face to a very troubling statistic: Even today, one-sixth of Americans do not have enough food to eat. Fifty million people in this country are “food insecure” — having to make daily trade-offs among paying for food, housing or medical care — and 17 million of them skip at least one meal a day to get by. When choosing what to eat, many of these individuals must make choices between lesser quantities of higher-quality food and larger quantities of less-nutritious processed foods, the consumption of which often leads to expensive health problems down the road.
This is an extremely serious, but not easily visible, social problem. Nor does the challenge it poses become any easier when poorly designed public-assistance programs continue to count the sauce on a pizza as a vegetable. The deficiencies caused by hunger increase the likelihood that a child will drop out of school, lowering her lifetime earning potential. In 2010 alone, food insecurity cost America $167.5 billion, a figure that includes lost economic productivity, avoidable health-care expenses and social-services programs.
As much as we need specific policy innovations if we are to eliminate hunger in America, food insecurity is just one of many extraordinarily complex and interdependent “systemic” problems facing us that would benefit from the application of technology, not just to identify innovative solutions but to implement them as well. In addition to laudable policy initiatives by such states as Illinois and Nevada, which have made hunger a priority, or Arkansas, which suffers the greatest level of food insecurity but which is making great strides at providing breakfast to schoolchildren, we can — we must — bring technology to bear to create a sustained conversation between government and citizens to engage more Americans in the fight against hunger.
Identifying who is genuinely in need cannot be done as well by a centralized government bureaucracy — even one with regional offices — as it can through a distributed network of individuals and organizations able to pinpoint with on-the-ground accuracy where the demand is greatest. Just as Ushahidi uses crowdsourcing to help locate and identify disaster victims, it should be possible to leverage the crowd to spot victims of hunger. As it stands, attempts to eradicate so-called food deserts are often built around developing solutions for residents rather than with residents. Strategies to date tend to focus on the introduction of new grocery stores or farmers’ markets but with little input from or involvement of the citizens actually affected.
Applying predictive analytics to newly available sources of public as well as private data, such as that regularly gathered by supermarkets and other vendors, could also make it easier to offer coupons and discounts to those most in need. In addition, analyzing nonprofits’ tax returns, which are legally open and available to all, could help map where the organizations serving those in need leave gaps that need to be closed by other efforts. The Governance Lab recently brought together U.S. Department of Agriculture officials with companies that use USDA data in an effort to focus on strategies supporting a White House initiative to use climate-change and other open data to improve food production.
Such innovative uses of technology, which put citizens at the center of the service-delivery process and streamline the delivery of government support, could also speed the delivery of benefits, thus reducing both costs and, every bit as important, the indignity of applying for assistance.
Being open to new and creative ideas from outside government through brainstorming and crowdsourcing exercises using social media can go beyond simply improving the quality of the services delivered. Some of these ideas, such as those arising from exciting new social-science experiments involving the use of incentives for “nudging” people to change their behaviors, might even lead them to purchase more healthful food.
Further, new kinds of public-private collaborative partnerships could create the means for people to produce their own food. Both new kinds of financing arrangements and new apps for managing the shared use of common real estate could make more community gardens possible. Similarly, with the kind of attention, convening and funding that government can bring to an issue, new neighbor-helping-neighbor programs — where, for example, people take turns shopping and cooking for one another to alleviate time away from work — could be scaled up.
Then, too, advances in citizen engagement and oversight could make it more difficult for lawmakers to cave to the pressures of lobbying groups that push for subsidies for those crops, such as white potatoes and corn, that result in our current large-scale reliance on less-nutritious foods. At the same time, citizen scientists reporting data through an app would be able to do a much better job than government inspectors in reporting what is and is not working in local communities.
As a society, we may not yet be able to banish hunger entirely. But if we commit to using new technologies and mechanisms of citizen engagement widely and wisely, we could vastly reduce its power to do harm.
Better Governing Through Data
Editorial Board of the New York Times: “Government bureaucracies, as opposed to casual friendships, are seldom in danger from too much information. That is why a new initiative by the New York City comptroller, Scott Stringer, to use copious amounts of data to save money and solve problems, makes such intuitive sense.
Called ClaimStat, it seeks to collect and analyze information on the thousands of lawsuits and claims filed each year against the city. By identifying patterns in payouts and trouble-prone agencies and neighborhoods, the program is supposed to reduce the cost of claims the way CompStat, the fabled data-tracking program pioneered by the New York Police Department, reduces crime.
There is a great deal of money to be saved: In its 2015 budget, the city has set aside $674 million to cover settlements and judgments from lawsuits brought against it. That amount is projected to grow by the 2018 fiscal year to $782 million, which Mr. Stringer notes is more than the combined budgets of the Departments of Aging and Parks and Recreation and the Public Library.
The comptroller’s office issued a report last month that applied the ClaimStat approach to a handful of city agencies: the Police Department, Parks and Recreation, Health and Hospitals Corporation, Environmental Protection and Sanitation. It notes that the Police Department generates the most litigation of any city agency: 9,500 claims were filed against it in 2013, leading to settlements and judgments of $137.2 million.
After adjusting for the crime rate, the report found that several precincts in the South Bronx and Central Brooklyn had far more claims filed against their officers than other precincts in the city. What does that mean? It’s hard to know, but the implications for policy and police discipline would seem to be a challenge that the mayor, police commissioner and precinct commanders need to figure out. The data clearly point to a problem.
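The kind of crime-rate adjustment the report describes can be sketched roughly as follows. This is a simplified illustration, not the comptroller's methodology, and the precinct figures and the flagging rule are invented: divide each precinct's claims by its reported crimes, then flag precincts well above the citywide rate.

```python
# A simplified sketch of the ClaimStat idea (not the comptroller's
# actual methodology); all figures below are invented for illustration.
# Normalize claims by each precinct's crime volume, then flag precincts
# whose per-crime claim rate sits well above the citywide rate.

precincts = {  # precinct -> (claims filed, crimes reported)
    "40th (South Bronx)": (310, 4200),
    "73rd (Brooklyn)":    (295, 4100),
    "19th (Manhattan)":   (60,  2500),
    "112th (Queens)":     (45,  1900),
}

total_claims = sum(claims for claims, _ in precincts.values())
total_crimes = sum(crimes for _, crimes in precincts.values())
citywide_rate = total_claims / total_crimes  # claims per reported crime

# Flag precincts whose per-crime claim rate is 25%+ above citywide.
flagged = sorted(p for p, (claims, crimes) in precincts.items()
                 if claims / crimes > 1.25 * citywide_rate)
print(flagged)  # ['40th (South Bronx)', '73rd (Brooklyn)']
```

The normalization is the whole point: raw claim counts mostly track enforcement volume, while per-crime rates surface the outliers that warrant a closer look.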
Far more obvious conclusions may be reached from ClaimStat data covering issues like park maintenance and sewer overflows. The city’s tree-pruning budget was cut sharply in 2010, and injury claims from fallen tree branches soared. Multimillion-dollar settlements ensued.
The great promise of ClaimStat is making such shortsightedness blindingly obvious. And in exposing problems like persistent flooding from sewer overflows, ClaimStat can pinpoint troubled areas down to the level of city blocks. (We’re looking at you, Canarsie, and Community District 2 on Staten Island.)
Mayor Bill de Blasio’s administration has offered only mild praise for the comptroller’s excellent idea (“the mayor welcomes all ideas to make the city more effective and better able to serve its citizens”) while noting, perhaps a little defensively, that it is already on top of this, at least where the police are concerned. It has created a “Risk Assessment and Compliance Unit” within the Police Department to examine claims and make recommendations. The mayor’s aides also point out that the city’s payouts have remained flat over the last 12 years, for which they credit a smart risk-assessment strategy that knows when to settle claims and when to fight back aggressively in court.
But the aspiration of a well-run city should not be to hold claims even but to shrink them. And, at a time when anecdotes and rampant theorizing are fueling furious debates over police crime-fighting strategies, it seems beyond arguing that the more actual information, independently examined and publicly available, the better.”
An Air-Quality Monitor You Take with You
MIT Technology Review: “A startup is building a wearable air-quality monitor using a sensing technology that can cheaply detect the presence of chemicals around you in real time. By reporting the information its sensors gather to an app on your smartphone, the technology could help people with respiratory conditions and those who live in highly polluted areas keep tabs on exposure.
Berkeley, California-based Chemisense also plans to crowdsource data from users to show places around town where certain compounds are identified.
Initially, the company plans to sell a $150 wristband geared toward kids with asthma—of which there are nearly 7 million in the U.S., according to data from the Centers for Disease Control and Prevention— to help them identify places and pollutants that tend to provoke attacks, and track their exposure to air pollution over time. The company hopes people with other respiratory conditions, and those who are just concerned about air pollution, will be interested, too.
In the U.S., air quality is monitored at thousands of stations across the country; maps and forecasts can be viewed online. But these monitors offer accurate readings only in their location.
Chemisense has not yet made its initial product, but it expects it will be a wristband using polymers treated with charged nanoparticles of carbon such that the polymers swell in the presence of certain chemical vapors, changing the resistance of a circuit.”
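The sensing principle can be sketched in code. This is a hypothetical illustration: the numbers and the linear calibration below are assumptions, and a real device would need multi-sensor arrays and lab-derived calibration curves per chemical.

```python
# Hypothetical sketch of interpreting a chemiresistive reading.
# The swelling polymer changes the circuit's resistance; the relative
# change from a clean-air baseline is mapped to a concentration
# estimate via a per-chemical sensitivity factor (assumed linear here).

def relative_change(r_baseline, r_measured):
    """Fractional change in resistance from the clean-air baseline."""
    return (r_measured - r_baseline) / r_baseline

def estimate_ppm(r_baseline, r_measured, sensitivity):
    """sensitivity: fractional resistance change per ppm of the vapor."""
    return relative_change(r_baseline, r_measured) / sensitivity

# Example: baseline 10 kOhm, reading 10.5 kOhm, sensitivity 0.001/ppm
print(estimate_ppm(10_000, 10_500, 0.001))  # roughly 50 ppm
```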