Digital Humanitarians


New book by Patrick Meier on how big data is changing humanitarian response: “The overflow of information generated during disasters can be as paralyzing to humanitarian response as the lack of information. This flash flood of information, when amplified by social media and satellite imagery, is increasingly referred to as Big Data—or Big Crisis Data. Making sense of Big Crisis Data during disasters is proving an impossible challenge for traditional humanitarian organizations, which explains why they’re increasingly turning to Digital Humanitarians.
Who exactly are these Digital Humanitarians? They’re you, me, all of us. Digital Humanitarians are volunteers and professionals from the world over and from all walks of life. What do they share in common? The desire to make a difference, and they do that by rapidly mobilizing online in collaboration with international humanitarian organizations. They make sense of vast volumes of social media and satellite imagery in virtually real-time to support relief efforts worldwide. How? They craft and leverage ingenious crowdsourcing solutions with trail-blazing insights from artificial intelligence.
In sum, this book charts the sudden and spectacular rise of Digital Humanitarians by sharing their remarkable, real-life stories, highlighting how their humanity coupled with innovative solutions to Big Data is changing humanitarian response forever. Digital Humanitarians will make you think differently about what it means to be humanitarian and will invite you to join the journey online.
Click here to be notified when the book becomes available. For speaking requests, please email Speaking@iRevolution.net.”

Can Government Play Moneyball?


David Bornstein in the New York Times: “…For all the attention it’s getting inside the administration, evidence-based policy-making seems unlikely to become a headline grabber; it lacks emotional appeal. But it does have intellectual heft. And one group that has been doing creative work to give the message broader appeal is Results for America, which has produced useful teaching aids under the banner “Moneyball for Government,” building on the popularity of the book and movie about Billy Beane’s Oakland A’s, and the rise of data-driven decision making in major league baseball. (Watch their video explainers here and here.)
Results for America works closely with leaders across political parties and social sectors to build awareness about evidence-based policy-making — drawing attention to key areas where government could dramatically improve people’s lives by augmenting well-tested models. They are also chronicling efforts by local governments around the country to show how an emerging group of “Geek Cities,” including Baltimore, Denver, Miami, New York, Providence and San Antonio, are using data and evidence to drive improvements in various areas of social policy like education, youth development and employment.
“It seems like common sense to use evidence about what works to get better results,” said Michele Jolin, Results for America’s managing partner. “How could anyone be against it? But the way our system is set up, there are so many loud voices pushing to have dollars spent and policy shaped in the way that works for them. There has been no organized constituency for things that work.”
“The debate in Washington is usually about the quantity of resources,” said David Medina, a partner in Results for America. “We’re trying to bring it back to talking about quality.”
Not everyone will find this change appealing. “When you have a longstanding social service policy, there’s going to be a network of [people and groups] who are organized to keep that money flowing regardless of whether evidence suggests it’s warranted,” said Daniel Stid. “People in social services don’t like to think they’re behaving like other organized interests — like dairy farmers or mortgage brokers — but it leads to tremendous inertia in public policy.”
Beyond the politics, there are practical obstacles to overcome, too. Federal agencies lack sufficient budgets for evaluation or a common definition for what constitutes rigorous evidence. (Any lobbyist can walk into a legislator’s office and claim to have solid data to support an argument.) Up-to-date evidence also needs to be packaged in accessible ways and made available on a timely basis, so it can be used to improve programs, rather than to threaten them. Governments need to build regular evaluations into everything they do — not just conduct big, expensive studies every 10 years or so.
That means developing new ways to conduct quick and inexpensive randomized studies using data that is readily available, said Haskins, who is investigating this approach. “We should be running 10,000 evaluations a year, like they do in medicine.” That’s the only way to produce the rapid trial-and-error learning needed to drive iterative program improvements, he added. (I reported on a similar effort being undertaken by the Coalition for Evidence-Based Policy.)
Results for America has developed a scorecard to rank federal departments on how prepared they are to produce or incorporate evidence in their programs. It looks at whether a department has an office and a leader with the authority and budget to evaluate its programs. It asks: Does it make its data accessible to the public? Does it compile standards about what works and share them widely? Does it spend at least 1 percent of its budget evaluating its programs? And — most important — does it incorporate evidence in its big grant programs? For now, the Department of Education gets the top score.
The stakes are high. In 2011, for example, the Obama administration launched a process to reform Head Start, doing things like spreading best practices and forcing the worst programs to improve or lose their funding. This February, for the third time, the government released a list of Head Start providers (103 out of about 1,600) who will have to recompete for federal funding because of performance problems. That list represents tens of thousands of preschoolers, many of whom are missing out on the education they need to succeed in kindergarten — and life.
Improving flagship programs like Head Start, and others, is not just vital for the families they serve; it’s vital to restore trust in government. “I am a card-carrying member of the Republican Party and I want us to be governed well,” said Robert Shea, who pushed for better program evaluations as associate director of the Office of Management and Budget during the Bush administration, and continues to focus on this issue as chairman of the National Academy of Public Administration. “This is the most promising thing I know of to get us closer to that goal.”
“This idea has the prospect of uniting Democrats and Republicans,” said Haskins. “But it will involve a broad cultural change. It has to get down to the program administrators, board members and local staff throughout the country — so they know that evaluation is crucial to their operations.”
“There’s a deep mistrust of government and a belief that problems can’t be solved,” said Michele Jolin. “This movement will lead to better outcomes — and it will help people regain confidence in their public officials by creating a more effective, more credible way for policy choices to be made.”

Paying Farmers to Welcome Birds


Jim Robbins in The New York Times: “The Central Valley was once one of North America’s most productive wildlife habitats, a 450-mile-long expanse marbled with meandering streams and lush wetlands that provided an ideal stop for migratory shorebirds on their annual journeys from South America and Mexico to the Arctic and back.

Farmers and engineers have long since tamed the valley. Of the wetlands that existed before the valley was settled, about 95 percent are gone, and the number of migratory birds has declined drastically. But now an unusual alliance of conservationists, bird watchers and farmers has joined in an innovative plan to restore essential habitat for the migrating birds.

The program, called BirdReturns, starts with data from eBird, the pioneering citizen science project that asks birders to record sightings on a smartphone app and send the information to the Cornell Lab of Ornithology in upstate New York.

By crunching data from the Central Valley, eBird can generate maps showing where virtually every species congregates in the remaining wetlands. Then, by overlaying those maps on aerial views of existing surface water, it can determine where the birds’ need for habitat is greatest….
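
To make the overlay idea concrete, here is a minimal sketch of that kind of analysis. The grids, numbers and weighting below are invented for illustration; this is not the actual BirdReturns or eBird pipeline.

```python
# Hypothetical sketch: combine a grid of predicted shorebird density with a
# grid of existing surface water and score each cell by how badly extra
# habitat is needed there. All values and the weighting are invented.
import numpy as np

bird_density = np.array([   # relative eBird-style density estimates per cell
    [0.1, 0.4, 0.9, 0.2],
    [0.3, 0.8, 0.7, 0.1],
    [0.6, 0.9, 0.2, 0.0],
    [0.2, 0.5, 0.3, 0.1],
])
surface_water = np.array([  # fraction of each cell already flooded
    [0.9, 0.2, 0.1, 0.8],
    [0.7, 0.1, 0.3, 0.9],
    [0.2, 0.0, 0.6, 0.5],
    [0.8, 0.4, 0.7, 0.9],
])

# Need is highest where many birds are expected but little water remains.
need = bird_density * (1.0 - surface_water)

# Rank cells so the highest-need fields could be offered to farmers first.
order = np.argsort(need, axis=None)[::-1]
top_cells = np.column_stack(np.unravel_index(order[:3], need.shape))
print("Top-priority cells (row, col):")
print(top_cells)
```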

BirdReturns is an example of the growing movement called reconciliation ecology, in which ecosystems dominated by humans are managed to increase biodiversity.

“It’s a new ‘Moneyball,’ ” said Eric Hallstein, an economist with the Nature Conservancy and a designer of the auctions, referring to the book and movie about the Oakland Athletics’ data-driven approach to baseball. “We’re disrupting the conservation industry by taking a new kind of data, crunching it differently and contracting differently.”

The Persistence of Innovation in Government


New Book: “In The Persistence of Innovation in Government, Sanford Borins maps the changing landscape of American public sector innovation in the twenty-first century, largely addressing three key questions:

  • Who innovates?
  • When, why, and how do they do it?
  • What are the persistent obstacles and the proven methods for overcoming them?

Probing both the process and the content of innovation in the public sector, Borins identifies major shifts and important continuities. His examination of public innovation combines several elements: his analysis of the Harvard Kennedy School’s Innovations in American Government Awards program; significant new research on government performance; and a fresh look at the findings of his earlier, highly praised book Innovating with Integrity: How Local Heroes Are Transforming American Government. He also offers a thematic survey of the field’s burgeoning literature, with a particular focus on international comparison.”

Book Review: 'The Rule of Nobody' by Philip K. Howard


Stuart Taylor Jr in the Wall Street Journal: “Amid the liberal-conservative ideological clash that paralyzes our government, it’s always refreshing to encounter the views of Philip K. Howard, whose ideology is common sense spiked with a sense of urgency. In “The Rule of Nobody,” Mr. Howard shows how federal, state and local laws and regulations have programmed officials of both parties to follow rules so detailed, rigid and, often, obsolete as to leave little room for human judgment. He argues passionately that we will never solve our social problems until we abandon what he calls a misguided legal philosophy of seeking to put government on regulatory autopilot. He also predicts that our legal-governmental structure is “headed toward a stall and then a frightening plummet toward insolvency and political chaos.”
Mr. Howard, a big-firm lawyer who heads the nonpartisan government-reform coalition Common Good, is no conventional deregulator. But he warns that the “cumulative complexity” of the dense rulebooks that prescribe “every nuance of how law is implemented” leaves good officials without the freedom to do what makes sense on the ground. Stripped of the authority that they should have, he adds, officials have little accountability for bad results. More broadly, he argues that the very structure of our democracy is so clogged by deep thickets of dysfunctional law that it will only get worse unless conservatives and liberals alike cast off their distrust of human discretion.
The rulebooks should be “radically simplified,” Mr. Howard says, on matters ranging from enforcing school discipline to protecting nursing-home residents, from operating safe soup kitchens to building the nation’s infrastructure: Projects now often require multi-year, 5,000-page environmental impact statements before anything can begin to be constructed. Unduly detailed rules should be replaced by general principles, he says, that take their meaning from society’s norms and values and embrace the need for official discretion and responsibility.
Mr. Howard serves up a rich menu of anecdotes, including both the small-scale activities of a neighborhood and the vast administrative structures that govern national life. After a tree fell into a stream and caused flooding during a winter storm, Franklin Township, N.J., was barred from pulling the tree out until it had spent 12 days and $12,000 for the permits and engineering work that a state environmental rule required for altering any natural condition in a “C-1 stream.” The “Volcker Rule,” designed to prevent banks from using federally insured deposits to speculate in securities, was shaped by five federal agencies and countless banking lobbyists into 963 “almost unintelligible” pages. In New York City, “disciplining a student potentially requires 66 separate steps, including several levels of potential appeals”; meanwhile, civil-service rules make it virtually impossible to terminate thousands of incompetent employees. Children’s lemonade stands in several states have been closed down for lack of a vendor’s license.
Conservatives as well as liberals like detailed rules—complete with tedious forms, endless studies and wasteful legal hearings—because they don’t trust each other with discretion. Corporations like them because they provide not only certainty but also “a barrier to entry for potential competitors,” by raising the cost of doing business to prohibitive levels for small businesses with fresh ideas and other new entrants to markets. Public employees like them because detailed rules “absolve them of responsibility.” And, adds Mr. Howard, “lawsuits [have] exploded in this rules-based regime,” shifting legal power to “self-interested plaintiffs’ lawyers,” who have learned that they “could sue for the moon and extract settlements even in cases (as with some asbestos claims) that were fraudulent.”
So habituated have we become to such stuff, Mr. Howard says, that government’s “self-inflicted ineptitude is accepted as a state of nature, as if spending an average of eight years on environmental reviews—which should be a national scandal—were an unavoidable mountain range.” Common-sensical laws would place outer boundaries on acceptable conduct based on reasonable norms that are “far better at preventing abuse of power than today’s regulatory minefield.”
As Mr. Howard notes, his book is part of a centuries-old rules-versus-principles debate. The philosophers and writers whom he quotes approvingly include Aristotle, James Madison, Isaiah Berlin and Roscoe Pound, a prominent Harvard law professor and dean who condemned “mechanical jurisprudence” and championed broad official discretion. Berlin, for his part, warned against “monstrous bureaucratic machines, built in accordance with the rules that ignore the teeming variety of the living world, the untidy and asymmetrical inner lives of men, and crush them into conformity.” Mr. Howard juxtaposes today’s roughly 100 million words of federal law and regulations with Madison’s warning that laws should not be “so voluminous that they cannot be read, or so incoherent that they cannot be understood.”…

Politics and the Internet


Edited book by William H. Dutton (Routledge, 2014, 1,888 pages): “It is commonplace to observe that the Internet—and the dizzying technologies and applications which it continues to spawn—has revolutionized human communications. But, while the medium’s impact has apparently been immense, the nature of its political implications remains highly contested. To give but a few examples, the impact of networked individuals and institutions has prompted serious scholarly debates in political science and related disciplines on: the evolution of ‘e-government’ and ‘e-politics’ (especially after recent US presidential campaigns); electronic voting and other citizen participation; activism; privacy and surveillance; and the regulation and governance of cyberspace.
As research in and around politics and the Internet flourishes as never before, this new four-volume collection from Routledge’s acclaimed Critical Concepts in Political Science series meets the need for an authoritative reference work to make sense of a rapidly growing—and ever more complex—corpus of literature. Edited by William H. Dutton, Director of the Oxford Internet Institute (OII), the collection gathers foundational and canonical work, together with innovative and cutting-edge applications and interventions.
With a full index and comprehensive bibliographies, together with a new introduction by the editor, which places the collected material in its historical and intellectual context, Politics and the Internet is an essential work of reference. The collection will be particularly useful as a database allowing scattered and often fugitive material to be easily located. It will also be welcomed as a crucial tool permitting rapid access to less familiar—and sometimes overlooked—texts. For researchers, students, practitioners, and policy-makers, it is a vital one-stop research and pedagogic resource.”

Eight (No, Nine!) Problems With Big Data


Gary Marcus and Ernest Davis in the New York Times: “BIG data is suddenly everywhere. Everyone seems to be collecting it, analyzing it, making money from it and celebrating (or fearing) its powers. Whether we’re talking about analyzing zillions of Google search queries to predict flu outbreaks, or zillions of phone records to detect signs of terrorist activity, or zillions of airline stats to find the best time to buy plane tickets, big data is on the case. By combining the power of modern computing with the plentiful data of the digital era, it promises to solve virtually any problem — crime, public health, the evolution of grammar, the perils of dating — just by crunching the numbers.

Or so its champions allege. “In the next two decades,” the journalist Patrick Tucker writes in the latest big data manifesto, “The Naked Future,” “we will be able to predict huge areas of the future with far greater accuracy than ever before in human history, including events long thought to be beyond the realm of human inference.” Statistical correlations have never sounded so good.

Is big data really all it’s cracked up to be? There is no doubt that big data is a valuable tool that has already had a critical impact in certain areas. For instance, almost every successful artificial intelligence computer program in the last 20 years, from Google’s search engine to the I.B.M. “Jeopardy!” champion Watson, has involved the substantial crunching of large bodies of data. But precisely because of its newfound popularity and growing use, we need to be levelheaded about what big data can — and can’t — do.

The first thing to note is that although big data is very good at detecting correlations, especially subtle correlations that an analysis of smaller data sets might miss, it never tells us which correlations are meaningful. A big data analysis might reveal, for instance, that from 2006 to 2011 the United States murder rate was well correlated with the market share of Internet Explorer: Both went down sharply. But it’s hard to imagine there is any causal relationship between the two. Likewise, from 1998 to 2007 the number of new cases of autism diagnosed was extremely well correlated with sales of organic food (both went up sharply), but identifying the correlation won’t by itself tell us whether diet has anything to do with autism.
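
It is easy to see how such a spurious correlation arises. The figures below are toy values loosely mimicking the Internet Explorer example, not the real statistics; the point is only that two unrelated downward trends yield a correlation coefficient near 1.

```python
# Purely illustrative numbers: two unrelated series that both decline
# from 2006 to 2011. These are made-up values, not the actual data.
import numpy as np

murder_rate = np.array([5.8, 5.7, 5.4, 5.0, 4.8, 4.7])            # per 100,000
ie_market_share = np.array([80.0, 75.0, 68.0, 60.0, 52.0, 45.0])  # percent

r = np.corrcoef(murder_rate, ie_market_share)[0, 1]
print(f"Pearson r = {r:.2f}")  # close to +1, yet no causal link exists
```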

Second, big data can work well as an adjunct to scientific inquiry but rarely succeeds as a wholesale replacement. Molecular biologists, for example, would very much like to be able to infer the three-dimensional structure of proteins from their underlying DNA sequence, and scientists working on the problem use big data as one tool among many. But no scientist thinks you can solve this problem by crunching data alone, no matter how powerful the statistical analysis; you will always need to start with an analysis that relies on an understanding of physics and biochemistry.

Third, many tools that are based on big data can be easily gamed. For example, big data programs for grading student essays often rely on measures like sentence length and word sophistication, which are found to correlate well with the scores given by human graders. But once students figure out how such a program works, they start writing long sentences and using obscure words, rather than learning how to actually formulate and write clear, coherent text. Even Google’s celebrated search engine, rightly seen as a big data success story, is not immune to “Google bombing” and “spamdexing,” wily techniques for artificially elevating website search placement.
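
A toy grader of the kind described makes the gaming obvious. The features and weights below are hypothetical, not those of any real scoring product.

```python
# A deliberately naive essay grader: it rewards long sentences and long
# ("sophisticated") words, so it is trivially easy to game.
def essay_score(text: str) -> float:
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    words = text.split()
    if not sentences or not words:
        return 0.0
    avg_sentence_len = len(words) / len(sentences)                  # words per sentence
    long_word_share = sum(len(w) > 7 for w in words) / len(words)   # "sophistication"
    return 0.5 * avg_sentence_len + 50.0 * long_word_share

plain = "The plan failed. It cost too much. People lost trust."
gamed = ("Notwithstanding multitudinous considerations, the aforementioned "
         "undertaking unquestionably demonstrated extraordinarily consequential "
         "ramifications throughout innumerable interconnected circumstances.")

print(round(essay_score(plain), 1))   # low score for the clear, short essay
print(round(essay_score(gamed), 1))   # much higher score for padded verbiage
```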

Fourth, even when the results of a big data analysis aren’t intentionally gamed, they often turn out to be less robust than they initially seem. Consider Google Flu Trends, once the poster child for big data. In 2009, Google reported — to considerable fanfare — that by analyzing flu-related search queries, it had been able to detect the spread of the flu as accurately and more quickly than the Centers for Disease Control and Prevention. A few years later, though, Google Flu Trends began to falter; for the last two years it has made more bad predictions than good ones.

As a recent article in the journal Science explained, one major contributing cause of the failures of Google Flu Trends may have been that the Google search engine itself constantly changes, such that patterns in data collected at one time do not necessarily apply to data collected at another time. As the statistician Kaiser Fung has noted, collections of big data that rely on web hits often merge data that was collected in different ways and with different purposes — sometimes to ill effect. It can be risky to draw conclusions from data sets of this kind.

A fifth concern might be called the echo-chamber effect, which also stems from the fact that much of big data comes from the web. Whenever the source of information for a big data analysis is itself a product of big data, opportunities for vicious cycles abound. Consider translation programs like Google Translate, which draw on many pairs of parallel texts from different languages — for example, the same Wikipedia entry in two different languages — to discern the patterns of translation between those languages. This is a perfectly reasonable strategy, except for the fact that with some of the less common languages, many of the Wikipedia articles themselves may have been written using Google Translate. In those cases, any initial errors in Google Translate infect Wikipedia, which is fed back into Google Translate, reinforcing the error.
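
A heavily simplified model of that feedback loop, with invented counts, shows how an initial error can become self-reinforcing. This is only an illustration of the vicious cycle, not a description of how Google Translate is actually built.

```python
# Toy model: a frequency-based "translator" picks whichever rendering is most
# common in its corpus; if its own output is later re-harvested as training
# text, an early mistake keeps gaining ground. All counts are invented.
from collections import Counter

# The scraped corpus starts out with the wrong rendering slightly ahead.
counts = Counter({"wrong rendering": 55, "correct rendering": 45})

for cycle in range(4):
    choice = counts.most_common(1)[0][0]   # the translator's output
    counts[choice] += 25                   # machine output re-enters the corpus
    print(f"cycle {cycle}: picked '{choice}' -> {dict(counts)}")
# The error is never corrected; each cycle makes it more dominant.
```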

A sixth worry is the risk of too many correlations. If you look 100 times for correlations between two variables, you risk finding, purely by chance, about five bogus correlations that appear statistically significant — even though there is no actual meaningful connection between the variables. Absent careful supervision, the magnitudes of big data can greatly amplify such errors.
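
That arithmetic is easy to check with a quick simulation: run 100 significance tests on pairs of variables that are independent by construction and count how many clear the conventional p < 0.05 bar by chance alone. (The sample size and random seed below are arbitrary choices for the sketch.)

```python
# Multiple-comparisons demonstration: about 5 of 100 tests on unrelated
# variables will look "significant" at p < 0.05 purely by chance.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
false_positives = 0
for _ in range(100):
    x = rng.normal(size=50)
    y = rng.normal(size=50)        # unrelated to x by construction
    r, p = pearsonr(x, y)
    if p < 0.05:
        false_positives += 1

print(f"{false_positives} spurious 'significant' correlations out of 100 tests")
```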

Seventh, big data is prone to giving scientific-sounding solutions to hopelessly imprecise questions. In the past few months, for instance, there have been two separate attempts to rank people in terms of their “historical importance” or “cultural contributions,” based on data drawn from Wikipedia. One is the book “Who’s Bigger? Where Historical Figures Really Rank,” by the computer scientist Steven Skiena and the engineer Charles Ward. The other is an M.I.T. Media Lab project called Pantheon.

Both efforts get many things right — Jesus, Lincoln and Shakespeare were surely important people — but both also make some egregious errors. “Who’s Bigger?” claims that Francis Scott Key was the 19th most important poet in history; Pantheon has claimed that Nostradamus was the 20th most important writer in history, well ahead of Jane Austen (78th) and George Eliot (380th). Worse, both projects suggest a misleading degree of scientific precision with evaluations that are inherently vague, or even meaningless. Big data can reduce anything to a single number, but you shouldn’t be fooled by the appearance of exactitude.

FINALLY, big data is at its best when analyzing things that are extremely common, but often falls short when analyzing things that are less common. For instance, programs that use big data to deal with text, such as search engines and translation programs, often rely heavily on something called trigrams: sequences of three words in a row (like “in a row”). Reliable statistical information can be compiled about common trigrams, precisely because they appear frequently. But no existing body of data will ever be large enough to include all the trigrams that people might use, because of the continuing inventiveness of language.
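
A minimal sketch makes the coverage problem concrete. The tiny “corpus” below is invented purely for demonstration; real corpora are vastly larger yet still miss many ordinary trigrams.

```python
# Trigram coverage illustration: extract all three-word sequences from a
# small invented corpus and check which trigrams of a query never appear.
def trigrams(text: str) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

corpus = trigrams(
    "reliable statistical information can be compiled about common trigrams "
    "because they appear frequently in a row in large bodies of text"
)
query = trigrams("dumbed-down escapist fare never appears in a row here")

unseen = query - corpus
print(("in", "a", "row") in corpus)  # True: a common trigram the corpus covers
print(f"{len(unseen)} of {len(query)} query trigrams are absent from the corpus")
```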

To select an example more or less at random, a book review that the actor Rob Lowe recently wrote for this newspaper contained nine trigrams such as “dumbed-down escapist fare” that had never before appeared anywhere in all the petabytes of text indexed by Google. To witness the limitations that big data can have with novelty, Google-translate “dumbed-down escapist fare” into German and then back into English: out comes the incoherent “scaled-flight fare.” That is a long way from what Mr. Lowe intended — and from big data’s aspirations for translation.

Wait, we almost forgot one last problem: the hype….

Democracy in Retreat


Book by Joshua Kurlantzick (Council on Foreign Relations) on “The Revolt of the Middle Class and the Worldwide Decline of Representative Government”: “Since the end of the Cold War, most political theorists have assumed that as countries develop economically, they will also become more democratic—especially if a vibrant middle class takes root. The triumph of democracy, once limited to a tiny number of states and now spread across the globe, has been considered largely inevitable.
In Democracy in Retreat: The Revolt of the Middle Class and the Worldwide Decline of Representative Government, CFR Fellow for Southeast Asia Joshua Kurlantzick identifies forces that threaten democracy and shows that conventional wisdom has blinded world leaders to a real crisis. “Today a constellation of factors, from the rise of China to the lack of economic growth in new democracies to the West’s financial crisis, has come together to hinder democracy throughout the developing world,” he writes. “Absent radical and unlikely changes in the international system, that combination of antidemocratic factors will have serious staying power.”
Kurlantzick pays particular attention to the revolt of middle class citizens, traditionally proponents of reform, who have turned against democracy in countries such as Venezuela, Pakistan, and Taiwan. He observes that countries once held up as model new democracies, such as Hungary and the Czech Republic, have since curtailed social, economic, and political freedoms. Military coups have grabbed power from Honduras to Thailand to Fiji. The number of representative governments has fallen, and the quality of democracy has deteriorated in many states where it had been making progress, including Russia, Kenya, Argentina, and Nigeria.
The renewed strength of authoritarian rule, warns Kurlantzick, means that billions of people around the world continue to live under repressive regimes.”

Practices of Freedom: Decentred Governance, Conflict and Democratic Participation


New book by Steven Griggs, Aletta J. Norval, and Hendrik Wagenaar: “The shift from government to governance has become a starting point for many studies of contemporary policy-making and democracy. Practices of Freedom takes a different approach, calling into question this dominant narrative and taking the variety, hybridity and dispersion of social and political practices as its focus of analysis. Bringing together leading scholars in democratic theory and critical policy studies, it draws upon new understandings of radical democracy, practice and interpretative analysis to emphasise the productive role of actors and political conflict in the formation and reproduction of contemporary forms of democratic governance. Integrating theoretical dialogues with detailed empirical studies, this book examines spaces for democratisation, institutional design, democratic criteria and learning, whilst mobilising the frameworks of agonistic and aversive democracy, informality and decentred legitimacy in cases from youth engagement to the Israeli-Palestinian conflict.

  • Integrates topics including democratic theory, policy theory and social conflict to offer an interdisciplinary perspective on governance
  • Challenges assumptions about contemporary governance, reconnecting its practices with the wider civil society in which they are embedded
  • Detailed case studies of decentred governance illustrate theoretical points in real-world situations”

Social Media as Government Watchdog


Gordon Crovitz in the Wall Street Journal: “Two new data points for the debate on whether greater access to the Internet leads to more freedom and fewer authoritarian regimes:

According to reports last week, Facebook plans to buy a company that makes solar-powered drones that can hover for years at high altitudes without refueling, which it would use to bring the Internet to parts of the world not yet on the grid. In contrast to this futuristic vision, Russia evoked land grabs of the analog Soviet era by invading Crimea after Ukrainians forced out Vladimir Putin‘s ally as president.
Internet idealists can point to another triumph in helping bring down Ukraine’s authoritarian government. Ukrainian citizens ignored intimidation including officious text messages: “Dear subscriber, you are registered as a participant in a mass disturbance.” Protesters made the most of social media to plan demonstrations and avoid attacks by security forces.
But Mr. Putin quickly delivered the message that social media only goes so far against a fully committed authoritarian. His claim that he had to invade to protect ethnic Russians in Crimea was especially brazen because there had been no loud outcry, on social media or otherwise, among Russian speakers in the region.
A new book reports the state of play on the Internet as a force for freedom. For a decade, Emily Parker, a former Wall Street Journal editorial-page writer and State Department staffer, has researched the role of the Internet in China, Cuba and Russia. The title of her book, “Now I Know Who My Comrades Are,” comes from a blogger in China who explained to Ms. Parker how the Internet helps people discover they are not alone in their views and aspirations for liberty.
Officials in these countries work hard to keep critics isolated and in fear. In Russia, Ms. Parker notes, there is also apathy because the Putin regime seems so entrenched. “Revolutions need a spark, often in the form of a political or economic crisis,” she observes. “Social media alone will not light that spark. What the Internet does create is a new kind of citizen: networked, unafraid, and ready for action.”
Asked about lessons from the invasion of Crimea, Ms. Parker noted that the Internet “chips away at Russia’s control over information.” She added: “Even as Russian state media tries to shape the narrative about Ukraine, ordinary Russians can go online to seek the truth.”
But this same shared awareness may also be accelerating a decline in U.S. influence. In the digital era, U.S. failure to make good on its promises reduces the stature of Washington faster than similar inaction did in the past.
Consider the Hungarian uprising of 1956, the first significant rebellion against Soviet control. The U.S. secretary of state, John Foster Dulles, said: “To all those suffering under communist slavery, let us say you can count on us.” Yet no help came as Soviet tanks rolled into Budapest, tens of thousands were killed, and the leader who tried to secede from the Warsaw Pact, Imre Nagy, was executed.
There were no Facebook posts or YouTube videos instantly showing the result of U.S. fecklessness. In the digital era, scenes of Russian occupation of Crimea are available 24/7. People can watch Mr. Putin’s brazen press conferences and see for themselves what he gets away with.
The U.S. stood by as Syrian civilians were massacred and gassed. There was instant global awareness when President Obama last year backed down from enforcing his “red line” when the Syrian regime used chemical weapons. American inaction in Syria gave Mr. Putin and others around the world a green light to act with impunity.
Just in recent weeks, Iran tried to ship Syrian rockets to Gaza to attack Israel; Moscow announced it would use bases in Cuba, Venezuela and Nicaragua for its navy and bombers; and China budgeted a double-digit increase in military spending as President Obama cut back the U.S. military.
All institutions are more at risk in this era of instant communication and awareness. Reputations get lost quickly, whether it’s a misstep by a company, a gaffe by a politician, or a lack of resolve by an American president.
Over time, the power of the Internet to bring people together will help undermine authoritarian governments. But as Mr. Putin reminds us, in the short term a peaceful world depends more on a U.S. resolute in using its power and influence to deter aggression.”