Technological Innovations and Future Shifts in International Politics


Paper by Askar Akaev and Vladimir Pantin in International Studies Quarterly: “How are large technological changes and important shifts in international politics interconnected? It is shown in the article that primary technological innovations, which take place in each Kondratieff cycle, change the balance of power between the leading states and cause shifts in international politics. In the beginning of the twenty-first century, the genesis and initial development of the cluster of new technologies takes place in periods of crisis and depression. Therefore, the authors forecast that the period 2013–2020 will be marked by the advancement of important technological innovations and massive geopolitical shifts in many regions of the world.”

Who Influences Whom? Reflections on U.S. Government Outreach to Think Tanks


Jeremy Shapiro at Brookings: “The U.S. government makes a big effort to reach out to important think tanks, often through the little noticed or understood mechanism of small, private and confidential roundtables. Indeed, for the ambitious Washington think-tanker nothing quite gets the pulse racing like the idea of attending one of these roundtables with the most important government officials. The very occasion is full of intrigue and ritual.

When the Government Calls for Advice

First, an understated e-mail arrives from some polite underling inviting you in to a “confidential, off-the-record” briefing with some official with an impressive title—a deputy secretary or a special assistant to the president, maybe even (heaven forfend) the secretary of state or the national security advisor. The thinker’s heart leaps, “they read my article; they finally see the light of my wisdom, I will probably be the next national security advisor.”
He clears his schedule of any conflicting brown bags on separatism in South Ossetia and, after a suitable interval to keep the government guessing as to his availability, replies that he might be able to squeeze it into his schedule. Citizenship data and social security numbers are provided for security purposes, times are confirmed and ground rules are established in a multitude of emails with a seemingly never-ending array of staffers, all of whose titles include the word “special.” The thinker says nothing directly to his colleagues, but searches desperately for opportunities to obliquely allude to the meeting: “I’d love to come to your roundtable on uncovered interest rate parity, but I unfortunately have a meeting with the secretary of defense.”
On the appointed day, the thinker arrives early as instructed at an impressively massive and well-guarded government building, clears his way through multiple layers of redundant security, and is ushered into a wood-paneled room that reeks of power and Pine-Sol. (Sometimes it is a futuristic conference room filled with television monitors and clocks that give the time wherever the President happens to be.) Nameless peons in sensible suits clutch government-issue notepads around the outer rim of the room as the thinker takes his seat at the center table, only somewhat disappointed to see so many other familiar thinkers in the room—including some to whom he had been obliquely hinting about the meeting the day before.
At the appointed hour, an officious staffer arrives to announce that “He” (the lead government official goes only by a personal pronoun—names are unnecessary at this level) is unfortunately delayed at another meeting on the urgent international crisis of the day, but will arrive just as soon as he can break away from the president in the Situation Room. He is, in fact, just reading email, but his long career has taught him the advantage of making people wait.
After 15 minutes of stilted chit-chat with colleagues that the thinker has the misfortune to see at virtually every event he attends in Washington, the senior government official strides calmly into the room, plops down at the head of the table and declares solemnly what an honor it is to have such distinguished experts to help with this critical area of policy. He very briefly details how very hard the U.S. government is working on this highest priority issue and declares that “we are in listening mode and are anxious to hear your sage advice.” A brave thinker raises his hand and speaks truth to power by reciting the thesis of his latest article. From there, the group is off to the races as the thinkers each struggle to get into the conversation and rehearse their well-worn positions.
Forty-three minutes later, the thinkers’ “hour” is up because, the officious staffer interjects, “He” must attend a Principals Committee meeting. The senior government official thanks the experts for coming, compliments them on their fruitful ideas and their full and frank debate, instructs a nameless peon at random to assemble “what was learned here” for distribution in “the building” and strides purposefully out of the room.
The pantomime then ends and the thinker retreats to his office to continue his thoughts. But what precisely has happened behind the rituals? Have we witnessed the vaunted academic-government exchange that Washington is so famous for? Is this how fresh ideas re-invigorate stale government groupthink?…”

Blueprint on "The Open Data Era in Health and Social Care"


The GovLab Press Release: “NHS England and The Governance Lab at NYU (The GovLab) have today launched a blueprint – The Open Data Era in Health and Social Care – for accelerating the use of open data in health and care settings.
The availability of open data can empower citizens and help care providers, patients and researchers make better decisions, spur new innovations and identify efficiencies. The report was commissioned by NHS England and written by The GovLab, part of New York University and a world leader in the field of open data usage. It puts forward a proposal for how the health and care system can maximise the impact of sharing open data through establishing priorities and clear ways of measuring benefits.
Tim Kelsey, National Director for Patients and Information for NHS England, said:
“There’s an urgent need for the NHS to use better information and evidence to guide decision-making and investment. We know with scientific and medical research, the rate of discovery is accelerated by better access to data. This report will kick off a conversation about how we can use open data in the NHS to build a meaningful evidence base to support better investment in health and care services. Over the coming months, I’m keen to hear the views of colleagues on how we can take this forward and build an evidence base to improve outcomes for patients.”
Stefaan Verhulst, Co-founder and Chief Research and Development Officer of the GovLab:
“The blueprint lays out a detailed plan to start a conversation about how to gather the evidence needed to understand and assess the shape and size of the impact of open health data. It is important to pay a comparable level of attention to an analysis of open data’s potential benefits, as well as potential risks.”
Download the full report: thegovlab.org/nhs

US Secret Service seeks Twitter sarcasm detector


BBC: “The agency has put out a work tender looking for a software system to analyse social media data.
The software should have, among other things, the “ability to detect sarcasm and false positives”.
A spokesman for the service said it currently used the Federal Emergency Management Agency’s Twitter analytics and needed its own, adding: “We aren’t looking solely to detect sarcasm.”
The Washington Post quoted Ed Donovan as saying: “Our objective is to automate our social media monitoring process. Twitter is what we analyse.
“This is real-time stream analysis. The ability to detect sarcasm and false positives is just one of 16 or 18 things we are looking at.”…
The tender was put out earlier this week on the US government’s Federal Business Opportunities website.
It sets out the objectives of automating social media monitoring and “synthesising large sets of social media data”.
Specific requirements include “audience and geographic segmentation” and analysing “sentiment and trend”.
The software also has to have “compatibility with Internet Explorer 8”. The browser was released more than five years ago.
The agency does not detail the purpose of the analysis but does set out its mission, which includes “preserving the integrity of the economy and protecting national leaders and visiting heads of state and government”.
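To make concrete what “sentiment and trend” analysis with a sarcasm flag might involve, here is a minimal illustrative sketch in Python. It is not the Secret Service’s system, and the lexicons, cues, and `score_tweet` helper are hypothetical; production tools use trained statistical models rather than keyword rules.

```python
# Toy rule-based sentiment scorer with a naive sarcasm flag.
# Illustrative only: lexicons, cues, and thresholds are hypothetical.

POSITIVE = {"great", "love", "awesome", "good", "nice"}
NEGATIVE = {"terrible", "hate", "awful", "bad", "worst"}
SARCASM_CUES = ("#sarcasm", "#not", "yeah right", "sure thing")

def score_tweet(text: str) -> dict:
    """Return a crude sentiment score and a sarcasm flag for one tweet."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    sentiment = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    lowered = text.lower()
    # Flag likely sarcasm: an explicit cue, or positive wording paired with
    # exaggerated punctuation or all-caps shouting ("GREAT service!!!").
    sarcastic = any(cue in lowered for cue in SARCASM_CUES) or (
        sentiment > 0 and ("!!!" in text or text.isupper())
    )
    return {"text": text, "sentiment": sentiment, "sarcastic": sarcastic}

# A stand-in for a real-time stream: score each incoming tweet in turn.
for tweet in [
    "I love waiting two hours in the security line #not",
    "Great job, team!",
    "Worst. Service. Ever.",
]:
    print(score_tweet(tweet))
```

Even this toy example shows why the tender calls out “false positives”: a genuinely enthusiastic tweet with emphatic punctuation would trip the same crude cues as a sarcastic one.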

OSTP’s Own Open Government Plan


Nick Sinai and Corinna Zarek: “The White House Office of Science and Technology Policy (OSTP) today released its 2014 Open Government Plan. The OSTP plan highlights three flagship efforts as well as the team’s ongoing work to embed the open government principles of transparency, participation, and collaboration into its activities.
OSTP advises the President on the effects of science and technology on domestic and international affairs. The work of the office includes policy efforts encompassing science, environment, energy, national security, technology, and innovation. This plan builds off of the 2010 and 2012 Open Government Plans, updating progress on past initiatives and adding new subject areas based on 2014 guidance.
Agencies began releasing biennial Open Government Plans in 2010, with direction from the 2009 Open Government Directive. These plans serve as a roadmap for agency openness efforts, explaining existing practices and announcing new endeavors to be completed over the coming two years. Agencies build these plans in consultation with civil society stakeholders and the general public. Open government is a vital component of the President’s Management Agenda and our overall effort to ensure the government is expanding economic growth and opportunity for all Americans.
OSTP’s 2014 flagship efforts include:

  • Access to Scientific Collections: OSTP is leading agencies in developing policies that will improve the management of and access to scientific collections that agencies own or support. Scientific collections are assemblies of physical objects that are valuable for research and education—including drilling cores from the ocean floor and glaciers, seeds, space rocks, cells, mineral samples, fossils, and more. Agency policies will help make scientific collections and information about scientific collections more transparent and accessible in the coming years.
  • We the Geeks: We the Geeks Google+ Hangouts feature informal conversations with experts to highlight the future of science, technology, and innovation in the United States. Participants can join the conversation on Twitter by using the hashtag #WeTheGeeks and asking questions of the presenters throughout the hangout.
  • “All Hands on Deck” on STEM Education: OSTP is helping lead President Obama’s commitment to an “all-hands-on-deck approach” to providing students with skills they need to excel in science, technology, engineering, and math (STEM). In support of this goal, OSTP is bringing together government, industry, non-profits, philanthropy, and others to expand STEM education engagement and awareness through events like the annual White House Science Fair and the upcoming White House Maker Faire.

OSTP looks forward to implementing the 2014 Open Government Plan over the coming two years to continue building on its strong tradition of transparency, participation, and collaboration—with and for the American people.”

Cataloging the World


New book on “Paul Otlet and the Birth of the Information Age”: “The dream of capturing and organizing knowledge is as old as history. From the archives of ancient Sumeria and the Library of Alexandria to the Library of Congress and Wikipedia, humanity has wrestled with the problem of harnessing its intellectual output. The timeless quest for wisdom has been as much about information storage and retrieval as creative genius.
In Cataloging the World, Alex Wright introduces us to a figure who stands out in the long line of thinkers and idealists who devoted themselves to the task. Beginning in the late nineteenth century, Paul Otlet, a librarian by training, worked at expanding the potential of the catalog card, the world’s first information chip. From there followed universal libraries and museums, connecting his native Belgium to the world by means of a vast intellectual enterprise that attempted to organize and code everything ever published. Forty years before the first personal computer and fifty years before the first browser, Otlet envisioned a network of “electric telescopes” that would allow people everywhere to search through books, newspapers, photographs, and recordings, all linked together in what he termed, in 1934, a réseau mondial–essentially, a worldwide web.
Otlet’s life achievement was the construction of the Mundaneum–a mechanical collective brain that would house and disseminate everything ever committed to paper. Filled with analog machines such as telegraphs and sorters, the Mundaneum–what some have called a “Steampunk version of hypertext”–was the embodiment of Otlet’s ambitions. It was also short-lived. By the time the Nazis, who were pilfering libraries across Europe to collect information they thought useful, carted away Otlet’s collection in 1940, the dream had ended. Broken, Otlet died in 1944.
Wright’s engaging intellectual history gives Otlet his due, restoring him to his proper place in the long continuum of visionaries and pioneers who have struggled to classify knowledge, from H.G. Wells and Melvil Dewey to Vannevar Bush, Ted Nelson, Tim Berners-Lee, and Steve Jobs. Wright shows that in the years since Otlet’s death the world has witnessed the emergence of a global network that has proved him right about the possibilities–and the perils–of networked information, and his legacy persists in our digital world today, captured for all time…”

Open Data Is Open for Business


Jeffrey Stinson at Stateline: “Last month, web designer Sean Wittmeyer and colleague Wojciech Magda walked away with a $25,000 prize from the state of Colorado for designing an online tool to help businesses decide where to locate in the state.
The tool, called “Beagle Score,” is a widget that can be embedded in online commercial real estate listings. It can rate a location by taxes and incentives, zoning, even the location of possible competitors – all derived from about 30 data sets posted publicly by the state of Colorado and its municipalities.
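As a rough illustration of how such a tool can fold many open data sets into one rating, here is a hedged sketch in Python. The factor names, weights, and `location_score` function are hypothetical, not Beagle Score’s actual method; they only show the general shape of a weighted composite score.

```python
# Hypothetical composite location score built from open-data-derived factors.
# Weights and factor names are illustrative, not Beagle Score's real formula.

WEIGHTS = {
    "tax_burden": -0.4,          # heavier tax burden lowers the score
    "incentives": 0.3,           # available incentives raise it
    "zoning_fit": 0.2,           # how well zoning matches the business type
    "competitor_density": -0.1,  # nearby competitors lower it
}

def location_score(factors: dict) -> float:
    """Combine factors (each normalized to 0..1) into a 0..100 rating."""
    raw = sum(WEIGHTS[name] * value for name, value in factors.items())
    # Rescale the weighted sum from its possible range onto 0..100.
    lo = sum(w for w in WEIGHTS.values() if w < 0)
    hi = sum(w for w in WEIGHTS.values() if w > 0)
    return round(100 * (raw - lo) / (hi - lo), 1)

# Example: factors a widget might derive from state and municipal data sets.
print(location_score({
    "tax_burden": 0.35,
    "incentives": 0.8,
    "zoning_fit": 0.9,
    "competitor_density": 0.2,
}))  # -> 76.0
```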
The creation of Beagle Score is an example of how states, cities, counties and the federal government are encouraging entrepreneurs to take raw government data posted on “open data” websites and turn the information into products the public will buy.
“The (Colorado contest) opened up a reason to use the data,” said Wittmeyer, 25, of Fort Collins. “It shows how ‘open data’ can solve a lot of challenges. … And absolutely, we can make it commercially viable. We can expand it to other states, and fairly quickly.”
Open-data advocates, such as President Barack Obama’s former information chief Vivek Kundra, estimate a multibillion-dollar industry can be spawned by taking raw government data files on sectors such as weather, population, energy, housing, commerce or transportation and turning them into products for the public to consume or other industries to pay for.
They can be as simple as mobile phone apps identifying every stop sign you will encounter on a trip to a different town, or as intricate as taking weather and crops data and turning it into insurance policies farmers can buy.

States, Cities Sponsor ‘Hackathons’

At least 39 states and 46 cities and counties have created open-data sites since the federal government, Utah, California and the cities of San Francisco and Washington, D.C., began opening data in 2009, according to the federal site, Data.gov.
Jeanne Holm, the federal government’s Data.gov “evangelist,” said new sites are popping up and new data are being posted almost daily. The city of Los Angeles, for example, opened a portal last week.
In March, Democratic New York Gov. Andrew Cuomo said that in the year since it was launched, his state’s site has grown to some 400 data sets with 50 million records from 45 agencies. Available is everything from horse injuries and deaths at state race tracks to maps of regulated child care centers. The most popular data: top fishing spots in the state.
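For readers curious how developers actually pull records like these, here is a minimal sketch. New York’s portal exposes datasets through the Socrata SODA API as JSON; the dataset ID below is a hypothetical placeholder, not a real catalog entry.

```python
# Minimal sketch of fetching records from an open-data portal (SODA API).
# The dataset ID is a placeholder; real IDs are listed in the portal catalog.
import json
import urllib.request

DATASET_ID = "abcd-1234"  # hypothetical; browse data.ny.gov for actual IDs
url = f"https://data.ny.gov/resource/{DATASET_ID}.json?$limit=5"

with urllib.request.urlopen(url) as resp:
    records = json.load(resp)

for record in records:
    print(record)
```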
State and local governments are sponsoring “hackathons,” “data paloozas,” and challenges like Colorado’s, inviting businesspeople, software developers, entrepreneurs or anyone with a laptop and a penchant for manipulating data to take part. Lexington, Kentucky, had a civic hackathon last weekend. The U.S. Transportation Department and members of the Geospatial Transportation Mapping Association had a three-day data palooza that ended Wednesday in Arlington, Virginia.
The goals of the events vary. Some, like Arlington’s transportation event, solicit ideas for how government can present its data more effectively. Others seek ideas for mining it.
Aldona Valicenti, Lexington’s chief information officer, said many cities want advice on how to use the data to make government more responsive to citizens, and to communicate with them on issues ranging from garbage pickups and snow removal to upcoming civic events.
Colorado and Wyoming had a joint hackathon last month sponsored by Google to help solve government problems. Colorado sought apps that might be useful to state emergency personnel in tracking people and moving supplies during floods, blizzards or other natural disasters. Wyoming sought help in making its tax-and-spend data more understandable and usable by its citizens.
Unless there’s some prize money, hackers may not make a buck from events like these; they participate for fun, curiosity or a sense of public service. But those who create an app that is useful beyond the boundaries of a particular city or state, or one that is commercially valuable to business, can make serious money – just as Beagle Score plans to do. Colorado will hold onto the intellectual property rights to Beagle Score for a year. But Wittmeyer and his partner will be able to profit from extending it to other states.

States Trail in Open Data

Open data is an outgrowth of the e-government movement of the 1990s, in which government computerized more of the data it collected and began making it available on floppy disks.
States often have trailed the federal government or many cities in adjusting to the computer age and in sharing information, said Emily Shaw, national policy manager for the Sunlight Foundation, which promotes transparency in government. The first big push to share came with public accountability, or “checkbook” sites, that show where government gets its revenue and how it spends it.
The goal was to make government more transparent and accountable by offering taxpayers information on how their money was spent.
The Texas Comptroller of Public Accounts site, established in 2007, offers detailed revenue, spending, tax and contracts data. Republican Comptroller Susan Combs’ office said having a one-stop electronic site also has saved taxpayers about $12.3 million in labor, printing, postage and other costs.
Not all states’ checkbook sites are as openly transparent and detailed as Texas’s, Shaw said. Nor are their open-data sites. “There’s so much variation between the states,” she said.
Many state legislatures are working to set policies for releasing data. Since the start of 2010, according to the National Conference of State Legislatures, nine states have enacted open-data laws, and more legislation is pending. But California, for instance, has been posting open data for five years without legislation setting policies.
Just as states have lagged in getting data out to the public, less of it has been turned into commercial use, said Joel Gurin, senior adviser at the Governance Lab at New York University and author of the book “Open Data Now.”
Gurin leads Open Data 500, which identifies firms that have made products from open government data and turned them into regional or national enterprises. In April, it listed 500. It soon may expand. “We’re finding more and more companies every day,” he said…

Humanitarians in the sky


Patrick Meier in the Guardian: “Unmanned aerial vehicles (UAVs) capture images faster, cheaper, and at a far higher resolution than satellite imagery. And as John DeRiggi speculates in “Drones for Development?” these attributes will likely lead to a host of applications in development work. In the humanitarian field that future is already upon us — so we need to take a rights-based approach to advance the discussion, improve coordination of UAV flights, and promote regulation that will ensure safety while supporting innovation.
It was the unprecedentedly widespread use of civilian UAVs following Typhoon Haiyan in the Philippines that opened my eyes to UAV use in post-disaster settings. I was in Manila to support the United Nations’ digital humanitarian efforts and came across new UAV projects every other day.
One team was flying rotary-wing UAVs to search for survivors among vast fields of debris that were otherwise inaccessible. Another flew fixed-wing UAVs around Tacloban to assess damage and produce high-quality digital maps. Months later, UAVs are still being used to support recovery and preparedness efforts. One group is working with local mayors to identify which communities are being overlooked in the reconstruction.
Humanitarian UAVs are hardly new. As far back as 2007, the World Food Program teamed up with the University of Torino to build humanitarian UAVs. But today UAVs are much cheaper, safer, and easier to fly. This means more people own personal UAVs. What distinguishes these small UAVs from traditional remote-control airplanes or helicopters is that they are intelligent. Most can be programmed to fly and land autonomously at designated locations. Newer UAVs also have on-board flight-stabilization features that automatically adapt to changing winds, automated collision-avoidance systems, and standard fail-safe mechanisms.
While I was surprised by the surge in UAV projects in the Philippines, I was troubled that none of these teams were aware of each other and that most were apparently not sharing their imagery with local communities. What happens when even more UAV teams show up following future disasters? Will they be accompanied by droves of drone journalists and “disaster tourists” equipped with personal UAVs? Will we see thousands of aerial disaster pictures and videos uploaded to social media rather than in the hands of local communities? What are the privacy implications? And what about empowering local communities to deploy their own UAVs?
There were many questions but few answers. So I launched the humanitarian UAV network (UAViators) to bridge the worlds of humanitarian professionals and UAV experts to address these questions. Our first priority was to draft a code of conduct for the use of UAVs in humanitarian settings to hold ourselves accountable while educating new UAV pilots before serious mistakes are made…”

Making cities smarter through citizen engagement


Vaidehi Shah at Eco-Business: “Rapidly progressing information communications technology (ICT) is giving rise to an almost infinite range of innovations that can be implemented in cities to make them more efficient and better connected. However, in order for technology to yield sustainable solutions, planners must prioritise citizen engagement and strong leadership.
This was the consensus on Tuesday at the World Cities Summit 2014, where representatives from city and national governments, technology firms and private sector organisations gathered in Singapore to discuss strategies and challenges to achieving sustainable cities in the future.
Laura Ipsen, Microsoft corporate vice president for worldwide public sector, identified globalisation, social media, big data, and mobility as the four major technological trends prevailing in cities today, speaking at the plenary session on “The next urban decade: critical challenges and opportunities”.
Despite these trends, she cautioned, “technology does not build infrastructure, but it does help better engage citizens and businesses through public-private partnerships”.
For example, “LoveCleanStreets”, an online tool developed by Microsoft and partners, enables London residents to report infrastructure problems such as damaged roads or signs, shared Ipsen.
“By engaging citizens through this application, cities can fix problems early, before they get worse,” she said.
In Singapore, the ‘MyWaters’ app of PUB, Singapore’s national water agency, is also a key tool for the government to keep citizens up to date on water quality and safety issues in the country, she added.
Even if governments did not actively develop solutions themselves, simply making the immense amounts of data collected by the city open to businesses and citizens could make a big difference to urban liveability, Mark Chandler, director of the San Francisco Mayor’s Office of International Trade and Commerce, pointed out.
Opening up all of the data collected by San Francisco, for instance, yielded 60 free mobile applications that allow residents to access urban solutions related to public transport, parking, and electricity, among others, he explained. This easy and convenient access to infrastructure and amenities, which are a daily necessity, is integral to “a quality of life that keeps the talented workforce in the city,” Chandler said….”

Measuring Governance: What’s the point?


Alan Hudson at Global Integrity: “Over the last 10-15 years, the fact that governance – the institutional arrangements and relationships that shape how effectively things get done – plays a central role in shaping countries’ development trajectories has become widely acknowledged (see for instance the World Bank’s World Development Report of 2011). This acknowledgement has developed hand-in-hand with determined efforts to measure various aspects of governance.

This emphasis on governance and the efforts made to measure its patterns and understand its dynamics is very welcome. There’s no doubt that governance matters and measuring “governance” and its various dimensions can play a useful role in drawing attention to problems and opportunities, in monitoring compliance with standards, in evaluating efforts to support reform, and in informing decisions about what reforms to implement and how.

But in my experience, discussions about governance and its measurement sometimes gloss over a number of key questions (for a similar argument see the early sections of Matt Andrews’ piece on “Governance indicators can make sense”). These include questions about: what is being measured – “governance” is a multi-faceted and rather woolly concept (see Francis Fukuyama’s 2013 piece on “What is Governance?” and various responses); who is going to use the data that is generated; how that data might have an impact; and what results are being sought.

I’ve noticed this most recently in discussions about the inclusion of “governance” in the post-2015 development framework of goals, targets and indicators. From what I’ve seen, the understandable enthusiasm for ensuring that governance gains a place in the post-2015 framework can lead to discussions that: skate over the fact that the evidence that particular forms of governance – often labelled as “Good Governance” – lead to better development outcomes is patchy; fail to effectively grapple with the fact that a cookie-cutter approach to governance is unlikely to work across diverse contexts; pay little attention to the ways in which the data generated might actually be used to make a difference; and give scant consideration to the needs of those who might use the data, particularly citizens and citizens’ groups.

In my view, a failure to address these issues risks inadvertently weakening the case for paying attention to, and measuring, aspects of governance. As the Overseas Development Institute’s excellent report on “Governance targets and indicators for post-2015” put it, in diplomatic language: “including something as a target or indicator does not automatically lead to its improvement and the prize is not just to find governance targets and indicators that can be ‘measured’. Rather, it may be important to reflect on the pathways through which set targets and indicators are thought to lead to better outcomes and on the incentives that might be generated by different measurement approaches.” (See my working document on “Fiscal Governance and Post-2015” for additional thoughts on the inclusion of governance in the post-2015 framework, including notes toward a theory of change).

More broadly, beyond the confines of the post-2015 debate, the risk – and arguably, in many cases, the reality – is that by paying insufficient attention to some key issues, we end up with a lot of data on various aspects of “governance”, but that data doesn’t get used as much as it might, isn’t very useful for informing context-specific efforts to improve governance, and has limited impact.

To remedy this situation, I’d suggest that any effort to measure aspects of “governance” or to improve the availability, quality, use and impact of governance data (as the Governance Data Alliance is doing – with a Working Group on Problem Statements and Theories of Change) should answer up-front a series of simple questions:

  • Outcomes: What outcome(s) are you interested in? Are you interested in improving governance for its own sake, because you regard a particular type of governance as intrinsically valuable, and/or because you think, for instance, that improving governance will help to improve service delivery and accelerate progress against poverty? (See Nathaniel Heller’s post on “outputs versus outcomes in open government”)
  • Theory: If your interest is not solely based on the intrinsic value you attach to “governance”, which aspects of “governance” do you think matter in terms of the outcomes – e.g. service delivery and/or reduced poverty – that you’re interested in? What’s the theory of change that links governance to development outcomes? Without such a theory, it’s difficult to decide what to measure!
  • Data: In what ways do you think that data about the aspects of governance that you think are important – for intrinsic or extrinsic reasons – will be used to help to drive progress towards the type of governance that you value? To what use might the data be put, by whom, to do what? Or, from the perspective of data-users, what information do they need to take action to improve governance?

Organizations that are involved in generating governance data no doubt spend time considering these questions. But nonetheless, I think there would be value in making that thinking – and information about whether and how the data gets used, and with what effect – explicit….”