Springwise: “Modern technology has turned many of the things we consume from physical objects into pixels on a screen. While this has benefited most of us, visual stimuli and touchscreen devices are of little use to those with sight difficulties. In the past, we’ve seen Yahoo! Japan develop Hands On Search, a project that lets blind kids carry out web searches with 3D printed results. Now Japan’s national mapping agency, GSI, is creating software that will enable those with visual impairments to print out 3D versions of online maps.
The official mapping body for Japan — much like the US Geological Survey — GSI already has paper maps for the blind, using embossed surfaces to mark out roads. It’s now developing a program that is able to do the same thing for digital maps.
The software first differentiates the highways, railway lines and walkways from the rest of the landscape. It then creates a 3D relief model that uses a different texture for each type of feature, so that anyone running a finger along them can tell which feature they are touching. The program also takes contour lines into account, creating accurate topographical representations of a particular area….
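The article gives no implementation details, but the pipeline it describes (classify linear features, give each a distinct tactile height, and raise them above a contour-based terrain relief) can be sketched roughly as follows. This is a minimal illustration in Python, not GSI's software; the grid size, ridge heights, and placeholder elevation data are all assumptions.

```python
# Minimal sketch (not GSI's actual pipeline): build a tactile heightmap in which
# terrain elevation forms the base relief and roads/railways/walkways become
# raised ridges with distinct heights so each feature type feels different.
# All names and values here are illustrative assumptions.
import numpy as np

GRID = 200                                     # output resolution (cells per side)
elevation = np.random.rand(GRID, GRID) * 5.0   # placeholder DEM, metres (stand-in data)

# Normalise terrain to a 0-3 mm base relief for printing.
base = 3.0 * (elevation - elevation.min()) / (np.ptp(elevation) or 1.0)

# Feature layers: each feature type gets its own ridge height ("texture").
RIDGE_MM = {"road": 1.0, "railway": 1.5, "walkway": 0.6}

def rasterize_polyline(points, grid_size):
    """Return the set of grid cells a polyline passes through (naive sampling)."""
    cells = set()
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for t in np.linspace(0.0, 1.0, 200):
            x = int(round(x0 + t * (x1 - x0)))
            y = int(round(y0 + t * (y1 - y0)))
            if 0 <= x < grid_size and 0 <= y < grid_size:
                cells.add((y, x))
    return cells

features = {                                   # toy map features in grid coordinates
    "road":    [(10, 10), (150, 60), (190, 190)],
    "railway": [(0, 100), (199, 100)],
    "walkway": [(50, 180), (120, 20)],
}

relief = base.copy()
for kind, polyline in features.items():
    for cell in rasterize_polyline(polyline, GRID):
        relief[cell] = base[cell] + RIDGE_MM[kind]

np.save("tactile_relief_mm.npy", relief)       # hand off to a mesh/slicer step
```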
Website: www.gsi.go.jp”
Killer Apps in the Gigabit Age
New Pew report by Lee Rainie, Janna Anderson and Jennifer Connolly: “The age of gigabit connectivity is dawning and will advance in coming years. The only question is how quickly it might become widespread. A gigabit connection can deliver 1,000 megabits of information per second (Mbps). Globally, cloud service provider Akamai reports that the average global connection speed in the first quarter of 2014 was 3.9 Mbps, with South Korea reporting the highest average connection speed, 23.6 Mbps, and the US at 10.5 Mbps.1
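To put those figures in perspective, a few lines of arithmetic show how long a single large download would take at each quoted speed; the 5 GB file size is an arbitrary example, not from the report.

```python
# Back-of-the-envelope arithmetic for the speeds quoted above.
SPEEDS_MBPS = {
    "global average (Q1 2014)": 3.9,
    "US average": 10.5,
    "South Korea average": 23.6,
    "gigabit connection": 1000.0,
}
FILE_GB = 5                          # illustrative file size (assumed)
file_megabits = FILE_GB * 8 * 1000   # 1 GB is roughly 8,000 megabits

for name, mbps in SPEEDS_MBPS.items():
    seconds = file_megabits / mbps
    print(f"{name:28s} {mbps:7.1f} Mbps -> {seconds / 60:6.1f} minutes")
```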
In some respects, gigabit connectivity is not a new development. The US scientific community has been using hyper-fast networks for several years, changing the pace of data sharing and enabling levels of collaboration in scientific disciplines that were unimaginable a generation ago.
Gigabit speeds for the “average Internet user” are just arriving in select areas of the world. In the US, Google ran a competition in 2010 for communities to pitch themselves for the construction of the first Google Fiber network running at 1 gigabit per second—Internet speeds 50-100 times faster than the majority of Americans now enjoy. Kansas City was chosen from among 1,100 entrants, and residents are now signing up for the service. The firm has announced plans to build a gigabit network in Austin, Texas, and perhaps 34 other communities. In response, AT&T has said it expects to begin building gigabit networks in up to 100 US cities.2 The cities of Chattanooga, Tennessee; Lafayette, Louisiana; and Bristol, Virginia, have super speedy networks, and pockets of gigabit connectivity are in use in parts of Las Vegas, Omaha, Santa Monica, and several Vermont communities.3 There are also other regional efforts: Falcon Broadband in Colorado Springs, Colorado; Brooklyn Fiber in New York; Monkey Brains in San Francisco; MINET Fiber in Oregon; Wicked Fiber in Lawrence, Kansas; and Sonic.net in California, among others.4 NewWave expects to launch gigabit connections in 2015 in Poplar Bluff, Missouri, and in Monroe, Rayville, Delhi, and Tallulah, Louisiana; Suddenlink Communications has launched Operation GigaSpeed.5
In 2014, Google and Verizon were among the innovators announcing that they are testing the capabilities for currently installed fiber networks to carry data even more efficiently—at 10 gigabits per second—to businesses that handle large amounts of Internet traffic.
To explore the possibilities of the next leap in connectivity we asked thousands of experts and Internet builders to share their thoughts about likely new Internet activities and applications that might emerge in the gigabit age. We call this a canvassing because it is not a representative, randomized survey. Its findings emerge from an “opt in” invitation to experts, many of whom play active roles in Internet evolution as technology builders, researchers, managers, policymakers, marketers, and analysts. We also invited comments from those who have made insightful predictions to our previous queries about the future of the Internet. (For more details, please see the section “About this Canvassing of Experts.”)…”
CityBeat: Visualizing the Social Media Pulse of the City
“CityBeat is an academic research project to develop an application that sources, monitors and analyzes hyper-local information from multiple social media platforms such as Instagram and Twitter in real time.
This project was led by researchers at the Jacobs Institute at Cornell Tech, in collaboration with The New York World (Columbia Journalism School), Rutgers University, NYU, and Columbia University….
If you are interested in the technical details, we have published several papers detailing the process of building CityBeat. Enjoy your read!
Xia, C., Schwartz, R., Xie, K., Krebs, A., Langdon, A., Ting, J., and Naaman, M. CityBeat: Real-time Social Media Visualization of Hyper-local City Data. In Proceedings of WWW 2014, Seoul, Korea, April 2014. [PDF]
Xie, K., Xia, C., Grinberg, N., Schwartz, R., and Naaman, M. Robust Detection of Hyper-local Events from Geotagged Social Media Data. In Proceedings of the 13th Workshop on Multimedia Data Mining at KDD, 2013. [PDF]
Schwartz, R., Naaman, M., and Matni, Z. Making Sense of Cities Using Social Media: Requirements for Hyper-Local Data Aggregation Tools. In Proceedings of WCMCW at ICWSM 2013, Boston, USA, July 2013. [PDF]
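The papers above describe CityBeat's actual methods; purely as an illustration of the underlying idea of hyper-local event detection, the sketch below buckets geotagged posts into grid cells and flags cells whose current hourly volume far exceeds their historical average. The cell size, threshold, and record format are assumptions, not the project's.

```python
# Minimal illustration (not the CityBeat implementation) of hyper-local event
# detection: bucket geotagged posts into (grid cell, hour) counts and flag
# cells whose current volume is well above their historical mean.
from collections import defaultdict
from statistics import mean, pstdev

def cell(lat, lon, size=0.01):             # ~1 km grid cells (assumed size)
    return (round(lat / size), round(lon / size))

def detect_spikes(posts, current_hour, z_threshold=3.0):
    """posts: iterable of (lat, lon, hour) tuples; returns spiking cells."""
    history = defaultdict(lambda: defaultdict(int))   # cell -> hour -> count
    for lat, lon, hour in posts:
        history[cell(lat, lon)][hour] += 1

    spikes = []
    for c, counts in history.items():
        past = [v for h, v in counts.items() if h != current_hour]
        now = counts.get(current_hour, 0)
        if len(past) < 3:                  # not enough history to judge
            continue
        mu, sigma = mean(past), pstdev(past) or 1.0
        if (now - mu) / sigma > z_threshold:
            spikes.append((c, now, mu))    # cell, current count, historical mean
    return spikes
```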
Data Mining Reveals How Social Coding Succeeds (And Fails)
Emerging Technology from the arXiv: “Collaborative software development can be hugely successful or fail spectacularly. An analysis of the metadata associated with these projects is teasing apart the difference….
The process of developing software has undergone huge transformation in the last decade or so. One of the key changes has been the evolution of social coding websites, such as GitHub and BitBucket.
These allow anyone to start a collaborative software project that other developers can contribute to on a voluntary basis. Millions of people have used these sites to build software, sometimes with extraordinary success.
Of course, some projects are more successful than others. And that raises an interesting question: what are the differences between successful and unsuccessful projects on these sites?
Today, we get an answer from Yuya Yoshikawa at the Nara Institute of Science and Technology in Japan and a couple of pals at the NTT Laboratories, also in Japan. These guys have analysed the characteristics of over 300,000 collaborative software projects on GitHub to tease apart the factors that contribute to success. Their results provide the first insights into social coding success from this kind of data mining.
A social coding project begins when a group of developers outline a project and begin work on it. These are the “internal developers,” who have the power to update the software in a process known as a “commit”. The number of commits is a measure of the activity on the project.
External developers can follow the progress of the project by “starring” it, a form of bookmarking on GitHub. The number of stars is a measure of the project’s popularity. These external developers can also request changes, such as additional features and so on, in a process known as a pull request.
Yoshikawa and co begin by downloading the data associated with over 300,000 projects from the GitHub website. This includes the number of internal developers, the number of stars a project receives over time and the number of pull requests it gets.
The team then analyse the effectiveness of the project by calculating factors such as the number of commits per internal team member, the popularity of the project over time, the number of pull requests that are fulfilled and so on.
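The paper's exact feature definitions are not reproduced here, but a sketch of the kind of per-project metrics just described (commits per internal developer, popularity over time, share of pull requests fulfilled) might look like this; the record format is an assumption for illustration only.

```python
# A sketch of per-project effectiveness metrics of the kind the study computes
# (assumed record format; the paper's actual features and definitions may differ).
def project_metrics(project):
    """project: dict with 'commits', 'internal_devs', 'stars_by_month',
    and 'pull_requests' (each PR a dict with a 'merged' flag)."""
    n_internal = max(len(project["internal_devs"]), 1)
    prs = project["pull_requests"]
    merged = sum(1 for pr in prs if pr["merged"])
    return {
        "activity": len(project["commits"]) / n_internal,    # commits per internal dev
        "popularity": sum(project["stars_by_month"]),          # total stars
        "popularity_trend": project["stars_by_month"][-1]
                            - project["stars_by_month"][0],    # growth over the window
        "pr_fulfillment": merged / len(prs) if prs else 0.0,   # share of PRs merged
        "team_size": n_internal,
    }

example = {
    "commits": ["c1", "c2", "c3", "c4"],
    "internal_devs": ["alice", "bob"],
    "stars_by_month": [2, 5, 11, 20],
    "pull_requests": [{"merged": True}, {"merged": False}, {"merged": True}],
}
print(project_metrics(example))
```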
The results provide a fascinating insight into the nature of social coding. Yoshikawa and co say the number of internal developers on a project plays a significant role in its success. “Projects with larger numbers of internal members have higher activity, popularity and sociality,” they say….
Ref: arxiv.org/abs/1408.6012: Collaboration on Social Media: Analyzing Successful Projects on Social Coding”
Rethinking Democracy
Dani Rodrik at Project Syndicate: “By many measures, the world has never been more democratic. Virtually every government at least pays lip service to democracy and human rights. Though elections may not be free and fair, massive electoral manipulation is rare and the days when only males, whites, or the rich could vote are long gone. Freedom House’s global surveys show a steady increase from the 1970s in the share of countries that are “free” – a trend that the late Harvard political scientist Samuel Huntington dubbed the “third wave” of democratization….
A true democracy, one that combines majority rule with respect for minority rights, requires two sets of institutions. First, institutions of representation, such as political parties, parliaments, and electoral systems, are needed to elicit popular preferences and turn them into policy action. Second, democracy requires institutions of restraint, such as an independent judiciary and media, to uphold fundamental rights like freedom of speech and prevent governments from abusing their power. Representation without restraint – elections without the rule of law – is a recipe for the tyranny of the majority.
Democracy in this sense – what many call “liberal democracy” – flourished only after the emergence of the nation-state and the popular upheaval and mobilization produced by the Industrial Revolution. So it should come as no surprise that the crisis of liberal democracy that many of its oldest practitioners currently are experiencing is a reflection of the stress under which the nation-state finds itself….
In developing countries, it is more often the institutions of restraint that are failing. Governments that come to power through the ballot box often become corrupt and power-hungry. They replicate the practices of the elitist regimes they replaced, clamping down on the press and civil liberties and emasculating (or capturing) the judiciary. The result has been called “illiberal democracy” or “competitive authoritarianism.” Venezuela, Turkey, Egypt, and Thailand are some of the better-known recent examples.
When democracy fails to deliver economically or politically, perhaps it is to be expected that some people will look for authoritarian solutions. And, for many economists, delegating economic policy to technocratic bodies in order to insulate them from the “folly of the masses” almost always is the preferred approach.
…
Effective institutions of restraint do not emerge overnight; and it might seem like those in power would never want to create them. But if there is some likelihood that I will be voted out of office and that the opposition will take over, such institutions will protect me from others’ abuses tomorrow as much as they protect others from my abuses today. So strong prospects for sustained political competition are a key prerequisite for illiberal democracies to turn into liberal ones over time.
Optimists believe that new technologies and modes of governance will resolve all problems and send democracies centered on the nation-state the way of the horse-drawn carriage. Pessimists fear that today’s liberal democracies will be no match for the external challenges mounted by illiberal states like China and Russia, which are guided only by hardnosed realpolitik. Either way, if democracy is to have a future, it will need to be rethought.”
Monitoring Arms Control Compliance With Web Intelligence
Chris Holden and Maynard Holliday at Commons Lab: “Traditional monitoring of arms control treaties, agreements, and commitments has required the use of National Technical Means (NTM)—large satellites, phased array radars, and other technological solutions. NTM was a good solution when the treaties focused on large items for observation, such as missile silos or nuclear test facilities. As the targets of interest have shrunk by orders of magnitude, the need for other, more ubiquitous, sensor capabilities has increased. The rise in web-based, or cloud-based, analytic capabilities will have a significant influence on the future of arms control monitoring and the role of citizen involvement.
Since 1999, the U.S. Department of State has had at its disposal the Key Verification Assets Fund (V Fund), which was established by Congress. The Fund helps preserve critical verification assets and promotes the development of new technologies that support the verification of and compliance with arms control, nonproliferation, and disarmament requirements.
Sponsored by the V Fund to advance web-based analytic capabilities, Sandia National Laboratories, in collaboration with Recorded Future (RF), synthesized open-source data streams from a wide variety of traditional and nontraditional web sources in multiple languages along with topical texts and articles on national security policy to determine the efficacy of monitoring chemical and biological arms control agreements and compliance. The team used novel technology involving linguistic algorithms to extract temporal signals from unstructured text and organize that unstructured text into a multidimensional structure for analysis. In doing so, the algorithm identifies the underlying associations between entities and events across documents and sources over time. Using this capability, the team analyzed several events that could serve as analogs to treaty noncompliance, technical breakout, or an intentional attack. These events included the H7N9 bird flu outbreak in China, the Shanghai pig die-off and the fungal meningitis outbreak in the United States last year.
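The Sandia/Recorded Future algorithms themselves are not public, so the following is only a schematic sketch of the general idea described here: find dated sentences that mention watched terms and aggregate the mentions into a per-day signal. The date format, watch list, and sentence splitting are simplifying assumptions.

```python
# Illustrative only: turn unstructured reports into a time-indexed signal by
# counting watched-term mentions in dated sentences.
import re
from collections import Counter

DATE_RE = re.compile(r"\b(20\d{2}-\d{2}-\d{2})\b")             # assumes ISO dates in text
WATCH_TERMS = {"h7n9", "outbreak", "hospitalized", "fatality"}  # assumed watch list

def daily_signal(documents):
    """documents: iterable of raw text strings; returns {date: mention count}."""
    signal = Counter()
    for doc in documents:
        for sentence in re.split(r"(?<=[.!?])\s+", doc):
            dates = DATE_RE.findall(sentence)
            hits = sum(term in sentence.lower() for term in WATCH_TERMS)
            for date in dates:
                signal[date] += hits
    return dict(signal)

docs = [
    "On 2013-03-31 officials reported an H7N9 outbreak; three patients were hospitalized.",
    "A further fatality was confirmed on 2013-04-02.",
]
print(daily_signal(docs))
```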
For H7N9 we found that open source social media were the first to report the outbreak and give ongoing updates. The Sandia RF system was able to roughly estimate lethality based on temporal hospitalization and fatality reporting. For the Shanghai pig die-off the analysis tracked the rapid assessment by Chinese authorities that H7N9 was not the cause of the pig die-off as had been originally speculated. Open source reporting highlighted a reduced market for pork in China due to the very public dead pig display in Shanghai. Possible downstream health effects were predicted (e.g., contaminated water supply and other overall food ecosystem concerns). In addition, legitimate U.S. food security concerns were raised based on the Chinese purchase of the largest U.S. pork producer (Smithfield) because of a fear of potential import of tainted pork into the United States….
To read the full paper, please click here.”
The Social Laboratory
Shane Harris in Foreign Policy: “…, Singapore has become a laboratory not only for testing how mass surveillance and big-data analysis might prevent terrorism, but for determining whether technology can be used to engineer a more harmonious society….Months after the virus abated, Ho and his colleagues ran a simulation using Poindexter’s TIA ideas to see whether they could have detected the outbreak. Ho will not reveal what forms of information he and his colleagues used — by U.S. standards, Singapore’s privacy laws are virtually nonexistent, and it’s possible that the government collected private communications, financial data, public transportation records, and medical information without any court approval or private consent — but Ho claims that the experiment was very encouraging. It showed that if Singapore had previously installed a big-data analysis system, it could have spotted the signs of a potential outbreak two months before the virus hit the country’s shores. Prior to the SARS outbreak, for example, there were reports of strange, unexplained lung infections in China. Threads of information like that, if woven together, could in theory warn analysts of pending crises.
The RAHS system was operational a year later, and it immediately began “canvassing a range of sources for weak signals of potential future shocks,” one senior Singaporean security official involved in the launch later recalled.
The system uses a mixture of proprietary and commercial technology and is based on a “cognitive model” designed to mimic the human thought process — a key design feature influenced by Poindexter’s TIA system. RAHS, itself, doesn’t think. It’s a tool that helps human beings sift huge stores of data for clues on just about everything. It is designed to analyze information from practically any source — the input is almost incidental — and to create models that can be used to forecast potential events. Those scenarios can then be shared across the Singaporean government and be picked up by whatever ministry or department might find them useful. Using a repository of information called an ideas database, RAHS and its teams of analysts create “narratives” about how various threats or strategic opportunities might play out. The point is not so much to predict the future as to envision a number of potential futures that can tell the government what to watch and when to dig further.
The officials running RAHS today are tight-lipped about exactly what data they monitor, though they acknowledge that a significant portion of “articles” in their databases come from publicly available information, including news reports, blog posts, Facebook updates, and Twitter messages. (“These articles have been trawled in by robots or uploaded manually” by analysts, says one program document.) But RAHS doesn’t need to rely only on open-source material or even the sorts of intelligence that most governments routinely collect: In Singapore, electronic surveillance of residents and visitors is pervasive and widely accepted…”
Sharing Data Is a Form of Corporate Philanthropy
Matt Stempeck in HBR Blog: “Ever since the International Charter on Space and Major Disasters was signed in 1999, satellite companies like DMC International Imaging have had a clear protocol with which to provide valuable imagery to public actors in times of crisis. In a single week this February, DMCii tasked its fleet of satellites on flooding in the United Kingdom, fires in India, floods in Zimbabwe, and snow in South Korea. Official crisis response departments and relevant UN departments can request on-demand access to the visuals captured by these “eyes in the sky” to better assess damage and coordinate relief efforts.
Back on Earth, companies create, collect, and mine data in their day-to-day business. This data has quickly emerged as one of this century’s most vital assets. Public sector and social good organizations may not have access to the same amount, quality, or frequency of data. This imbalance has inspired a new category of corporate giving foreshadowed by the 1999 Space Charter: data philanthropy.
The satellite imagery example is an area of obvious societal value, but data philanthropy holds even stronger potential closer to home, where a wide range of private companies could give back in meaningful ways by contributing data to public actors. Consider two promising contexts for data philanthropy: responsive cities and academic research.
The centralized institutions of the 20th century allowed for the most sophisticated economic and urban planning to date. But in recent decades, the information revolution has helped the private sector speed ahead in data aggregation, analysis, and applications. It’s well known that there’s enormous value in real-time usage of data in the private sector, but there are similarly huge gains to be won in the application of real-time data to mitigate common challenges.
What if sharing economy companies shared their real-time housing, transit, and economic data with city governments or public interest groups? For example, Uber maintains a “God’s Eye view” of every driver on the road in a city.
Imagine combining this single data feed with an entire portfolio of real-time information. An early leader in this space is the City of Chicago’s urban data dashboard, WindyGrid. The dashboard aggregates an ever-growing variety of public datasets to allow for more intelligent urban management.
Over time, we could design responsive cities that react to this data. A responsive city is one where services, infrastructure, and even policies can flexibly respond to the rhythms of its denizens in real-time. Private sector data contributions could greatly accelerate these nascent efforts.
Data philanthropy could similarly benefit academia. Access to data remains an unfortunate barrier to entry for many researchers. The result is that only researchers with access to certain data, such as full-volume social media streams, can analyze and produce knowledge from this compelling information. Twitter, for example, sells access to a range of real-time APIs to marketing platforms, but the price point often exceeds researchers’ budgets. To accelerate the pursuit of knowledge, Twitter has piloted a program called Data Grants offering access to segments of their real-time global trove to select groups of researchers. With this program, academics and other researchers can apply to receive access to relevant bulk data downloads, such as a period of time before and after an election, or a certain geographic area.
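As a rough illustration of how a researcher might work with such a bulk download, the sketch below filters newline-delimited JSON records down to a time window and an optional bounding box. The field names and file format are assumptions for the example; real Twitter exports use different schemas.

```python
# Hedged sketch: slice a bulk data download (assumed newline-delimited JSON with
# ISO 'created_at' timestamps and optional 'lat'/'lon' fields) to a study window.
import json
from datetime import datetime

def filter_bulk(path, start, end, bbox=None):
    """Yield records between start/end datetimes, optionally inside bbox
    given as (min_lat, min_lon, max_lat, max_lon)."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            rec = json.loads(line)
            ts = datetime.fromisoformat(rec["created_at"])
            if not (start <= ts <= end):
                continue
            if bbox and not (bbox[0] <= rec.get("lat", -999) <= bbox[2]
                             and bbox[1] <= rec.get("lon", -999) <= bbox[3]):
                continue
            yield rec

# e.g. the weeks before and after a (hypothetical) election day:
window = (datetime(2014, 11, 1), datetime(2014, 11, 15))
# for rec in filter_bulk("tweets.jsonl", *window, bbox=(40.5, -74.3, 41.0, -73.6)):
#     ...
```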
Humanitarian response, urban planning, and academia are just three sectors within which private data can be donated to improve the public condition. There are many more possible applications, but few examples to date. For companies looking to expand their corporate social responsibility initiatives, sharing data should be part of the conversation…
Companies considering data philanthropy can take the following steps:
- Inventory the information your company produces, collects, and analyzes. Consider which data would be easy to share and which data will require long-term effort.
- Think about who could benefit from this information. Who in your community doesn’t have access to this information?
- Who could be harmed by the release of this data? If the datasets are about people, have they consented to its release? (i.e. don’t pull a Facebook emotional manipulation experiment).
- Begin conversations with relevant public agencies and nonprofit partners to get a sense of the sort of information they might find valuable and their capacity to work with the formats you might eventually make available.
- If you expect an onslaught of interest, an application process can help qualify partnership opportunities to maximize positive impact relative to time invested in the program.
- Consider how you’ll handle distribution of the data to partners. Even if you don’t have the resources to set up an API, regular releases of bulk data could still provide enormous value to organizations used to relying on less-frequently updated government indices.
- Consider your needs regarding privacy and anonymization. Strip the data of anything remotely resembling personally identifiable information (here are some guidelines); a minimal sketch of this step appears after this list.
- If you’re making data available to researchers, plan to allow researchers to publish their results without obstruction. You might also require them to share the findings with the world under Open Access terms….”
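As a minimal sketch of the anonymization step mentioned above (not a complete de-identification procedure), the following drops direct identifier columns and pseudonymizes quasi-identifiers with a salted hash before producing a shareable copy. The column names and salt handling are assumptions; a real release would also need a fuller re-identification risk review.

```python
# Minimal sketch: drop obvious identifier columns and replace quasi-identifiers
# with salted hashes before releasing a dataset. Assumed column names and salt.
import csv
import hashlib

DROP_COLS = {"name", "email", "phone", "address"}   # direct identifiers to remove
HASH_COLS = {"user_id", "device_id"}                # quasi-identifiers to pseudonymize
SALT = "rotate-this-per-release"                    # assumed; manage securely in practice

def pseudonymize(value):
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

def release_copy(src, dst):
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst, "w", newline="", encoding="utf-8") as fout:
        reader = csv.DictReader(fin)
        keep = [c for c in reader.fieldnames if c not in DROP_COLS]
        writer = csv.DictWriter(fout, fieldnames=keep)
        writer.writeheader()
        for row in reader:
            out = {c: row[c] for c in keep}
            for c in HASH_COLS & set(keep):
                out[c] = pseudonymize(out[c])
            writer.writerow(out)

# release_copy("raw_usage_data.csv", "shareable_usage_data.csv")
```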
Chief Executive of Nesta on the Future of Government Innovation
Interview between Rahim Kanani and Geoff Mulgan, CEO of Nesta and member of the MacArthur Research Network on Opening Governance: “Our aspiration is to become a global center of expertise on all kinds of innovation, from how to back creative business start-ups and how to shape innovation tools such as challenge prizes, to helping governments act as catalysts for new solutions,” explained Geoff Mulgan, chief executive of Nesta, the UK’s innovation foundation. In an interview with Mulgan, we discussed their new report, published in partnership with Bloomberg Philanthropies, which highlights 20 of the world’s top innovation teams in government. Mulgan and I also discussed the founding and evolution of Nesta over the past few years, and leadership lessons from his time inside and outside government.
Rahim Kanani: When we talk about ‘innovations in government’, isn’t that an oxymoron?
Geoff Mulgan: Governments have always innovated. The Internet and World Wide Web both originated in public organizations, and governments are constantly developing new ideas, from public health systems to carbon trading schemes, online tax filing to high-speed rail networks. But they’re much less systematic at innovation than the best in business and science. There are very few job roles, especially at senior levels, few budgets, and few teams or units. So although there are plenty of creative individuals in the public sector, they succeed despite, not because of, the systems around them. Risk-taking is punished, not rewarded. Over the last century, by contrast, the best businesses have learned how to run R&D departments, product development teams, open innovation processes and reasonably sophisticated ways of tracking investments and returns.
Kanani: This new report, published in partnership with Bloomberg Philanthropies, highlights 20 of the world’s most effective innovation teams in government working to address a range of issues, from reducing murder rates to promoting economic growth. Before I get to the results, how did this project come about, and why is it so important?
Mulgan: If you fail to generate new ideas, test them and scale the ones that work, it’s inevitable that productivity will stagnate and governments will fail to keep up with public expectations, particularly when waves of new technology—from smart phones and the cloud to big data—are opening up dramatic new possibilities. Mayor Bloomberg has been a leading advocate for innovation in the public sector, and in New York he showed the virtues of energetic experiment, combined with rigorous measurement of results. In the UK, organizations like Nesta have approached innovation in a very similar way, so it seemed timely to collaborate on a study of the state of the field, particularly since we were regularly being approached by governments wanting to set up new teams and asking for guidance.
Kanani: Where are some of the most effective innovation teams working on these issues, and how did you find them?
Mulgan: In our own work at Nesta, we’ve regularly sought out the best innovation teams that we could learn from, and this study made it possible to do that more systematically, focusing in particular on teams within national and city governments. They vary greatly, but all the best ones are achieving impact with relatively slim resources. Some are based in central governments, like Mindlab in Denmark, which has pioneered the use of design methods to reshape government services, from small business licensing to welfare. SITRA in Finland has been going for decades as a public technology agency, and more recently has switched its attention to innovation in public services, for example providing mobile tools to help patients manage their own healthcare. In the city of Seoul, the Mayor set up an innovation team to accelerate the adoption of ‘sharing’ tools, so that people could share things like cars, freeing up money for other uses. In South Australia the government set up an innovation agency that has been pioneering radical ways of helping troubled families, mobilizing families to help other families.
Kanani: What surprised you the most about the outcomes of this research?
Mulgan: Perhaps the biggest surprise has been the speed with which this idea is spreading. Since we started the research, we’ve come across new teams being created in dozens of countries, from Canada and New Zealand to Cambodia and Chile. China has set up a mobile technology lab for city governments. Mexico City and many others have set up labs focused on creative uses of open data. A batch of cities across the US supported by Bloomberg Philanthropies—from Memphis and New Orleans to Boston and Philadelphia—are now showing impressive results and persuading others to copy them.
Are the Authoritarians Winning?
Review of several books by Michael Ignatieff in the New York Review of Books: “In the 1930s travelers returned from Mussolini’s Italy, Stalin’s Russia, and Hitler’s Germany praising the hearty sense of common purpose they saw there, compared to which their own democracies seemed weak, inefficient, and pusillanimous.
Democracies today are in the middle of a similar period of envy and despondency. Authoritarian competitors are aglow with arrogant confidence. In the 1930s, Westerners went to Russia to admire Stalin’s Moscow subway stations; today they go to China to take the bullet train from Beijing to Shanghai, and just as in the 1930s, they return wondering why autocracies can build high-speed railroad lines seemingly overnight, while democracies can take forty years to decide they cannot even begin. The Francis Fukuyama moment—when in 1989 Westerners were told that liberal democracy was the final form toward which all political striving was directed—now looks like a quaint artifact of a vanished unipolar moment.
For the first time since the end of the cold war, the advance of democratic constitutionalism has stopped. The army has staged a coup in Thailand and it’s unclear whether the generals will allow democracy to take root in Burma. For every African state, like Ghana, where democratic institutions seem secure, there is a Mali, a Côte d’Ivoire, and a Zimbabwe, where democracy is in trouble.
In Latin America, democracy has sunk solid roots in Chile, but in Mexico and Colombia it is threatened by violence, while in Argentina it struggles to shake off the dead weight of Peronism. In Brazil, the millions who took to the streets last June to protest corruption seem to have had no impact on the cronyism in Brasília. In the Middle East, democracy has a foothold in Tunisia, but in Syria there is chaos; in Egypt, plebiscitary authoritarianism rules; and in the monarchies, absolutism is ascendant.
In Europe, the policy elites keep insisting that the remedy for their continent’s woes is “more Europe” while a third of their electorate is saying they want less of it. From Hungary to Holland, including in France and the UK, the anti-European right gains ground by opposing the European Union generally and immigration in particular. In Russia the democratic moment of the 1990s now seems as distant as the brief constitutional interlude between 1905 and 1914 under the tsar….
It is not at all apparent that “governance innovation,” a bauble Micklethwait and Wooldridge chase across three continents, watching innovators at work making government more efficient in Chicago, Sacramento, Singapore, and Stockholm, will do the trick. The problem of the liberal state is not that it lacks modern management technique, good software, or different schemes to improve the “interface” between the bureaucrats and the public. By focusing on government innovation, Micklethwait and Wooldridge assume that the problem is improving the efficiency of government. But what is required is both more radical and more traditional: a return to constitutional democracy itself, to courts and regulatory bodies that are freed from the power of money and the influence of the powerful; to legislatures that cease to be circuses and return to holding the executive branch to public account while cooperating on measures for which there is a broad consensus; to elected chief executives who understand that they are not entertainers but leaders….”
Books reviewed: