
Stefaan Verhulst

MIT Technology Review: “Mobile phones have generated enormous insight into the human condition thanks largely to the study of the data they produce. Mobile phone companies record the time of each call, the caller and receiver ids, as well as the locations of the cell towers involved, among other things.
The combined data from millions of people produces some fascinating new insights into the nature of our society. Anthropologists have crunched it to reveal human reproductive strategies, a universal law of commuting, and even the distribution of wealth in Africa.
Today, computer scientists have gone one step further by using mobile phone data to map the structure of cities and how people use them throughout the day. “These results point towards the possibility of a new, quantitative classification of cities using high resolution spatio-temporal data,” say Thomas Louail at the Institut de Physique Théorique in Paris and a few pals.
They say their work is part of a new science of cities that aims to objectively measure and understand the nature of large population centers.
These guys begin with a database of mobile phone calls made by people in the 31 Spanish cities that have populations larger than 200,000. The data consists of the number of unique individuals using a given cell tower (whether making a call or not) for each hour of the day over almost two months…. The results reveal some fascinating patterns in city structure. For a start, every city undergoes a kind of respiration in which people converge into the center and then withdraw on a daily basis, almost like breathing. And this happens in all cities. This “suggests the existence of a single ‘urban rhythm’ common to all cities,” say Louail and co.
During the week, the number of phone users peaks at about midday and then again at about 6 p.m. During the weekend the numbers peak a little later: at 1 p.m. and 8 p.m. Interestingly, the second peak starts about an hour later in western cities, such as Sevilla and Cordoba.
The data also reveals that small cities, such as Salamanca and Vitoria, tend to have a single center that becomes busy during the day.
But it also shows that the number of hotspots increases with city size; so-called polycentric cities include Spain’s largest, such as Madrid, Barcelona, and Bilbao.
That could turn out to be useful for automatically classifying cities.
There is a growing interest in the nature of cities, the way they evolve, and how their residents use them. The goal of this new science is to make better use of these spaces, which more than 50 percent of the planet’s population inhabits. Louail and co show that mobile phone data clearly has an important role to play in this endeavor to better understand these complex giants.
Ref: arxiv.org/abs/1401.4540: From Mobile Phone Data To The Spatial Structure Of Cities”
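The basic quantity behind these findings is simple: the number of distinct users seen at each tower, for each hour of the day. Here is a minimal sketch of that aggregation in Python with pandas, assuming hypothetical column names and toy records rather than the paper’s actual schema or pipeline:

```python
import pandas as pd

# Hypothetical call-detail records, one row per user observed at a tower.
# Column names and values are illustrative, not the paper's actual schema.
records = pd.DataFrame({
    "user_id":  [1, 2, 1, 3, 2, 3],
    "tower_id": ["A", "A", "B", "A", "B", "B"],
    "timestamp": pd.to_datetime([
        "2014-01-06 12:05", "2014-01-06 12:40", "2014-01-06 18:10",
        "2014-01-07 13:00", "2014-01-07 18:30", "2014-01-07 20:15",
    ]),
})
records["hour"] = records["timestamp"].dt.hour

# Unique individuals per tower per hour of day: the study's basic quantity,
# aggregated over roughly two months of data.
density = (
    records.groupby(["tower_id", "hour"])["user_id"]
    .nunique()
    .rename("unique_users")
    .reset_index()
)

# The citywide "rhythm": total unique users by hour of day.
rhythm = records.groupby("hour")["user_id"].nunique()
print(density)
print(rhythm.idxmax())  # busiest hour in this toy sample
```

On real weekday data, the `rhythm` curve would show the midday and 6 p.m. peaks the article describes, and comparing per-tower `density` across hours is what reveals mono- versus polycentric structure.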

How a New Science of Cities Is Emerging from Mobile Phone Data Analysis
“The Open Government Partnership has many attributes that make it stand out from other multilateral initiatives. The central role for civil society, the focus on supporting domestic reformers, and the diverse mix of countries in leadership roles are all cited as organisational strengths. In February it will be the turn of OGP’s unique accountability mechanism, which is set up to be entirely independent and makes all of its findings public, to take centre stage. The Independent Reporting Mechanism will be publishing 35 progress reports over the next month. These are check-ins on how the large group of countries that formally joined OGP at the Brasilia Summit in April 2012 are doing against their open government reform commitments. The reports examine individual commitments from the National Action Plans, as well as the quality of the consultation process and dialogue between civil society and the government. The executive summaries will highlight the star commitments that saw tremendous progress and were the most ambitious in terms of potential impact. These reports come at an important time for OGP. All the countries receiving reports are embarking on their second National Action Plan, due for publication on June 15th 2014. (Over 2/3 of OGP participating countries are currently developing new action plans.) The recommendations made by the IRM are designed to feed into the process of creating the new plans, making specific suggestions to improve the ambition and quality of new commitments and civil society engagement. However, these recommendations will only be acted upon if they are widely publicized at the national level and used by both civil society and government officials. If the reports remain unread, the likelihood of meaningful reforms through OGP will decrease…”
OGP’s Independent Reporting Mechanism to Publish 35 Reports

Paper by Reades J. and Smith D. A. in Regional Studies on the Geography of Global Business Telecommunications and Employment Specialization in the London Mega-City-Region: “Telecommunications has radically reshaped the way that firms organize industrial activity. And yet, because much of this technology – and the interactions that it enables – is invisible, the corporate ‘space of flows’ remains poorly mapped. This article combines detailed employment and telecoms usage data for the South-east of England to build a sector-by-sector profile of globalization at the mega-city-region scale. The intersection of these two datasets allows a new empirical perspective on industrial geography and regional structure to be developed.”
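As a purely hypothetical illustration of what intersecting employment and telecoms datasets sector by sector might look like, here is a toy sketch in Python with pandas; the sectors, figures, and the minutes-per-job indicator are invented for the example and are not the paper’s methodology:

```python
import pandas as pd

# Invented sector-level tables; the paper's real inputs are far richer
# (detailed employment counts and telecoms traffic for the region).
employment = pd.DataFrame({
    "sector": ["finance", "media", "logistics"],
    "jobs":   [120_000, 45_000, 30_000],
})
telecoms = pd.DataFrame({
    "sector": ["finance", "media", "logistics"],
    "intl_call_minutes": [9.1e6, 2.3e6, 0.8e6],
})

# Intersect the two datasets sector by sector and derive a crude
# globalization indicator: international call minutes per job.
profile = employment.merge(telecoms, on="sector")
profile["minutes_per_job"] = profile["intl_call_minutes"] / profile["jobs"]
print(profile.sort_values("minutes_per_job", ascending=False))
```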

Mapping the ‘Space of Flows’

New report by the Information Technology and Innovation Foundation (ITIF): “Businesses, government agencies, and non-profits in countries around the world are transforming virtually every facet of the economy and society through innovative uses of data. These changes, brought about by new technologies and techniques for collecting, storing, analyzing, disseminating, and visualizing data, are improving the quality of life for billions of individuals around the world, opening up new economic opportunities, and creating more efficient and effective governments. This list provides a sampling, in no particular order, of some of the most interesting and important contributions data-driven innovations have made in the past year. (Download)”

100 Data Innovations

Wired (UK): “We’re not yet sure if diplomacy is going digital or just the conversations we’re having,” Moira Whelan, Deputy Assistant Secretary for Digital Strategy, US Department of State, admitted on stage at TedxStockholm. “Sometimes you just have to dive in, and we’re going to, but we’re not really sure where we’re going.”
The US has been at the forefront of digital diplomacy for many years now. President Obama was the first leader to sign up to Twitter, and has amassed the greatest number of followers among his peers at nearly 41 million. The account is, however, mainly run by his staff. It’s understandable, but demonstrates that there still remains a diplomatic disconnect in a country Whelan says knows it’s “ready, leading the conversation and on cutting edge”.
In Europe, Swedish Minister for Foreign Affairs Carl Bildt, on the other hand, carries out regular Q&As on the social network and is regarded as one of the most conversational leaders on Twitter and the best connected, according to the annual survey Twiplomacy. Our own William Hague is chasing Bildt with close to 200,000 followers and is the world’s second most connected Foreign Minister, while David Cameron is active on a daily basis with more than 570,000 followers. London was in fact the first place to host a “Diplohack”, an event where ambassadors are brought together with developers and others to hack traditional diplomacy, and Whelan travelled to Sweden to take part in the third European event, the Stockholm Initiative for Digital Diplomacy held 16-17 January in conjunction with TedxStockholm.
Nevertheless, Whelan, who has worked for the state for a decade, says the US is in the game and ready to try new things. Case in point being its digital diplomacy reaction to the crisis in Syria last year.
“In August 2013 we witnessed tragic events in Syria, and obviously the President of the United States and his security team jumped into action,” said Whelan. “We needed to bear witness and… very clearly saw the need for one thing — a Google+ Hangout.” With her tongue-in-cheek comment, Whelan was pointing out social media’s incredibly relevant role in communicating to the public what’s going on when crises hit, and in answering concerns and questions through it.
“We saw speeches and very disturbing images coming at us,” continued Whelan. “We heard leaders making impassioned speeches, and we ourselves had conversations about what we were seeing and how we needed to engage and inform; to give people the chance to engage and ask questions of us.
“We thought, clearly let’s have a Google+ Hangout. Three people joined us and Secretary John Kerry — Nicholas Kristof of the New York Times; Lara Setrakian, executive editor of Syria Deeply; and Andrew Beiter, a teacher affiliated with the Holocaust Memorial Museum who specialises in how we talk about these topics with our children.”
In the run-up to the Hangout, news of the event trickled out and soon Google was calling, asking if it could advertise the session at the bottom of other Hangouts, then on YouTube ads. “Suddenly 15,000 people were watching the Secretary live — that’s by far the largest number we’d seen. We felt we’d tapped into something; we knew we’d hit success at what was a challenging time. We were engaging the public and could join with them to communicate a set of questions. People want to ask questions and get very direct answers, and we know it’s a success. We’ve talked to Google about how we can replicate that. We want to transform what we’re doing to make that the norm.”
Secretary of State John Kerry is, Whelan told Wired.co.uk later, “game for anything” when it comes to social media — and having the department leader enthused at the prospect of taking digital diplomacy forward is obviously key to its success.
“He wanted us to get on Instagram and the unselfie meme during the Philippines crisis was his idea — an assistant had seen it and he held a paper in front of him with the URL to donate funds to Typhoon Haiyan victims,” Whelan told Wired.co.uk at the Stockholm diplohack.  “President Obama came in with a mandate that social media would be present and pronounced in all our departments.”
“[As] government changes and is more influenced away from old paper models and newspapers, suspenders and bow ties, and more into young innovators wanting to come in and change things,” Whelan continued, “I think it will change the way we work and help us get smarter.”

Google Hangouts vs Twitter Q&As: how the US and Europe are hacking traditional diplomacy

Tom Slee: “A new wave of technology companies claims to be expanding the possibilities of sharing and collaboration, and is clashing with established industries such as hospitality and transit. These companies make up what is being called the “sharing economy”: they provide web sites and applications through which individual residents or drivers can offer to “share” their apartment or car with a guest, for a price.
The industries they threaten have long been subject to city-level consumer protection and zoning regulations, but sharing economy advocates claim that these rules are rendered obsolete by the Internet. Battle lines are being drawn between the new companies and city governments. Where’s a good leftist to stand in all of this?
To figure this out, we need to look at the nature of the sharing economy. Some would say it fits squarely into an ideology of unregulated free markets, as described recently by David Golumbia here in Jacobin. Others note that the people involved in American technology industries lean liberal. There’s also a clear Euro/American split in the sharing economy: while the Americans are entrepreneurial and commercial in the way they drive the initiative, the Europeans focus more on the civic, the collaborative, and the non-commercial.
The sharing economy invokes values familiar to many on the Left: decentralization, sustainability, community-level connectedness, and opposition to hierarchical and rigid regulatory regimes, seen most clearly in the movement’s bible What’s Mine is Yours: The Rise of Collaborative Consumption by Rachel Botsman and Roo Rogers. It’s the language of co-operatives and of civic groups.
There’s a definite green slant to the movement, too: ideas of “sharing rather than owning” make appeals to sustainability, and the language of sharing also appeals to anti-consumerist sentiments popular on the Left: property and consumption do not make us happy, and we should put aside the pursuit of possessions in favour of connections and experiences. All of which leads us to ideas of community: the sharing economy invokes images of neighbourhoods, villages, and “human-scale” interactions. Instead of buying from a mega-store, we get to share with neighbours.
These ideals have been around for centuries, but the Internet has given them a new slant. An influential line of thought emphasizes that the web lowers the “transaction costs” of group formation and collaboration. The key text is Yochai Benkler’s 2006 book The Wealth of Networks, which argues that the Internet brings with it an alternative style of economic production: networked rather than managed, self-organized rather than ordered. It’s a language associated strongly with both the Left (who see it as an alternative to monopoly capital) and the free-market libertarian right (who see it as an alternative to the state).
Clay Shirky’s 2008 book Here Comes Everybody popularized the ideas further, and in 2012 Steven Johnson announced the appearance of the “Peer Progressive” in his book Future Perfect. The idea of internet-enabled collaboration in the “real” world is a next step from online collaboration in the form of open source software, open government data, and Wikipedia, and the sharing economy is its manifestation.
As with all things technological, there’s an additional angle: the involvement of capital…”

Sharing and Caring

“Should all politicians have to launch a startup before entering politics? That’s the question I asked California’s Lieutenant Governor, Gavin Newsom, at the latest Ericsson and AT&T hosted FutureCast event held at the AT&T Foundry in Palo Alto. Newsom, the author of “Citizenville,” a kind of digital manifesto for 21st century networked politics, didn’t beat around the bush.
“Yes,” Newsom replied, sounding more like a startup guy than a career politician. But then that’s what Newsom is. A serial entrepreneur who treats politics like a Silicon Valley startup, Newsom is about as unlike a traditional politician as anyone in California, particularly since he answers questions honestly. “Are you saying that government doesn’t work?” I asked the second most powerful state politician in California. “I’m saying technology and government doesn’t work–period, exclamation,” Newsom shot back.”

Video: Should Politicians Be More Like Silicon Valley Entrepreneurs?

New Paper by Junqué de Fortuny, Enric, Martens, David, and Provost, Foster in Big Data: “With the increasingly widespread collection and processing of “big data,” there is natural interest in using these data assets to improve decision making. One of the best understood ways to use data to improve decision making is via predictive analytics. An important, open question is: to what extent do larger data actually lead to better predictive models? In this article we empirically demonstrate that when predictive models are built from sparse, fine-grained data—such as data on low-level human behavior—we continue to see marginal increases in predictive performance even to very large scale. The empirical results are based on data drawn from nine different predictive modeling applications, from book reviews to banking transactions. This study provides a clear illustration that larger data indeed can be more valuable assets for predictive analytics. This implies that institutions with larger data assets—plus the skill to take advantage of them—potentially can obtain substantial competitive advantage over institutions without such access or skill. Moreover, the results suggest that it is worthwhile for companies with access to such fine-grained data, in the context of a key predictive task, to gather both more data instances and more possible data features. As an additional contribution, we introduce an implementation of the multivariate Bernoulli Naïve Bayes algorithm that can scale to massive, sparse data.”
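The scalable Naïve Bayes implementation is the paper’s own contribution; as a rough stand-in to illustrate the technique (not the authors’ code), here is a minimal sketch using scikit-learn’s off-the-shelf BernoulliNB on a sparse binary feature matrix, with randomly generated placeholder data:

```python
import numpy as np
from scipy.sparse import csr_matrix
from sklearn.naive_bayes import BernoulliNB

# Randomly generated placeholder data: sparse binary features such as
# "which low-level behaviors each individual exhibited" (about 1% nonzero).
rng = np.random.default_rng(0)
n_samples, n_features = 1_000, 5_000
X = csr_matrix(rng.random((n_samples, n_features)) < 0.01, dtype=np.int8)
y = rng.integers(0, 2, size=n_samples)  # binary target, also random here

# Multivariate Bernoulli Naive Bayes treats each feature as an independent
# per-class coin flip; with CSR input, cost scales with nonzero entries.
clf = BernoulliNB()
clf.fit(X, y)
print(clf.predict_proba(X[:5]))  # class probabilities for 5 instances
```

Because the matrix is stored in compressed sparse row form, memory and training cost grow with the number of nonzero entries rather than with n_samples × n_features, which is what makes this family of models practical on massive, sparse behavioral data.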

Predictive Modeling With Big Data: Is Bigger Really Better?

John Podesta at the White House blog: “Last Friday, the President spoke to the American people, and the international community, about how to keep us safe from terrorism in a changing world while upholding America’s commitment to liberty and privacy that our values and Constitution require. Our national security challenges are real, but that is surely not the only space where changes in technology are altering the landscape and challenging conceptions of privacy.
That’s why in his speech, the President asked me to lead a comprehensive review of the way that “big data” will affect the way we live and work; the relationship between government and citizens; and how public and private sectors can spur innovation and maximize the opportunities and free flow of this information while minimizing the risks to privacy. I will be joined in this effort by Secretary of Commerce Penny Pritzker, Secretary of Energy Ernie Moniz, the President’s Science Advisor John Holdren, the President’s Economic Advisor Gene Sperling and other senior government officials.
I would like to explain a little bit more about the review, its scope, and what you can expect over the next 90 days.
We are undergoing a revolution in the way that information about our purchases, our conversations, our social networks, our movements, and even our physical identities is collected, stored, analyzed and used. The immense volume, diversity and potential value of data will have profound implications for privacy, the economy, and public policy. The working group will consider all those issues, and specifically how the present and future state of these technologies might motivate changes in our policies across a range of sectors.
When we complete our work, we expect to deliver to the President a report that anticipates future technological trends and frames the key questions that the collection, availability, and use of “big data” raise – both for our government, and the nation as a whole. It will help identify technological changes to watch, assess whether those changes are addressed by the current U.S. policy framework, and highlight where further government action, funding, research and consideration may be required.
This is going to be a collaborative effort. The President’s Council of Advisors on Science and Technology (PCAST) will conduct a study to explore in-depth the technological dimensions of the intersection of big data and privacy, which will feed into this broader effort. Our working group will consult with industry, civil liberties groups, technologists, privacy experts, international partners, and other national and local government officials on the significance of and future for these technologies. Finally, we will be working with a number of think tanks, academic institutions, and other organizations around the country as they convene stakeholders to discuss these very issues and questions. Likewise, many abroad are analyzing and responding to the challenge and seizing the opportunity of big data. These discussions will help to inform our study.
While we don’t expect to answer all these questions, or produce a comprehensive new policy in 90 days, we expect this work to serve as the foundation for a robust and forward-looking plan of action. Check back on this blog for updates on how you can get involved in the debate and for status updates on our progress.”

Big Data and the Future of Privacy

Joel Gurin in Information Week: “At the GovLab at New York University, where I am senior adviser, we’re taking a different approach than McKinsey’s to understand the evolving value of government open data: We’re studying open data companies from the ground up. I’m now leading the GovLab’s Open Data 500 project, funded by the John S. and James L. Knight Foundation, to identify and examine 500 American companies that use government open data as a key business resource.
Our preliminary results show that government open data is fueling companies both large and small, across the country, and in many sectors of the economy, including health, finance, education, energy, and more. But it’s not always easy to use this resource. Companies that use government open data tell us it is often incomplete, inaccurate, or trapped in hard-to-use systems and formats.
It will take a thorough and extended effort to make government data truly useful. Based on what we are hearing and the research I did for my book, here are some of the most important steps the federal government can take, starting now, to make it easier for companies to add economic value to the government’s data.
1. Improve data quality
The Open Data Policy not only directs federal agencies to release more open data; it also requires them to release information about data quality. Agencies will have to begin improving the quality of their data simply to avoid public embarrassment. We can hope and expect that they will do some data cleanup themselves, demand better data from the businesses they regulate, or use creative solutions like turning to crowdsourcing for help, as USAID did to improve geospatial data on its grantees.
2. Keep improving open data resources
The government has steadily made Data.gov, the central repository of federal open data, more accessible and useful, including a significant relaunch last week. To the agency’s credit, the GSA, which administers Data.gov, plans to keep working to make this key website still better. As part of implementing the Open Data Policy, the administration has also set up Project Open Data on GitHub, the world’s largest community for open-source software. These resources will be helpful for anyone working with open data either inside or outside of government. They need to be maintained and continually improved.
3. Pass DATA
The Digital Accountability and Transparency Act would bring transparency to federal government spending at an unprecedented level of detail. The Act has strong bipartisan support. It passed the House with only one dissenting vote and was unanimously approved by a Senate committee, but still needs full Senate approval and the President’s signature to become law. DATA is also supported by technology companies who see it as a source of new open data they can use in their businesses. Congress should move forward and pass DATA as the logical next step in the work that the Obama administration’s Open Data Policy has begun.
4. Reform the Freedom of Information Act
Since it was passed in 1966, the federal Freedom of Information Act has gone through two major revisions, both of which strengthened citizens’ ability to access many kinds of government data. It’s time for another step forward. Current legislative proposals would establish a centralized web portal for all federal FOIA requests, strengthen the FOIA ombudsman’s office, and require agencies to post more high-interest information online before they receive formal requests for it. These changes could make more information from FOIA requests available as open data.
5. Engage stakeholders in a genuine way
Up to now, the government’s release of open data has largely been a one-way affair: Agencies publish datasets that they hope will be useful without consulting the organizations and companies that want to use them. Other countries, including the UK, France, and Mexico, are building in feedback loops from data users to government data providers, and the US should, too. The Open Data Policy calls for agencies to establish points of contact for public feedback. At the GovLab, we hope that the Open Data 500 will help move that process forward. Our research will provide a basis for new, productive dialogue between government agencies and the businesses that rely on them.
6. Keep using federal challenges to encourage innovation
The federal Challenge.gov website applies the best principles of crowdsourcing and collective intelligence. Agencies should use this approach extensively, and should pose challenges using the government’s open data resources to solve business, social, or scientific problems. Other approaches to citizen engagement, including federally sponsored hackathons and the White House Champions of Change program, can play a similar role.
Through the Open Data Policy and other initiatives, the Obama administration has set the right goals. Now it’s time to implement and move toward what US CTO Todd Park calls “data liberation.” Thousands of companies, organizations, and individuals will benefit.”

How Government Can Make Open Data Work
