Understanding the Smart City Domain: A Literature Review


Paper by Leonidas G. Anthopoulos: “Smart Cities appeared in the literature in the late ’90s and various approaches have been developed since. To date, “smart city” does not describe a city with particular attributes; rather, it is used to describe different cases in urban spaces: web portals that virtualize cities or city guides; knowledge bases that address local needs; agglomerations with Information and Communication Technology (ICT) infrastructure that attract business relocation; metropolitan-wide ICT infrastructures that deliver e-services to citizens; ubiquitous environments; and, recently, ICT infrastructure for ecological use. Researchers, practitioners, businesspeople and policy makers consider the smart city from different perspectives, and most of them agree on a model that measures urban economy, mobility, environment, living, people and governance. On the other hand, the ICT and construction industries are pressing to capitalize on the smart city, and a new market seems to be emerging in this domain. This chapter performs a literature review; discovers and classifies the particular schools of thought, universities, research centres and companies that deal with the smart city domain; and identifies alternative approaches, models, architectures and frameworks in this regard….(More)

How does collaborative governance scale?


Paper by Chris Ansell and Jacob Torfing in Policy & Politics: “Scale is an overlooked issue in the literature on interactive governance. This special issue investigates the challenges posed by the scale and scaling of network and collaborative forms of governance. Our original motivation arose from a concern about whether collaborative governance can scale up. As we learned more, our inquiry expanded to include the tensions inherent in collaboration across scales or at multiple scales and the issue of dynamically scaling collaboration to adapt to changing problems and demands. The diverse cases in this special issue explore these challenges in a range of concrete empirical domains that span the globe…(More)”

The Data Revolution


Review of Rob Kitchin’s The Data Revolution: Big Data, Open Data, Data Infrastructures & their Consequences by David Moats in Theory, Culture and Society: “…As an industry, academia is not immune to cycles of hype and fashion. Terms like ‘postmodernism’, ‘globalisation’, and ‘new media’ have each had their turn filling the top line of funding proposals. Although they are each grounded in tangible shifts, these terms become stretched and fudged to the point of becoming almost meaningless. Yet, they elicit strong, polarised reactions. For at least the past few years, ‘big data’ seems to be the buzzword, which elicits funding, as well as the ire of many in the social sciences and humanities.

Rob Kitchin’s book The Data Revolution is one of the first systematic attempts to strip back the hype surrounding our current data deluge and take stock of what is really going on. This is crucial because this hype is underpinned by very real societal change, threats to personal privacy and shifts in store for research methods. The book acts as a helpful wayfinding device in an unfamiliar terrain, which is still being reshaped, and is admirably written in a language relevant to social scientists, comprehensible to policy makers and accessible even to the less tech savvy among us.

The Data Revolution seems to present itself as the definitive account of this phenomenon but in filling this role ends up adopting a somewhat diplomatic posture. Kitchin takes all the correct and reasonable stances on the matter and advocates all the right courses of action but he is not able to, in the context of this book, pursue these propositions fully. This review will attempt to tease out some of these latent potentials and how they might be pushed in future work, in particular the implications of the ‘performative’ character of both big data narratives and data infrastructures for social science research.

Kitchin’s book starts with the observation that ‘data’ is a misnomer – etymologically data should refer to phenomena in the world which can be abstracted, measured etc. as opposed to the representations and measurements themselves, which should by all rights be called ‘capta’. This is ironic because the worst offenders in what Kitchin calls “data boosterism” seem to conflate data with ‘reality’, unmooring data from its conditions of production and making the relationship between the two seem given or natural.

As Kitchin notes, following Bowker (2005), ‘raw data’ is an oxymoron: data are not so much mined as produced and are necessarily framed technically, ethically, temporally, spatially and philosophically. This is the central thesis of the book, that data and data infrastructures are not neutral and technical but also social and political phenomena. For those at the critical end of research with data, this is a starting assumption, but one which not enough practitioners heed. Most of the book is thus an attempt to flesh out these rapidly expanding data infrastructures and their politics….

Kitchin is at his best when revealing the gap between the narratives and the reality of data analysis such as the fallacy of empiricism – the assertion that, given the granularity and completeness of big data sets and the availability of machine learning algorithms which identify patterns within data (with or without the supervision of human coders), data can “speak for themselves”. Kitchin reminds us that no data set is complete and even these out-of-the-box algorithms are underpinned by theories and assumptions in their creation, and require context-specific knowledge to unpack their findings. Kitchin also rightly raises concerns about the limits of big data, that access to and interoperability of data are not given and that these gaps and silences are also patterned (Twitter is biased as a sample towards middle-class, white, tech-savvy people). Yet, this language of veracity and reliability seems to suggest that big data is being conceptualised in relation to traditional surveys, or that our population is still the nation state, when big data could helpfully force us to reimagine our analytic objects and truth conditions and, more pressingly, our ethics (Rieder, 2013).

However, performativity may again complicate things. As Kitchin observes, supermarket loyalty cards do not just create data about shopping, they encourage particular sorts of shopping; when research subjects change their behaviour to cater to the metrics and surveillance apparatuses built into platforms like Facebook (Bucher, 2012), then these are no longer just data points representing the social, but partially constitutive of new forms of sociality (this is also true of other types of data as discussed by Savage (2010), but in perhaps less obvious ways). This might have implications for how we interpret data, the distribution between quantitative and qualitative approaches (Latour et al., 2012) or even more radical experiments (Wilkie et al., 2014). Kitchin is relatively cautious about proposing these sorts of possibilities, which is not the remit of the book, though it clearly leaves the door open…(More)”

Blood donors in Sweden get a text message whenever their blood saves someone’s life


Jon Stone at the Independent: “With blood donation rates in decline all over the developed world, Sweden’s blood service is enlisting new technology to help push back against shortages.

One new initiative, where donors are sent automatic text messages telling them when their blood has actually been used, has caught the public eye.

People who donate initially receive a ‘thank you’ text when they give blood, but they get another message when their blood makes it into somebody else’s veins.

“We are constantly trying to develop ways to express [donors’] importance,” Karolina Blom Wiberg, a communications manager at the Stockholm blood service told The Independent.

“We want to give them feedback on their effort, and we find this is a good way to do that.”

The service says the messages give donors more positive feedback about how they’ve helped their fellow citizens – which encourages them to donate again.

But the new policy has also been a hit on social media and has got people talking about blood donation amongst their friends….(More)”

The science prize that’s making waves


Gillian Tett at the Financial Times: “The Ocean Health XPrize reveals a new fashion among philanthropists…There is another reason why the Ocean Health XPrize fascinates me: what it reveals about the new fashion among philanthropists for handing out big scientific prizes. The idea is not a new one: wealthy people and governments have been giving prizes for centuries. In 1714, for example, the British government passed the Longitude Act, establishing a board to offer reward money for innovation in navigation — the most money was won by John Harrison, a clockmaker who invented the marine chronometer.

But a fascinating shift has taken place in the prize-giving game. In previous decades, governments or philanthropists usually bestowed money to recognise past achievements, often in relation to the arts. In 2012, McKinsey, the management consultants, estimated that before 1991, 97 per cent of prize money was a “recognition” award — for example, the Nobel Prizes. Today, however, four-fifths of all prize money is “incentive” or “inducement” awards. This is because many philanthropists and government agencies have started staging competitions to spur innovation in different fields, particularly science.

The best known of these is the XPrize Foundation, initiated two decades ago by Peter Diamandis, the entrepreneur. The original award, the Ansari XPrize, offered $10m to the first privately financed team to put a vehicle into space. Since then, the XPrize has spread its wings into numerous different fields, including education and life sciences. Indeed, having given $30m in prize money so far, it has another $70m of competitions running, including the Google Lunar XPrize, which is offering $30m to land a privately funded robot on the moon.

McKinsey estimates that if you look across the field of prize-giving around the world, “total funds available from large prizes have more than tripled over the last decade to reach $350m”, while the “total prize sector could already be worth as much as $1bn to $2bn”. The Ocean Health XPrize, in other words, is barely a drop in the prize-giving ocean.

Is this a good thing? Not always, it might seem. As the prizes proliferate, they can sometimes overlap. The money being awarded tends — inevitably — to reflect the pet obsessions of philanthropists, rather than what scientists themselves would like to explore. And even the people running the prizes admit that these only work when there is a clear problem to be solved….(More)”

Who knew contracts could be so interesting?


 at Transparency International UK: “…Despite the UK Government’s lack of progress, it wouldn’t be completely unreasonable to ask “who actually publishes these things, anyway?” Well, back in 2011, when the UK Government committed to publish all new contracts and tenders over £10,000 in value, the Slovakian Government decided to publish more or less everything. Faced by mass protests over corruption in the public sector, their government committed to publishing almost all public sector contracts online (there are some exemptions). You can now browse through the details of a significant amount of government business via the country’s online portal (so long as you can read Slovak, of course).

Who actually reads these things?

According to research by Transparency International Slovakia, at least 11% of the Slovakian adult population have looked at a government contract since they were first published back in 2011. That’s around 480,000 people. Although some of these spent more time than others browsing through the documents in-depth, this is undeniably an astounding number of people taking at least a passing interest in government procurement.

Why does this matter?

Before Slovakia opened up its contracts there was widespread mistrust in public institutions and officials. According to Transparency International’s global Corruption Perceptions Index, which measures impressions of public sector corruption, Slovakia was ranked 66th out of 183 countries in 2011. By 2014 it had jumped 12 places – a record achievement – to 54th, which must in some part be due to the Government’s commitment to opening up public contracts to greater scrutiny.

Since the contracts were published, there also seems to have been a spike in media reports on government tenders. This suggests there is greater scrutiny of public spending, which should hopefully translate into less wasted expenditure.

Elsewhere, proponents of open contracting have espoused other benefits, such as greater commitment by both parties to following the agreement and protecting against malign private interests. Similar projects in Georgia have also turned clunky bureaucracies into efficient, data-savvy administrations. In short, there are quite a few reasons why more openness in public sector procurement is a good thing.

Despite these benefits, opponents cite a number of downsides, including the administrative costs of publishing contracts online and issues surrounding commercially sensitive information. However, TI Slovakia’s research suggests the former is minimal – and presumably preferable to rooting around through paper mountains every time a Freedom of Information (FOI) request is received about a contract – whilst the latter already has to be disclosed under the FOI Act except in particular circumstances…(More)”

Science to the people!


John Magan, at Digital Agenda for Europe: “…I attended the 2nd Barcelona Citizen Science Day organised as part of the city’s Science Festival. The programme was full and varied and in itself a great example of the wonderful world of do-it-yourself, hands-on, accessible, practical science. A huge variety of projects (see below) was delivered with enthusiasm, passion, and energy!

The day was rounded off with a presentation by Public Lab who showed how a bit of technical ingenuity like cheap cameras on kites and balloons can be used to keep governments and large businesses more honest and accountable – for example, data they collected is being used in court cases against BP for the Deepwater Horizon oil spill in the Gulf of Mexico.

But what was most striking is the empowerment that these Citizen Science projects give individuals to do things for themselves – to take measures to monitor, protect or improve their urban or rural environment; to indulge their curiosity or passions; to improve their finances; to work with others; to do good while having serious fun…. If you want to have a deeper look, here are some of the many projects presented on a great variety of themes:

Water

Wildlife

Climate

Arts

Public health

Human

A nice booklet capturing them is available and there’s also a summary in Catalan only.

Read more about citizen science in the European Commission….(More)”

Legisletters: A Hub for Congressional Correspondence


Daniel Schuman at the Congressional Data Coalition: “…The GovLab beta-launched a new tool, Legisletters, which automatically gathers congressional correspondence with agencies and publishes it in a searchable, user-friendly interface…. Members of Congress have a hard time tracking their correspondence with federal agencies, in part because of staff turnover and the absence of an inexpensive, easy-to-use tool. It is very hard for an office to be aware of the letters that other offices send, and frequent staff turnover means current staff often have no idea what was sent in the past.

Fortunately, since members of Congress often publish their correspondence on their websites–often in the less-than-helpful PDF format–it is possible to reconstruct some of the communications….Legisletters can help address several problems. It can serve as:

  • An archive of correspondence by individual members of Congress to agencies, which is very useful for current staff and historians alike.
  • A finding aid for other offices interested in partnering on issues, perhaps incorporated into a tool like the nascent “coalition builder.”
  • A data source for an alerting tool, like Scout, so journalists and advocates can keep an eye on what a particular office is doing.

In addition, the underlying technology can be repurposed to gather other documents published on the web, such as CRS reports….
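
The article doesn’t describe Legisletters’ internals, but the core task it names – harvesting PDF letters that members publish on their websites – can be sketched with the Python standard library alone. The page content and URLs below are invented for illustration; a real gatherer would fetch live press-release pages with `urllib` before parsing:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class PdfLinkFinder(HTMLParser):
    """Collects absolute URLs of PDF links found in an HTML page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.pdf_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.lower().endswith(".pdf"):
                # Resolve relative links against the page's own URL.
                self.pdf_links.append(urljoin(self.base_url, href))

# Hypothetical press-release page for a congressional office.
sample_page = """
<html><body>
  <a href="/press/letter-to-epa-2015.pdf">Letter to EPA</a>
  <a href="/press/statement.html">Statement</a>
  <a href="https://example.gov/docs/letter-to-fcc.pdf">Letter to FCC</a>
</body></html>
"""

finder = PdfLinkFinder("https://example.gov/press/")
finder.feed(sample_page)
for link in finder.pdf_links:
    print(link)
```

The same crawl-and-filter pattern is what would let the technology be repurposed for other web-published documents such as CRS reports.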

feedback here. Let them know what you think.”

How a Mexico City Traffic Experiment Connects to Community Trust


Zoe Mendelson in Next City: “Last November, Gómez-Mont, Jose Castillo, an urban planning professor at Harvard’s Graduate School of Design, and Carlos Gershenson, their data analyst, won the Audi Urban Future award for their plan to use big data to solve Mexico City’s traffic problem. The plan consists of three parts, the first a data-donating platform that collects information on origin and destination, transit times, and modes of transit. The app, Living Mobs, is now in use in beta form. The plan also establishes data-sharing partnerships with companies, educational institutions and government agencies. So far, they’ve already signed on Yaxi, Microsoft, Movistar and Uber among others, and collected 14,000 datasets.

The data will be a welcome new resource for the city. “We just don’t have enough,” explains Gómez-Mont. “We call it ‘big city, little data.’” The city’s last origin-destination survey, conducted in 2007, only caught data from 50,000 people, which at the time was somewhat of a feat. Now, just one of their current data-sharing partners, Yaxi, alone has 10,000 cars circulating. Still, they have one major obstacle to a comprehensive citywide survey that can only be partially addressed by their data-donating platform (which also, of course, depends on people having smartphones): 60 percent of transportation in Mexico City is on a hard-to-track informal bus system.

The data will eventually end up in an app that gives people real-time transit information. But an underlying idea — that traffic can be solved simply by asking people to take turns — is the project’s most radical and interesting component. Gómez-Mont paints a seductive alternative futuristic vision of incentivized negotiation of the city.

“Say I wake up and while getting ready for work I check and see that Periférico is packed, and I say, ‘OK, today I’m going to use my bike or take public transit,’ and maybe I earn some kind of City Points, which translates into a tax break. Or maybe I’m on Periférico and earn points for getting off to relieve congestion.” She even envisions a system through which people could submit their calendar data weeks in advance. With the increasing popularity of Google Calendar and other similar systems that sync with smartphones, advanced “data donation” doesn’t seem that far-fetched.
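
The City Points scheme is only a vision at this stage; as a way of seeing its shape, here is a toy model in which every point value, action name and tax-break threshold is invented for illustration, not taken from the project:

```python
# Toy model of the hypothetical "City Points" incentive described above.
# All point values and the threshold are assumptions for illustration.
POINTS = {
    "switched_to_transit": 10,   # checked congestion, took transit instead
    "switched_to_bike": 10,      # chose the bike over the car
    "left_congested_road": 5,    # exited a packed road to relieve congestion
    "shared_calendar": 2,        # donated travel plans in advance
}
TAX_BREAK_THRESHOLD = 100        # points per month, say

def monthly_points(actions):
    """Sum the points earned for a month's worth of logged actions."""
    return sum(POINTS[action] for action in actions)

def qualifies_for_tax_break(actions):
    """True once a month's actions reach the (assumed) threshold."""
    return monthly_points(actions) >= TAX_BREAK_THRESHOLD

# A month in which a commuter switched to transit 8 times and
# left a congested road 4 times: 8*10 + 4*5 = 100 points.
log = ["switched_to_transit"] * 8 + ["left_congested_road"] * 4
print(monthly_points(log))           # 100
print(qualifies_for_tax_break(log))  # True
```

The interesting design question such a scheme raises is less the arithmetic than the verification: the app would need trustworthy evidence (location traces, transit check-ins) that the avoided trips actually happened.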

Essentially, the app would create the opportunity for an entire city to behave as a group and solve its own problems together in real time.

Gómez-Mont insists that mobility is not just a problem for the government to solve. “It’s also very much about citizens and how we behave and what type of culture is embedded in the world outside of the government,” she notes….(More)”.

The Trust Imperative: A Framework for Ethical Data Use


New report by Susan Etlinger: “The way organizations use data is affecting consumer trust, and that trust affects not just a brand’s reputation, but its business performance as well. As a result, chief executives who wish to sustain the trust of their customers and constituents must take a hard look at how their organizations collect and use customer data, and the effect of those practices on customer relationships, reputation, risk and revenue.

This report by Altimeter Group analyst Susan Etlinger lays out key drivers and principles for ethical data use. It discusses emerging best practices, and—most importantly—a pragmatic framework that organizations can use to earn—and build—the trust of customers and consumers. This framework lists the questions that need to be asked at each stage of collecting and analyzing data, helping brands earn the trust of their customers, and safeguarding against both legal and ethical transgressions….(More)”