Enriching Urban Spaces with Ambient Computing, the Internet of Things, and Smart City Design


Book by Shin’ichi Konomi and George Roussos: “In recent years, ubiquitous computing has become increasingly integrated into the lives of people in modern society. As these technologies become more pervasive, new opportunities open up for making citizens’ environments more comfortable, convenient, and efficient.

Enriching Urban Spaces with Ambient Computing, the Internet of Things, and Smart City Design is a pivotal reference source for the latest scholarly material on the interaction between people and computing systems in contemporary society, showcasing how ubiquitous computing influences and shapes urban environments. Highlighting the impacts of these emerging technologies from an interdisciplinary perspective, this book is ideally designed for professionals, researchers, academicians, and practitioners interested in the influential state of pervasive computing within urban contexts….(Table of Contents and List of Contributors)”.

Making Open Data more evidence-based


Essay by Stefaan G. Verhulst and Danny Lämmerhirt: “…To realize its potential, there is a need for more evidence on the full life cycle of open data – within and across settings and sectors….

In particular, three substantive areas were identified that could benefit from interdisciplinary and comparative research:

Demand and use: First, many expressed a need to become smarter about the demand and use side of open data. Given the nascent nature of many initiatives around the world, much of the focus has been on the supply side of open data. Yet to be more responsive and sustainable, more insight needs to be gained into demand and user needs.

Conversations repeatedly emphasized that we should differentiate between open data demand and use. Open data demand and use can be analyzed from multiple directions: 1) top-down, starting from a data provider, to intermediaries, to the end users and/or audiences; or 2) bottom-up, studying the data demands articulated by individuals (for instance, through FOIA requests), and how these demands can be taken up by intermediaries and open data providers to change what is being provided as open data.

Research should scrutinize each stage (provision, intermediation, use and demand) on its own, but also examine the interactions between stages (for instance, how may open data demand inform data supply, and how does data supply influence intermediation and use?)….

Informing data supply and infrastructure: Second, we heard on numerous occasions a call for researchers and domain experts to help identify “key data” and inform the government data infrastructure needed to provide them. Principle 1 of the International Open Data Charter states that governments should provide key data “open by default”, yet the question remains how to identify “key” data (e.g., would that mean data relevant to society at large?).

Which governments (and other public institutions) should be expected to provide key data and which information do we need to better understand government’s role in providing key data? How can we evaluate progress around publishing these data coherently if countries organize the capture, collection, and publication of this data differently?…

Impact: In addition to those two focus areas, covering the supply and demand sides, there was also a call to become more sophisticated about impact. Too often impact gets confused with outputs, or even activities. Given the embryonic and iterative nature of many open data efforts, signals of impact are limited and often preliminary. In addition, different types of impact (such as enhancing transparency versus generating innovation and economic growth) require different indicators and methods. At the same time, to allow for regular evaluations of what works and why, there is a need for common assessment methods that can generate comparative and directional insights….

Research Networking: Several researchers identified a need for better exchange and collaboration within the research community. This would make it possible to tackle the research questions and challenges listed above, as well as to identify gaps in existing knowledge, develop common research methods and frameworks, and learn from one another. Key questions posed included: how can networking be nurtured and facilitated among researchers and (topical) experts from different disciplines, focusing on different issues or using different methods? How are different sub-networks related to or disconnected from each other (for instance, how connected are the data4development, freedom of information, and civic tech research communities)? In addition, an interesting discussion emerged around how researchers can also network more with those who are part of the respective universe of analysis – potentially generating some kind of participatory research design….(More)”

Crowdsourcing Gun Violence Research


Penn Engineering: “Gun violence is often described as an epidemic, but as visible and shocking as shooting incidents are, epidemiologists who study that particular source of mortality have a hard time tracking them. The Centers for Disease Control is prohibited by federal law from conducting gun violence research, so there is little in the way of centralized infrastructure to monitor where, how, when, why and to whom shootings occur.

Chris Callison-Burch, Aravind K. Joshi Term Assistant Professor in Computer and Information Science, and graduate student Ellie Pavlick are working to solve this problem.

They have developed the Gun Violence Database, which combines machine learning and crowdsourcing techniques to produce a national registry of shooting incidents. Callison-Burch and Pavlick’s algorithm scans thousands of articles from local newspapers and television stations, determines which are about gun violence, then asks everyday people to pull out vital statistics from those articles, compiling that information into a unified, open database.

For natural language processing experts like Callison-Burch and Pavlick, the most exciting prospect of this effort is that it is training computer systems to do this kind of analysis automatically. They recently presented their work on that front at Bloomberg’s Data for Good Exchange conference.

The Gun Violence Database project started in 2014, when it became the centerpiece of Callison-Burch’s “Crowdsourcing and Human Computation” class. There, Pavlick developed a series of homework assignments that challenged undergraduates to develop a classifier that could tell whether a given news article was about a shooting incident.
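
A classifier along these lines can be quite compact. The sketch below is a minimal illustration only, not the actual course assignment or GVDB code: it assumes a small labeled set of article texts and uses an off-the-shelf TF-IDF plus logistic regression pipeline from scikit-learn.

```python
# A minimal sketch of a binary "is this article about gun violence?"
# classifier; the toy data and pipeline choices are illustrative
# assumptions, not the actual course assignment or GVDB code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = [
    "Two injured in overnight shooting on Main Street",
    "City council debates new bike lane proposal",
    "Police investigate fatal shooting outside nightclub",
    "Local bakery wins regional pastry competition",
]
labels = [1, 0, 1, 0]  # 1 = about gun violence, 0 = not

# TF-IDF features over word unigrams and bigrams, then logistic regression.
classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
classifier.fit(articles, labels)

# Estimated probability that an unseen article reports a shooting.
new_article = ["Suspect arrested after shots fired near downtown mall"]
print(classifier.predict_proba(new_article)[:, 1])
```

Articles scoring above a chosen threshold would then be routed to crowd workers for detailed annotation.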

“It allowed us to teach the things we want students to learn about data science and natural language processing, while giving them the motivation to do a project that could contribute to the greater good,” says Callison-Burch.

The articles students used to train their classifiers were sourced from “The Gun Report,” a daily blog from New York Times reporters that attempted to catalog shootings from around the country in the wake of the Sandy Hook massacre. Realizing that their algorithmic approach could be scaled up to automate what the Times’ reporters were attempting, the researchers began exploring how such a database could work. They consulted with Douglas Wiebe, an Associate Professor of Epidemiology in Biostatistics and Epidemiology in the Perelman School of Medicine, to learn more about what kind of information public health researchers needed to better study gun violence on a societal scale.

From there, the researchers enlisted people to annotate the articles their classifier found, connecting with them through Mechanical Turk, Amazon’s crowdsourcing platform, and their own website, http://gun-violence.org/…(More)”
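
Because individual crowd workers make mistakes, projects like this typically show each article to several workers and reconcile their answers. The snippet below sketches a simple majority vote with an agreement score; the field names and three-worker redundancy are assumptions, since the article does not describe the database’s actual aggregation logic.

```python
# A minimal sketch of reconciling redundant crowd annotations by majority
# vote; field names and three-worker redundancy are assumptions, as the
# article does not describe the GVDB's actual aggregation logic.
from collections import Counter

# Each article is annotated by several workers, whose answers may disagree.
annotations = {
    "article-123": {
        "victim_age": ["34", "34", "43"],
        "city": ["Chicago", "Chicago", "Chicago"],
    },
}

def aggregate(worker_answers):
    """Return the most common answer and the share of workers who gave it."""
    answer, votes = Counter(worker_answers).most_common(1)[0]
    return answer, votes / len(worker_answers)

for article_id, fields in annotations.items():
    for field, answers in fields.items():
        value, agreement = aggregate(answers)
        # Low-agreement fields could be routed back for further annotation.
        print(f"{article_id} {field}: {value} (agreement {agreement:.0%})")
```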

Reframing Data Transparency


“Recently, the Centre for Information Policy Leadership (“CIPL”) at Hunton & Williams LLP, a privacy and information policy think tank based in Brussels, London and Washington, D.C., and Telefónica, one of the largest telecommunications companies in the world, issued a joint white paper on Reframing Data Transparency (the “white paper”). The white paper was the outcome of a June 2016 roundtable held by the two organizations in London, in which senior business leaders, Data Privacy Officers, lawyers and academics discussed the importance of user-centric transparency to the data-driven economy….The issues explored during the roundtable and in the white paper include the following:

  • The transparency deficit in the digital age. There is a growing gap between traditional, legal privacy notices and user-centric transparency that is capable of delivering understandable and actionable information concerning an organization’s data use policies and practices, including why it processes data, what the benefits are to individuals and society, how it protects the data and how users can manage and control the use of their data.
  • The impact of the transparency deficit. The transparency deficit undermines customer trust and customers’ ability to participate more effectively in the digital economy.
  • Challenges of delivering user-centric transparency. In a connected world where there may be no direct relationship between companies and their end users, both transparency and consent as a basis for processing are particularly challenging.
  • Transparency as a multistakeholder challenge. Transparency is not solely a legal issue, but a multistakeholder challenge, which requires engagement of regulators, companies, individuals, behavioral economists, social scientists, psychologists and user experience specialists.
  • The role of data protection authorities (“DPAs”). DPAs play a key role in promoting and incentivizing effective data transparency approaches and tools.
  • The role of companies. Data transparency is a critical business issue because transparency drives digital trust as well as business opportunities. Organizations must innovate on how to deliver user-centric transparency. Data-driven companies must research and develop new approaches to transparency that explain the value exchange between customers and companies and the companies’ data practices, and create tools that enable their customers to exercise effective engagement and control.
  • The importance of empowering individuals. It is crucial to support and enhance individuals’ digital literacy, which includes an understanding of the uses of personal data and the benefits of data processing, as well as knowledge of relevant privacy rights and the data management tools that are available to them. Government bodies, regulators and industry should be involved in educating the public regarding digital literacy. Such education should take place in schools and universities, and through consumer education campaigns. Transparency is the foundation and sine qua non of individual empowerment.
  • The role of behavioral economists, social scientists, psychologists and user experience specialists. Experts from these disciplines will be crucial in developing user-centric transparency and controls….(More)”.

Empowering cities


“The real story on how citizens and businesses are driving smart cities” by the Economist Intelligence Unit: “Digital technologies are the lifeblood of today’s cities. They are applied widely in industry and society, from information and communications technology (ICT) to the Internet of Things (IoT), in which objects are connected to the Internet. As sensors turn any object into part of an intelligent urban network, and as computing power facilitates analysis of the data these sensors collect, elected officials and city administrators can gain an unparalleled understanding of the infrastructure and services of their city. However, to make the most of this intelligence, another ingredient is essential: citizen engagement. Thanks to digital technologies, citizens can provide a steady flow of feedback and ideas to city officials.

This study by The Economist Intelligence Unit (EIU), supported by Philips Lighting, investigates how citizens and businesses in 12 diverse cities around the world—Barcelona, Berlin, Buenos Aires, Chicago, London, Los Angeles, Mexico City, New York City, Rio de Janeiro, Shanghai, Singapore and Toronto—envision the benefits of smart cities. The choices of the respondents to the survey reflect the diverse nature of the challenges and opportunities facing different cities, from older cities in mature markets, where technology is at work with infrastructure that may be centuries old, to new cities in emerging markets, which have the opportunity to incorporate digital technologies as they grow.

Coupled with expert perspectives, these insights paint a fresh picture of how digital technologies can empower people to contribute, giving city officials a roadmap to smart city life in the 21st century….(More)”

The Promise of Artificial Intelligence: 70 Real-World Examples


Report by the Information Technology & Innovation Foundation: “Artificial intelligence (AI) is on a winning streak. In 2005, five teams successfully completed the DARPA Grand Challenge, a competition held by the Defense Advanced Research Projects Agency to spur development of autonomous vehicles. In 2011, IBM’s Watson system beat out two longtime human champions to win Jeopardy! In 2016, Google DeepMind’s AlphaGo system defeated the 18-time world-champion Go player. And thanks to Apple’s Siri, Microsoft’s Cortana, Google’s Google Assistant, and Amazon’s Alexa, consumers now have easy access to a variety of AI-powered virtual assistants to help manage their daily lives. The potential uses of AI to identify patterns, learn from experience, and find novel solutions to new challenges continue to grow as the technology advances.

Moreover, AI is already having a major positive impact in many different sectors of the global economy and society. For example, humanitarian organizations are using intelligent chatbots to provide psychological support to Syrian refugees, and doctors are using AI to develop personalized treatments for cancer patients. Unfortunately, the benefits of AI, as well as its likely impact in the years ahead, are vastly underappreciated by policymakers and the public. Meanwhile, a contrary narrative—that AI raises grave concerns and warrants a precautionary regulatory approach to limit the damages it could cause—has gained prominence, even though it is both wrong and harmful to societal progress.

To showcase the overwhelmingly positive impact of AI, this report describes the major uses of AI and highlights 70 real-world examples of how AI is already generating social and economic benefits. Policymakers should consider these benefits as they evaluate the steps they can take to support the development and adoption of AI….(More)”

We’ve stopped trusting institutions and started trusting strangers


TED: “Something profound is changing our concept of trust, says Rachel Botsman. While we used to place our trust in institutions like governments and banks, today we increasingly rely on others, often strangers, on platforms like Airbnb and Uber and through technologies like the blockchain. This new era of trust could bring with it a more transparent, inclusive and accountable society — if we get it right. Who do you trust?…(More)”

How technology can help nations navigate the difficult path to food sovereignty


Essay at The Conversation Global: “As the movement of people across the world creates more multicultural societies, can trade help communities maintain their identity? This is the question at the heart of a concept known as “food sovereignty”.

Food sovereignty has been defined as “the right of peoples to healthy and culturally appropriate food produced through ecologically sound and sustainable methods” and, critically, the ability of people to own their food systems.

Culturally appropriate food refers to the cuisine eaten by a certain group, which reflects their own values, norms, religion and preferences. It is usually dynamic and may change over time.

In my journey across different food landscapes, I have discovered that people consume food not just to satisfy hunger but for cultural, religious, and social reasons. And I have learnt that there are ways that international trade can help facilitate this….

Cultural groups have different definitions of good or appropriate food. The elite (who can afford it) and people who are environmentally conscious, for instance, believe in organic or local produce; Jews eat kosher food; and Muslims eat halal.

The challenge lies with making sure food is appropriately labelled – as organic, local, kosher or halal – and the key here is the authenticity of the certification process.

It can be quite difficult to trace the origin of certain foods, whether they’re produced locally or internationally. Labelling educates consumers, allowing them to make the right choice, but it may impose an additional cost on farmers, who therefore have little incentive to label.

The case for transparency and authentication

To ensure that trade allows people to have access to authentic and culturally appropriate food, I recommend a new, digitised process called “crypto-labelling”. Crypto-labelling would use secure communication technology to create a record which traces the history of a particular food from the farm to grocery stores. It would mean consistent records, no duplication, a certification registry, and easy traceability.

Crypto-labelling would ensure transparency in the certification process for niche markets, such as halal, kosher and organic. It would allow people who don’t know or trust each other to develop a dependable relationship based on a particular commodity.

If somebody produces organic amaranth in Cotonou, Benin, for instance, and labels it with a digital code that anyone can easily understand, then a family in another country can have access to the desired food throughout the year.

This initiative, which should be based on the blockchain technology behind Bitcoin, can be managed by consumer or producer cooperatives. On the consumer end, all that’s required is a smartphone to scan and read the crypto-labels.
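
The essay does not spell out a data model, but the core idea, an append-only record whose entries are linked by cryptographic hashes, can be sketched in a few lines. The field names and the use of SHA-256 below are illustrative assumptions, not a specification from the essay.

```python
# A minimal sketch of a hash-chained crypto-label; the record fields and
# use of SHA-256 are illustrative assumptions, not a specification from
# the essay.
import hashlib
import json

def record_hash(record):
    """Hash a record's contents, excluding its own stored hash."""
    body = {k: v for k, v in record.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_record(prev_hash, step, details):
    """Create a provenance record linked to its predecessor by hash."""
    record = {"prev_hash": prev_hash, "step": step, "details": details}
    record["hash"] = record_hash(record)
    return record

# Trace organic amaranth from a farm near Cotonou to a retailer.
harvest = make_record("0" * 64, "harvest", {"farm": "Cotonou", "crop": "amaranth"})
cert = make_record(harvest["hash"], "certification", {"label": "organic"})
retail = make_record(cert["hash"], "retail", {"store": "grocery"})

def verify(chain):
    """Recompute every hash and check each link; any edit breaks the chain."""
    if record_hash(chain[0]) != chain[0]["hash"]:
        return False
    for prev, rec in zip(chain, chain[1:]):
        if rec["prev_hash"] != prev["hash"] or record_hash(rec) != rec["hash"]:
            return False
    return True

print(verify([harvest, cert, retail]))  # True for an untampered chain
```

A cooperative scanning a crypto-label would walk this chain backwards from the retail record to the farm, rejecting any item whose links fail to verify.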

The adoption of blockchain technology in the agricultural sector can help African countries “leapfrog” to the fourth industrial revolution.

Leapfrogging happens when developing countries skip an already outmoded technology that’s widely used in the developed world and embrace a newer one instead. In the early 2000s, for instance, households with no landline became households with more than two mobile phones. This enabled the advent of a new platform for mobile banking in Kenya and Somalia.

Similarly, crypto-labelling will lead to a form of “electronic agriculture” which will make it cheaper in the long run to label and enhance traceability. With access to mobile technology increasing globally, it’s a feasible system for the developing world…(More)”

Innovando para una mejor gestión: La contribución de los laboratorios de innovación pública


Paper by Sebastián Acevedo and Nicolás Dassen for the IDB: “The technological, economic, and social changes of recent years call for governments capable of adapting to new challenges and to the growing demands of citizens. In many countries and at different levels of government, this has led to the creation of innovation labs, units whose goal is to promote innovation in the public sector in a variety of ways. This paper analyzes the roles and challenges of Latin American labs, contrasting them with good practices and characteristics that the literature has associated with higher levels of innovation in the public sector and in other organizations.

Drawing on a survey of lab directors and two case studies, the paper describes the landscape of Latin American labs and discusses the challenges they face in: i) working on core management issues, ii) achieving the adoption and scaling of innovations, and iii) ensuring their sustainability.

In particular, four factors are key to their performance in these areas: two political-institutional factors (leadership support and policy networks) and two methodological factors (the technical adequacy of the innovations and the construction of shared meaning around them).

In addition, two main differences are identified between most of the labs surveyed here and the experience of other regions as described in the existing literature: a stronger focus on open government issues, and fewer activities for the controlled testing of innovations, such as randomized experiments and impact evaluations. Finally, the paper presents conclusions and recommendations for consolidating labs as effective channels for managing innovation, handling the inherent risks, and modernizing public administration… (More, in Spanish)”

When the Algorithm Itself is a Racist: Diagnosing Ethical Harm in the Basic Components of Software


Paper by Christian Sandvig et al. in a Special Issue of the International Journal of Communication on Automation, Algorithms, and Politics: “Computer algorithms organize and select information across a wide range of applications and industries, from search results to social media. Abuses of power by Internet platforms have led to calls for algorithm transparency and regulation. Algorithms have a particularly problematic history of processing information about race. Yet some analysts have warned that foundational computer algorithms are not useful subjects for ethical or normative analysis due to complexity, secrecy, technical character, or generality. We respond by investigating what it is an analyst needs to know to determine whether the algorithm in a computer system is improper, unethical, or illegal in itself. We argue that an “algorithmic ethics” can analyze a particular published algorithm. We explain the importance of developing a practical algorithmic ethics that addresses virtues, consequences, and norms: We increasingly delegate authority to algorithms, and they are fast becoming obscure but important elements of social structure…. (More)”