Stefaan Verhulst
Linda Poon at CityLab: “Online review sites can tell you a lot about a city’s restaurant scene, and they can reveal a lot about the city itself, too.
Researchers at MIT recently found that information about restaurants gathered from popular review sites can be used to uncover a number of socioeconomic factors of a neighborhood, including its employment rates and demographic profiles of the people who live, work, and travel there.
A report published last week in the Proceedings of the National Academy of Sciences explains how the researchers used information found on Dianping—a Yelp-like site in China—to find information that might usually be gleaned from an official government census. The model could prove especially useful for gathering information about cities that don’t have that kind of reliable or up-to-date government data, especially in developing countries with limited resources to conduct regular surveys….
Zheng and her colleagues tested out their machine-learning model using restaurant data from nine Chinese cities of various sizes—from crowded ones like Beijing, with a population of more than 10 million, to smaller ones like Baoding, a city of fewer than 3 million people.
They pulled data from 630,000 restaurants listed on Dianping, including each business’s location, menu prices, opening day, and customer ratings. Then they ran it through a machine-learning model alongside official census data and anonymized location and spending data gathered from cell phones and bank cards. By comparing the information, they were able to determine where the restaurant data reflected what the other datasets showed about a neighborhood’s characteristics.
They found that the local restaurant scene can predict, with 95 percent accuracy, variations in a neighborhood’s daytime and nighttime populations, which are measured using mobile phone data. It can also predict, with 90 and 93 percent accuracy respectively, the number of businesses and the volume of consumer consumption. The types of cuisine offered and the kinds of eateries available (coffee shops versus traditional teahouses, for example) can also predict the proportion of immigrants and the age and income breakdown of residents. The predictions are more accurate for neighborhoods near urban centers than for those near the suburbs, and for smaller cities, where neighborhoods don’t vary as widely as those in bigger metropolises….(More)”.
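For readers who want a concrete sense of what such a prediction pipeline might look like, here is a minimal sketch in Python. The feature names, the simulated target, and the choice of a generic gradient-boosting model are assumptions made purely for illustration; the authors’ actual features, ground-truth data, and model are described in the PNAS paper and will differ.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_neighborhoods = 200

# Hypothetical neighborhood-level features aggregated from per-restaurant
# review-site records (restaurant counts, average menu price, average
# customer rating, average age of the listings).
features = pd.DataFrame({
    "n_restaurants": rng.integers(5, 500, n_neighborhoods),
    "mean_price": rng.uniform(10, 200, n_neighborhoods),
    "mean_rating": rng.uniform(2.5, 5.0, n_neighborhoods),
    "mean_age_days": rng.uniform(30, 3000, n_neighborhoods),
})

# Hypothetical target (e.g., daytime population from anonymized mobile
# phone data), simulated here only so the example runs end to end.
target = (
    200 * features["n_restaurants"]
    + 50 * features["mean_price"]
    + rng.normal(0, 5000, n_neighborhoods)
)

# Fit a regression model and report a cross-validated R^2, one plausible
# way to quantify how well restaurant data "predicts" a neighborhood trait.
model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, features, target, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```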
Carlos María Galmarini at Open Mind: “Modern medicine is based upon the work of Hippocrates and his disciples, compiled in the more than 70 books that make up the Hippocratic body of work. In essence, these writings declare that any illness originates from natural causes. Therefore, medicine must be based on detailed observation, reason, and experience in order to establish a diagnosis, prognosis, and treatment. The Hippocratic tradition stresses the importance of symptoms and the clinical exam. As a result, medicine abandoned superstition and the magic performed by priest-doctors, and it was transformed into a real, experience-based science….
A complementary combination of both intelligences (human and artificial) could help each overcome the other’s shortcomings and limitations. As we incorporate intelligent technologies into medical processes, a new, more powerful form of collaboration will emerge. Just as the automation of human tasks once changed the known world and ignited an evolution in products and services, the combination of human and artificial intelligence will create a new type of collective intelligence capable of building more efficient organizations; in the healthcare industry, it will be able to solve problems that until now have been unfathomable to the human mind alone.
Finally, it is worth remembering that fact-based sciences are divided into natural and human disciplines. Medicine occupies a special place, straddling both. It can be difficult to establish the similarities between a doctor who works, for example, with rules defined by specific clinical trials and a traditional family practitioner. The former is closer to a natural science, and the latter to a more human science – “the art of medicine.”
The combination of human and artificial intelligence in a new type of collective intelligence will enable doctors themselves to embody a combination of the two: the art of medicine (a human science) grounded in the analysis of big data (a natural science). A new collective intelligence working on behalf of a wiser medicine….(More)”.
Book edited by Stan McClellan: “This book explores categories of applications and driving factors surrounding the Smart City phenomenon. The contributing authors provide perspectives on Smart Cities, covering numerous classes of applications. The book offers a top-down exploration of the driving factors in Smart Cities, with focal areas including “Smart Healthcare,” “Public Safety & Policy Issues,” and “Science, Technology, & Innovation.” Contributors have direct and substantive experience with important aspects of Smart Cities and discuss issues with technologies and standards, roadblocks to implementation, innovations that create new opportunities, and other factors relevant to emerging Smart City infrastructures….(More)”.
Paper by Sjoerd Romme and Albert Meijer: “There is increasing debate about the role that public policy research can play in identifying solutions to complex policy challenges. Most studies focus on describing and explaining how governance systems operate. However, some scholars argue that because current institutions are often not up to the task, researchers need to rethink this ‘bystander’ approach and engage in experimentation and interventions that can help to change and improve governance systems.
This paper contributes to this discourse by developing a design science framework that integrates retrospective research (scientific validation) and prospective research (creative design). It illustrates the merits and challenges of doing this through two case studies in the Netherlands and concludes that a design science framework provides a way of integrating traditional validation-oriented research with intervention-oriented design approaches. We argue that working at the interface between them will create new opportunities for these complementary modes of public policy research to achieve impact….(More)”
Interim Report by the Centre for Data Ethics and Innovation (UK): “The use of algorithms has the potential to improve the quality of decision-making by increasing the speed and accuracy with which decisions are made. If designed well, they can reduce human bias in decision-making processes. However, as the volume and variety of data used to inform decisions increases, and the algorithms used to interpret the data become more complex, concerns are growing that without proper oversight, algorithms risk entrenching and potentially worsening bias.
The way in which decisions are made, the potential biases to which they are subject, and the impact these decisions have on individuals are all highly context-dependent. Our Review focuses on exploring bias in four key sectors: policing, financial services, recruitment and local government. These have been selected because they all involve significant decisions being made about individuals, there is evidence of growing uptake of machine learning algorithms in these sectors, and there is evidence of historic bias in decision-making within them. This Review seeks to answer three sets of questions:
- Data: Do organisations and regulators have access to the data they require to adequately identify and mitigate bias?
- Tools and techniques: What statistical and technical solutions are available now or will be required in future to identify and mitigate bias and which represent best practice?
- Governance: Who should be responsible for governing, auditing and assuring these algorithmic decision-making systems?
Our work to date has led to some emerging insights that respond to these three sets of questions and will guide our subsequent work….(More)”.
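As a rough illustration of the kind of statistical tooling the Review’s “tools and techniques” question points to, the sketch below computes a simple demographic-parity gap, the difference in favourable-outcome rates between groups, on an invented decision log. The Review does not prescribe this particular metric, and the data, column names, and threshold are assumptions made for illustration only.

```python
import pandas as pd

# Hypothetical decision log: one row per individual, with a protected
# attribute ("group") and the system's decision (1 = favourable outcome).
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B", "A"],
    "approved": [1,    1,   0,   0,   1,   0,   0,   1],
})

# Favourable-outcome rate per group, and the gap between the extremes.
rates = decisions.groupby("group")["approved"].mean()
parity_gap = rates.max() - rates.min()

print(rates)
print(f"demographic parity gap: {parity_gap:.2f}")
if parity_gap > 0.2:  # illustrative threshold only
    print("Large gap in favourable-outcome rates; worth investigating.")
```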
Book by Valesca Lima: “This book discusses the issues of citizen rights, governance and political crisis in Brazil. The project focuses on “citizenship in times of crisis,” seeking to understand how citizenship rights have changed since the Brazilian political and economic crisis that started in 2014. Building on theories of citizenship and governance, the author examines policy-based evidence on the retraction of participatory rights, a consequence of a stagnant economic scenario and the reorganization of conservative sectors. This work will appeal to scholarly audiences interested in citizenship, Brazilian politics, and Latin American policy and governance….(More)”.
Katherine R. Knobloch at Democratic Audit: “Both scholars and citizens have begun to believe that democracy is in decline. Authoritarian power grabs, polarising rhetoric, and increasing inequality can all claim responsibility for democratic systems that feel broken. Democracy depends on a polity that believes its engagement matters, but evidence suggests democratic institutions have become unresponsive to the will of the public. How can we restore faith in self-government when both research and personal experience tell us that the public is losing power, not gaining it?
Deliberative public engagement
Deliberative democracy offers one solution, and it’s slowly shifting how the public engages in political decision-making. In Oregon, the Citizens’ Initiative Review (CIR) asks a group of randomly selected voters to carefully study public issues and then make policy recommendations based on their collective experience and insight. In Ireland, Citizens’ Assemblies are being used to amend the country’s constitution to better reflect changing cultural norms. In communities across the world, Participatory Budgeting is giving the public control over local government spending. Far from squashing democratic power, these deliberative institutions bolster it. They exemplify a new wave in democratic government, one that aims to bring community members together across political and cultural divides to make decisions about how to govern themselves.
Though the contours of deliberative events vary, most share key characteristics. A diverse body of community members gather together to learn from experts and one another, think through the short- and long-term implications of different policy positions, and discuss how issues affect not only themselves but their wider communities. At the end of those conversations, they make decisions that are representative of the diversity of participants and their ideas and which have been tested through collective reasoning….(More)”.
Introduction to Special Issue of International Organization by Judith G. Kelley and Beth A. Simmons: “In recent decades, IGOs, NGOs, private firms and even states have begun to regularly package and distribute information on the relative performance of states. From the World Bank’s Ease of Doing Business Index to the Financial Action Task Force blacklist, global performance indicators (GPIs) are increasingly deployed to influence governance globally. We argue that GPIs derive influence from their ability to frame issues, extend the authority of the creator, and — most importantly — to invoke recurrent comparison that stimulates governments’ concerns for their own and their country’s reputation. Their public and ongoing ratings and rankings of states are particularly adept at capturing attention not only at elite policy levels but also among other domestic and transnational actors. GPIs thus raise new questions for research on politics and governance globally. What are the social and political effects of this form of information on discourse, policies and behavior? What types of actors can effectively wield GPIs and on what types of issues? In this symposium introduction, we define GPIs, describe their rise, and theorize and discuss these questions in light of the findings of the symposium contributions…(More)”.
M. P. J. Ashby in Research Data Journal for the Humanities and Social Sciences: “The study of spatial and temporal crime patterns is important for both academic understanding of crime-generating processes and for policies aimed at reducing crime. However, studying crime and place is often made more difficult by restrictions on access to appropriate crime data. This means that understanding of many spatio-temporal crime patterns is limited to data from a single geographic setting, and there are few attempts at replication. This article introduces the Crime Open Database (code), a database of 16 million offenses from 10 of the largest United States cities over 11 years and more than 60 offense types. Open crime data were obtained from each city, having been published in multiple incompatible formats. The data were processed to harmonize geographic co-ordinates, dates and times, offense categories and location types, as well as adding census and other geographic identifiers. The resulting database allows the wider study of spatio-temporal patterns of crime across multiple US cities, allowing greater understanding of variations in the relationships between crime and place across different settings, as well as facilitating replication of research….(More)”.
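To make the harmonization step concrete, here is a minimal sketch of mapping one city’s open crime records onto a set of common fields. The column names, offense-category mapping, and example values are assumptions for illustration; the database’s actual processing pipeline, taxonomy, and census matching are documented in the article itself.

```python
import pandas as pd

# One city's hypothetical raw export, with its own column names and
# offense labels. In practice each city publishes a different schema.
raw = pd.DataFrame({
    "primary_type": ["THEFT", "ASSAULT", "BURGLARY", "GAMBLING"],
    "occurred_on":  ["2016-03-01 14:30", "2016-03-02 01:15",
                     "2016-03-02 23:50", "not recorded"],
    "lon": [-87.62, -87.65, -87.60, None],
    "lat": [41.88, 41.85, 41.90, None],
})

# Mapping from this city's labels to a common offense taxonomy
# (illustrative; a real mapping would cover many more categories).
CATEGORY_MAP = {"THEFT": "larceny", "BURGLARY": "burglary", "ASSAULT": "assault"}

# Project the raw export onto harmonized fields: a shared offense type,
# a parsed timestamp, numeric coordinates, and a city identifier.
harmonized = pd.DataFrame({
    "offense_type": raw["primary_type"].map(CATEGORY_MAP),
    "datetime": pd.to_datetime(raw["occurred_on"], errors="coerce"),
    "longitude": pd.to_numeric(raw["lon"], errors="coerce"),
    "latitude": pd.to_numeric(raw["lat"], errors="coerce"),
    "city": "example_city",
})

# Drop records that could not be mapped, parsed, or geocoded, as a crude
# stand-in for the paper's more careful cleaning and geographic matching.
harmonized = harmonized.dropna(subset=["offense_type", "datetime",
                                       "longitude", "latitude"])
print(harmonized)
```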
Paper by Teresa Scassa and Merlynda Vilain: “The collection of vast quantities of personal data from embedded sensors is increasingly an aspect of urban life. This type of data collection is a feature of so-called smart cities, and it raises important questions about data governance. This is particularly the case where the data may be made available for reuse by others and for a variety of purposes.
This paper focuses on the governance of data captured through “smart” technologies and uses Ontario’s smart metering program as a case study. Ontario rolled out mandatory smart metering for electrical consumption in the early 2000s largely to meet energy conservation goals. In doing so, it designed a centralized data governance system overseen by the Smart Metering Entity to manage smart meter data and to protect consumer privacy. As interest in access to the data grew among third parties, and as new potential applications for the data emerged, the regulator sought to develop a model for data sharing that would protect privacy in relation to these new uses and that would avoid uses that might harm the public interest…(More)”.