Chapter by Claire Borsenberger, Mathilde Hoang and Denis Joram: “Thanks to appropriate data algorithms, firms, especially those
The Lancet Countdown: Tracking progress on health and climate change using data from the International Energy Agency (IEA)
Victoria Moody at the UK Data Service: “The 2015 Lancet Commission on Health and Climate Change—which assessed responses to climate change with a view to ensuring the highest attainable standards of health for populations worldwide—concluded that “tackling climate change could be the greatest global health opportunity of the 21st century”. The Commission recommended that more accurate national quantification of the health co-benefits and economic impacts of mitigation decisions was essential in promoting a low-carbon transition.
Building on these foundations, the Lancet Countdown: tracking progress on health and climate change was formed as an independent research collaboration…
The partnership comprises 24 academic institutions from every continent, bringing together individuals with a broad range of expertise across disciplines (including climate scientists, ecologists, mathematicians, geographers, engineers, energy, food, and transport experts, economists, social and political scientists, public health professionals, and physicians).
Four of the indicators developed for Working Group 3 (Mitigation actions and health co-benefits) use International Energy Agency (IEA) data made available by the IEA via the UK Data Service for use by researchers, learners and teaching staff in UK higher and further education. Additionally, two of the indicators developed for Working Group 4 (Finance and economics) also use IEA data.
Read our impact case study to find
How the medium shapes the message: Printing and the rise of the arts and sciences
Paper by C. Jara-Figueroa, Amy Z. Yu, and César A. Hidalgo: “Communication technologies, from printing to social media, affect our historical records by changing the way ideas are spread and recorded. Yet, finding statistical evidence of this fact has been challenging. Here we combine a common causal inference technique (instrumental variable estimation) with a dataset on nearly forty thousand biographies from Wikipedia (Pantheon 2.0), to study the effect of the introduction of printing in European cities on Wikipedia’s digital biographical records.
By using a city’s distance to Mainz as an instrument for the adoption of the movable type press, we show that European cities that adopted printing earlier were more likely to become the birthplace of a famous scientist or artist during the years following the invention of printing. We bring these findings to recent communication technologies by showing that the number of radios and televisions in a country correlates with the number of globally famous performing artists and sports players born in that country, even after controlling for GDP, population, and including country and year fixed effects. These findings support the hypothesis that the introduction of communication technologies can bias historical records in the direction of the content that is best suited for each technology….(More)”.
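The paper's identification strategy, instrumental variable estimation via two-stage least squares (with distance to Mainz instrumenting for press adoption), can be sketched in miniature on synthetic data. The variable names and the toy data-generating process below are illustrative, not the authors':

```python
import numpy as np

def iv_2sls(y, x, z):
    """Two-stage least squares for one endogenous regressor x
    and one instrument z (with an intercept)."""
    n = len(y)
    # First stage: regress x on the instrument z.
    Z = np.column_stack([np.ones(n), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # fitted values
    # Second stage: regress y on the fitted values of x.
    X_hat = np.column_stack([np.ones(n), x_hat])
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
    return beta[1]  # estimated causal effect of x on y

# Synthetic illustration: z shifts x, x shifts y, and an unobserved
# confounder u affects both x and y (which would bias plain OLS).
rng = np.random.default_rng(0)
z = rng.normal(size=2000)
u = rng.normal(size=2000)                       # unobserved confounder
x = 0.8 * z + u + rng.normal(size=2000)
y = 2.0 * x + 3.0 * u + rng.normal(size=2000)   # true effect of x is 2.0
print(round(iv_2sls(y, x, z), 2))               # ≈ 2.0 despite confounding
```

Because the instrument z is correlated with x but (by construction) affects y only through x, the second-stage estimate recovers the true effect even though a naive regression of y on x would be biased upward by u.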
Shutting down the internet doesn’t work – but governments keep doing it
George Ogola in The Conversation: “As the internet continues to gain considerable power and agency around the world, many governments have moved to regulate it. And where regulation fails, some states resort to internet shutdowns or deliberate disruptions.
The statistics are staggering. In India alone, there were 154 internet shutdowns between January 2016 and May 2018. This is the most of any country in the world.
But similar shutdowns are becoming common on the African continent. Already in 2019 there have been shutdowns in Cameroon, the Democratic Republic of Congo, Republic of Congo, Chad, Sudan and Zimbabwe. Last year there were 21 such shutdowns on the continent, in Togo, Sierra Leone, Sudan and Ethiopia, among others.
The justifications for such shutdowns are usually relatively predictable. Governments often claim that internet access is blocked in the interest of public security and order. In some instances, however, their reasoning borders on the curious if not downright absurd, like the case of Ethiopia in 2017 and Algeria in 2018 when the internet was shut down apparently to curb cheating in national examinations.
Whatever their reasons, governments have three general approaches to controlling citizens’ access to the web.
How they do it
Internet shutdowns or disruptions usually take three forms. The first and probably the most serious is where the state completely blocks access to the internet on all platforms. It’s arguably the most punitive, with significant social, economic and political costs.
The financial costs can run into millions of dollars for each day the internet is blocked. A Deloitte report on the issue estimates that a country with high connectivity could lose at least 1.9% of its daily GDP for each day all internet services are shut down.
For countries with medium-level connectivity the loss is 1% of daily GDP, and for countries with low connectivity it’s 0.4%. It’s estimated that Ethiopia, for example, could lose up to US$500,000 a day whenever there is a shutdown. These shutdowns, then, damage businesses, discourage investments, and hinder economic growth.
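These per-day loss rates reduce to a simple back-of-the-envelope calculation. A minimal sketch, in which only the percentage tiers come from the Deloitte report and the example economy is invented for illustration:

```python
# Deloitte's estimated per-day loss rates, as a share of daily GDP,
# by a country's level of internet connectivity.
LOSS_RATES = {"high": 0.019, "medium": 0.010, "low": 0.004}

def daily_shutdown_cost(annual_gdp_usd, connectivity):
    """Estimated cost of one day of a total internet shutdown."""
    daily_gdp = annual_gdp_usd / 365
    return daily_gdp * LOSS_RATES[connectivity]

# Hypothetical economy: ~US$45bn annual GDP, low connectivity.
print(f"${daily_shutdown_cost(45e9, 'low'):,.0f} per day")  # → $493,151 per day
```

Even at the lowest tier, a modest economy loses roughly half a million dollars for every day the internet stays dark, which is consistent with the order of magnitude cited for Ethiopia above.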
The second way that governments restrict internet access is by applying content blocking techniques. They restrict access to particular sites or applications. This is the most common strategy and it’s usually targeted at social media platforms. The idea is to stop or limit conversations on these platforms.
Online spaces have become the platform for various forms of political expression that many states, especially those with authoritarian leanings, consider subversive. Governments argue, for example, that social media platforms encourage the spread of rumours which can trigger public unrest.
This was the case in 2016 in Uganda during the country’s presidential elections. The government restricted access to social media, describing the shutdown as a “security measure to avert lies … intended to incite violence and illegal declaration of election results”.
In Zimbabwe, the government blocked social media following demonstrations over an increase in fuel prices. It argued that the January 2019 ban was because the platforms were being “used to coordinate the violence”.
The third strategy, done almost by stealth, is the use of what is generally known as “bandwidth throttling”. In this case, telecom operators or internet service providers are forced to lower the quality of their cell signals or internet speed. This makes the internet too slow to use. “Throttling” can also target particular online destinations such as social media sites.
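Mechanically, throttling amounts to rate-limiting traffic at the carrier level. One standard way to model it is a token-bucket limiter; the sketch below is illustrative, with invented parameters, not a description of any operator's actual equipment:

```python
class TokenBucket:
    """Minimal token-bucket rate limiter: the link refills capacity at
    `rate` bytes per second, so sustained throughput can never exceed it."""

    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes   # start with a full burst allowance
        self.last = 0.0

    def allow(self, nbytes, now):
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False

# A link throttled to 10 kB/s with a 10 kB burst allowance:
bucket = TokenBucket(rate_bytes_per_s=10_000, burst_bytes=10_000)
print(bucket.allow(8_000, now=0.0))  # True: within the initial burst
print(bucket.allow(8_000, now=0.1))  # False: only ~3 kB of tokens left
print(bucket.allow(8_000, now=1.0))  # True: tokens have refilled
```

Set the refill rate low enough and every connection still "works" in name, but sustained transfers crawl, which is exactly what makes a throttled internet too slow to use.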
Nudging Citizens through Technology in Smart Cities
Sofia Ranchordas in the International Review of Law, Computers & Technology: “In the last decade, several smart cities throughout the world have started employing Internet of Things, big data, and algorithms to nudge citizens to save more water and energy, live healthily, use public transportation, and participate more actively in local affairs. Thus far, the potential and implications of data-driven nudges and behavioral insights in smart cities have remained an overlooked subject in the legal literature. Nevertheless, combining technology with behavioral insights may allow smart cities to nudge citizens more systematically and help these urban centers achieve their sustainability goals and promote civic engagement. For example, in Boston, real-time feedback on driving has increased road safety and in Eindhoven, light sensors have been used to successfully reduce nightlife crime and disturbance. While nudging tends to be well-intended, data-driven nudges raise a number of legal and ethical issues. This article offers a novel and interdisciplinary perspective on nudging which delves into the legal, ethical, and trust implications of collecting and processing large amounts of personal and impersonal data to influence citizens’ behavior in smart cities….(More)”.
Twentieth Century Town Halls: Architecture of Democracy
Book by Jon Stewart: “This is the first book to examine the development of the town hall during the twentieth century and the way in which these civic buildings have responded to the dramatic political, social and architectural changes which took place during the period. Following an overview of the history of the town hall as a building type, it examines the key themes, variations
Copenhagen Town Hall, Denmark, Martin Nyrop
Stockholm City Hall, Sweden, Ragnar Östberg
Hilversum Town Hall, the Netherlands, Willem M. Dudok
Walthamstow Town Hall, Britain, Philip Dalton Hepworth
Oslo Town Hall, Norway, Arnstein Arneberg and Magnus Poulsson
Casa del Fascio, Como, Italy, Giuseppe Terragni
Aarhus Town Hall, Denmark, Arne Jacobsen with Erik Møller
Säynätsalo Town Hall, Finland, Alvar Aalto
Kurashiki City Hall, Japan, Kenzo Tange
Toronto City Hall, Canada, Viljo Revell
Boston City Hall, USA, Kallmann, McKinnell and Knowles
Dallas City Hall, USA, IM Pei
Mississauga City Hall, Canada, Ed Jones and Michael Kirkland
Borgoricco Town Hall, Italy, Aldo Rossi
Reykjavik City Hall, Iceland, Studio Granda
Valdelaguna Town Hall, Spain, Víctor López Cotelo and Carlos Puente Fernández
The Hague City Hall, the Netherlands, Richard Meier
Iragna Town Hall, Switzerland, Raffaele Cavadini
Murcia City Hall, Spain, José Rafael Moneo
London City Hall, UK, Norman Foster…(More)”.
Claudette: an automated detector of potentially unfair clauses in online terms of service
Marco Lippi et al in Artificial Intelligence and Law: “Terms of service of
Responsible AI for conservation
Oliver Wearn, Robin Freeman and David Jacoby in Nature: “Machine learning (ML) is revolutionizing efforts to conserve nature. ML algorithms are being applied to predict the extinction risk of thousands of species, assess the global footprint of fisheries, and identify animals and humans in wildlife sensor data recorded in the field. These efforts have recently been given a huge boost with support from the commercial sector. New initiatives, such as Microsoft’s AI for Earth and Google’s AI for Social Good, are bringing new resources and new ML tools to bear on some of the biggest challenges in conservation. In parallel to this, the open data revolution means that global-scale, conservation-relevant datasets can be fed directly to ML algorithms from open data repositories, such as Google Earth Engine for satellite data or Movebank for animal tracking data. Added to these will be Wildlife Insights, a Google-supported platform for hosting and
Setting Foundations for the Creation of Public Value in Smart Cities
Book edited by Manuel Pedro Rodriguez Bolivar: “This book seeks to contribute to prior research on public value creation in Smart Cities and the role of governments. In the early 21st century, the rapid transition to a highly urbanized population has confronted societies and their governments around the world with unprecedented challenges regarding key themes such as sustainability, new governance models and the creation of networks.
Also, cities today face increasing challenges when it comes to providing advanced (digital) services to their constituency. The use of information and communication technologies (ICTs) and data is thought to rationalize and improve government, and has the potential to transform governance and organizational issues. These questions link up to the ever-evolving concept of Smart Cities. In fact, the rise of the Smart City and Smart City thinking is a direct response to such challenges, as well as providing a means of integrating
Weather Service prepares to launch prediction model many forecasters don’t trust
Jason Samenow in the Washington Post: “In a month, the National Weather Service plans to launch its “next generation” weather prediction model with the aim of “better, more timely forecasts.” But many meteorologists familiar with the model fear it is unreliable.
The introduction of a model that forecasters lack confidence in matters, considering the enormous impact that weather has on the economy, valued at around $485 billion annually.
The Weather Service announced Wednesday that the model, known as the GFS-FV3 (FV3 stands for Finite Volume Cubed-Sphere dynamical core), is “tentatively” set to become the United States’ primary forecast model on March 20, pending tests. It is an update to the current version of the GFS (Global Forecast System), popularly known as the American model, which has existed in various forms for more than 30 years.
A concern is that if forecasters cannot rely on the FV3, they will be left to rely only on the European model for their predictions without a credible alternative for comparisons. And they’ll also have to pay large fees for the European model data. Whereas model data from the Weather Service is free, the European Center for Medium-Range Weather Forecasts, which produces the European model, charges for access.
But there is an alternative perspective, which is that forecasters will just need to adjust to the new model and learn to account for its biases. That is, a little short-term pain is worth the long-term potential benefits as the model improves.
The Weather Service’s parent agency, the National Oceanic and Atmospheric Administration, recently entered an agreement with the National Center for Atmospheric Research to increase collaboration between forecasters and researchers in improving forecast modeling.
In addition, President Trump recently signed into law the Weather Research and Forecast Innovation Act Reauthorization, which establishes the NOAA Earth Prediction Innovation Center, aimed at further enhancing prediction capabilities. But even while NOAA develops relationships and infrastructure to improve the Weather Service’s modeling, the question remains whether the FV3 can meet the forecasting needs of the moment. Until the problems identified are addressed, its introduction could represent a step back in U.S. weather prediction despite a well-intended effort to leap forward….(More).