Smart cities could be lousy to live in if you have a disability


Elizabeth Woyke in MIT Technology Review: “People with disabilities affecting mobility, vision, hearing, and cognitive function often move to cities to take advantage of their comprehensive transit systems and social services. But US law doesn’t specify how municipalities should design and implement digital services for disabled people. As a result, cities sometimes adopt new technologies that can end up causing, rather than resolving, problems of accessibility.

Nowhere was this more evident than with New York City’s LinkNYC kiosks, which were installed on sidewalks in 2016 without including instructions in Braille or audible form. Shortly after they went in, the American Federation for the Blind sued the city. The suit was settled in 2017 and the kiosks have been updated, but Victor Pineda, president of the disability advocacy organization World Enabled, says touch screens in general are still not fully accessible to people with disabilities.

Also problematic: the social-media-based apps that some municipal governments have started using to solicit feedback from residents. Blind and low-vision people typically can’t use the apps, and people over 65 are less likely to, says James Thurston, a vice president at the nonprofit G3ict, which promotes accessible information and communication technologies. “Cities may think they’re getting data from all their residents, but if those apps aren’t accessible, they’re leaving out the voices of large chunks of their population,” he says….

Even for city officials who have these issues on their minds, knowing where to begin can be difficult. Smart Cities for All, an initiative led by Thurston and Pineda, aims to help by providing free, downloadable tools that cities can use to analyze their technology and find more accessible options. One is a database of hundreds of pre-vetted products and services. Among the entries are Cyclomedia, which uses lidar data to determine when city sidewalks need maintenance, and ZenCity, a data analytics platform that uses AI to gauge what people are saying about a city’s level of accessibility. 

This month, the group will kick off a project working with officials in Chicago to grade the city on how well it supports people with disabilities. One key part of the project will be ensuring the accessibility of a new 311 phone system being introduced as a general portal to city services. The group has plans to expand to several other US cities this year, but its ultimate aim is to turn the work into a global movement. It’s met with governments in India and Brazil as well as Sidewalk Labs, the Alphabet subsidiary that is developing a smart neighborhood in Toronto….(More)”.

IBM aims to use crowdsourced sensor data to improve local weather forecasting globally


Larry Dignan at ZDNet: “IBM is hoping that mobile barometric sensors from individuals who opt in, supercomputing, and the Internet of Things can make localized weather forecasting available globally.

Big Blue, which owns The Weather Company, will outline the IBM Global High-Resolution Atmospheric Forecasting System (GRAF). GRAF incorporates IoT data in its weather models via crowdsourcing.

While hyperlocal weather forecasts are available in the US, Japan, and some parts of Western Europe, many regions of the world lack an accurate picture of their weather.

Mary Glackin, senior vice president of The Weather Company, said the company is “trying to fill in the blanks.” She added, “In a place like India, weather stations are kilometers away. We think this can be as significant as bringing satellite data into models.”

For instance, the developing world gets forecasts based on global models that are updated every six hours at resolutions of 10 km to 15 km. By using GRAF, IBM said it can offer day-ahead forecasts that are updated hourly on average at a 3 km resolution….(More)”.
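The resolution jump described above implies a steep computational cost, which is why GRAF leans on supercomputing. A back-of-the-envelope sketch (the uniform-grid assumption and the 13 km midpoint are ours, not IBM’s):

```python
# Rough cost of moving from ~10-15 km / 6-hourly forecasts to
# 3 km / hourly ones. Assumes a uniform 2-D grid (a simplification
# of real atmospheric models).
coarse_res_km, fine_res_km = 13, 3          # 13 km is our midpoint of 10-15 km
coarse_runs, fine_runs = 24 / 6, 24 / 1     # forecast runs per day

cells_factor = (coarse_res_km / fine_res_km) ** 2   # grid cells per unit area
runs_factor = fine_runs / coarse_runs               # runs per day

print(f"~{cells_factor:.0f}x more grid cells per unit area")   # ~19x
print(f"~{runs_factor:.0f}x more forecast runs per day")       # ~6x
print(f"~{cells_factor * runs_factor:.0f}x combined")          # ~113x
```

Even before accounting for vertical levels and shorter model time steps, the finer grid and hourly cadence multiply the work by roughly two orders of magnitude.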

Index: Open Data


By Alexandra Shaw, Michelle Winowatan, Andrew Young, and Stefaan Verhulst

The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on open data and was originally published in 2018.

Value and Impact

  • The projected year by which all 28+ EU member countries will have a fully operating open data portal: 2020

  • The value the European open data market is expected to reach by 2020, after growing 36.9% from 2016: EUR 75.7 billion

Public Views on and Use of Open Government Data

  • Share of Americans who do not trust the federal government or social media sites to protect their data: Approximately 50%

  • Key findings from The Economist Intelligence Unit report on Open Government Data Demand:

    • Percentage of respondents who say the key reason why governments open up their data is to create greater trust between the government and citizens: 70%

    • Percentage of respondents who say OGD plays an important role in improving lives of citizens: 78%

    • Percentage of respondents who say OGD helps with daily decision making especially for transportation, education, environment: 53%

    • Percentage of respondents who cite lack of awareness about OGD and its potential use and benefits as the greatest barrier to usage: 50%

    • Percentage of respondents who say they lack access to usable and relevant data: 31%

    • Percentage of respondents who think they don’t have sufficient technical skills to use open government data: 25%

    • Percentage of respondents who feel the number of OGD apps available is insufficient, indicating an opportunity for app developers: 20%

    • Percentage of respondents who say OGD has the potential to generate economic value and new business opportunity: 61%

    • Percentage of respondents who say they don’t trust governments to keep data safe, protected, and anonymized: 19%

Efforts and Involvement

  • Time that’s passed since open government advocates convened to create a set of principles for open government data – the event that started the open government data movement: 10 years

  • Countries participating in the Open Government Partnership today: 79 OGP participating countries and 20 subnational governments

  • Average “open data readiness” score in Europe according to the European Data Portal: 72%

    • Open data readiness consists of four indicators which are presence of policy, national coordination, licensing norms, and use of data.

  • Number of U.S. cities with Open Data portals: 27

  • Number of governments who have adopted the International Open Data Charter: 62

  • Number of non-state organizations endorsing the International Open Data Charter: 57

  • Number of countries analyzed by the Open Data Index: 94

  • Number of Latin American countries that do not have open data portals as of 2017: 4 total – Belize, Guatemala, Honduras and Nicaragua

  • Number of cities participating in the Open Data Census: 39

Demand for Open Data

  • Open data demand measured by frequency of open government data use according to The Economist Intelligence Unit report:

    • Australia

      • Monthly: 15% of respondents

      • Quarterly: 22% of respondents

      • Annually: 10% of respondents

    • Finland

      • Monthly: 28% of respondents

      • Quarterly: 18% of respondents

      • Annually: 20% of respondents

    • France

      • Monthly: 27% of respondents

      • Quarterly: 17% of respondents

      • Annually: 19% of respondents

    • India

      • Monthly: 29% of respondents

      • Quarterly: 20% of respondents

      • Annually: 10% of respondents

    • Singapore

      • Monthly: 28% of respondents

      • Quarterly: 15% of respondents

      • Annually: 17% of respondents 

    • UK

      • Monthly: 23% of respondents

      • Quarterly: 21% of respondents

      • Annually: 15% of respondents

    • US

      • Monthly: 16% of respondents

      • Quarterly: 15% of respondents

      • Annually: 20% of respondents

  • Number of FOIA requests received in the US for fiscal year 2017: 818,271

  • Number of FOIA requests processed in the US for fiscal year 2017: 823,222

  • Distribution of FOIA requests in 2017 among the top 5 agencies with the highest number of requests:

    • DHS: 45%

    • DOJ: 10%

    • NARA: 7%

    • DOD: 7%

    • HHS: 4%

Examining Datasets

  • Country with highest index score according to ODB Leaders Edition: Canada (76 out of 100)

  • Country with lowest index score according to ODB Leaders Edition: Sierra Leone (22 out of 100)

  • Share of datasets that are open in the top 30 governments according to ODB Leaders Edition: fewer than 1 in 5

  • Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition: 19%

  • Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition by sector/subject:

    • Budget: 30%

    • Companies: 13%

    • Contracts: 27%

    • Crime: 17%

    • Education: 13%

    • Elections: 17%

    • Environment: 20%

    • Health: 17%

    • Land: 7%

    • Legislation: 13%

    • Maps: 20%

    • Spending: 13%

    • Statistics: 27%

    • Trade: 23%

    • Transport: 30%

  • Percentage of countries that release data on government spending according to ODB Leaders Edition: 13%

  • Percentage of government data that is updated at regular intervals according to ODB Leaders Edition: 74%


  • Percentage of datasets classed as “open” in the 94 places worldwide analyzed by the Open Data Index: 11%

  • Percentage of open datasets in the Caribbean, according to Open Data Census: 7%

  • Number of companies whose data is available through OpenCorporates: 158,589,950
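The 19% average open-dataset share quoted above is consistent with the per-sector breakdown; a quick arithmetic check:

```python
# Reproduce the ~19% average open-dataset share from the per-sector
# figures listed above (ODB Leaders Edition, top 30 open data governments).
sectors = {
    "Budget": 30, "Companies": 13, "Contracts": 27, "Crime": 17,
    "Education": 13, "Elections": 17, "Environment": 20, "Health": 17,
    "Land": 7, "Legislation": 13, "Maps": 20, "Spending": 13,
    "Statistics": 27, "Trade": 23, "Transport": 30,
}
average = sum(sectors.values()) / len(sectors)
print(f"{average:.1f}%")  # → 19.1%, matching the 19% headline figure
```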

City Open Data

  • New York City

  • Singapore

    • Number of datasets published in Singapore: 1,480

    • Percentage of datasets with standardized format: 35%

    • Percentage of datasets made as raw as possible: 25%

  • Barcelona

    • Number of datasets published in Barcelona: 443

    • Open data demand in Barcelona measured by:

      • Number of unique sessions in the month of September 2018: 5,401

    • Quality of datasets published in Barcelona according to Tim Berners-Lee’s 5-star Open Data scheme: 3 stars

  • London

    • Number of datasets published in London: 762

    • Number of data requests since October 2014: 325

  • Bandung

    • Number of datasets published in Bandung: 1,417

  • Buenos Aires

    • Number of datasets published in Buenos Aires: 216

  • Dubai

    • Number of datasets published in Dubai: 267

  • Melbourne

    • Number of datasets published in Melbourne: 199

Sources

  • About OGP, Open Government Partnership. 2018.  

Creating Smart Cities


Book edited by Claudio Coletta, Leighton Evans, Liam Heaphy, and Rob Kitchin: “In cities around the world, digital technologies are utilized to manage city services and infrastructures, to govern urban life, to solve urban issues and to drive local and regional economies. While “smart city” advocates are keen to promote the benefits of smart urbanism – increased efficiency, sustainability, resilience, competitiveness, safety and security – critics point to the negative effects, such as the production of technocratic governance, the corporatization of urban services, technological lock-ins, privacy harms and vulnerability to cyberattack.

This book, through a range of international case studies, suggests social, political and practical interventions that would enable more equitable and just smart cities, reaping the benefits of smart city initiatives while minimizing some of their perils.

Included are case studies from Ireland, the United States of America, Colombia, the Netherlands, Singapore, India and the United Kingdom. These chapters discuss a range of issues including political economy, citizenship, standards, testbedding, urban regeneration, ethics, surveillance, privacy and cybersecurity. This book will be of interest to urban policymakers, as well as researchers in Regional Studies and Urban Planning…(More)”.

What is the true value of data? New series on the return on investment of data interventions


Case studies prepared by Jessica Espey and Hayden Dahmm for SDSN TReNDS: “But what is the ROI of investing in data for altruistic means–e.g., for sustainable development?

Today, we are launching a series of case studies to answer this question in collaboration with the Global Partnership on Sustainable Development Data. The ten examples we will profile range from earth observation data gathered via satellites to investments in national statistics systems, with costs from just a few hundred thousand dollars (US) per year to millions over decades.

The series includes efforts to revamp existing statistical systems. It also supports the growing movement to invest in less traditional approaches to data collection and analysis beyond statistical systems–such as through private sector data sources or emerging technologies enabled by the growth of the information and communications technology (ICT) sector.

Some highlights from the first five case studies–available now:

An SMS-based system called mTRAC, implemented in Uganda, has supported significant improvements in the country’s health system–including halving of response time to disease outbreaks and reducing medication stock-outs, the latter of which resulted in fewer malaria-related deaths.

NASA’s and the U.S. Geological Survey’s Landsat program–satellites that provide imagery known as earth observation data–is enabling discoveries and interventions across the science and health sectors, and provided an estimated worldwide economic benefit as high as US$2.19 billion as of 2011.

BudgIT, a civil society organization making budget data in Nigeria more accessible to citizens through machine-readable PDFs and complementary online/offline campaigns, is empowering citizens to partake in the federal budget process.

International nonprofit BRAC is ensuring mothers and infants in the slums of Bangladesh are not left behind through a data-informed intervention combining social mapping, local censuses, and real-time data sharing. BRAC estimates that from 2008 to 2017, 1,087 maternal deaths were averted out of the 2,476 deaths that would have been expected based on national statistics.

Atlantic City police are developing new approaches to their patrolling, community engagement, and other activities through risk modeling based on crime and other data, resulting in reductions in homicides and shooting injuries (26 percent) and robberies (37 percent) in just the first year of implementation….(More)”.

Google is using AI to predict floods in India and warn users


James Vincent at The Verge: “For years Google has warned users about natural disasters by incorporating alerts from government agencies like FEMA into apps like Maps and Search. Now, the company is making predictions of its own. As part of a partnership with the Central Water Commission of India, Google will now alert users in the country about impending floods. The service is currently available only in the Patna region, with the first alert going out earlier this month.

As Google’s engineering VP Yossi Matias outlines in a blog post, these predictions are being made using a combination of machine learning, rainfall records, and flood simulations.

“A variety of elements — from historical events, to river level readings, to the terrain and elevation of a specific area — feed into our models,” writes Matias. “With this information, we’ve created river flood forecasting models that can more accurately predict not only when and where a flood might occur, but the severity of the event as well.”
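Google has not published the model itself, but Matias’s quote suggests the kinds of inputs it combines. A deliberately toy sketch (the functional form and every weight below are invented for illustration, not Google’s actual model):

```python
# Toy sketch only: combining the inputs Matias describes -- river level
# readings, rainfall, terrain elevation -- into a single flood-risk score.
# All weights and thresholds here are invented for illustration.
import math

def flood_risk(river_level_m, rainfall_mm_24h, elevation_m):
    """Return a 0-1 risk score via a hand-weighted logistic function."""
    # Higher river levels and rainfall raise risk; higher elevation lowers it.
    z = 1.2 * (river_level_m - 5.0) + 0.02 * rainfall_mm_24h - 0.05 * elevation_m
    return 1 / (1 + math.exp(-z))

print(f"{flood_risk(river_level_m=7.5, rainfall_mm_24h=120, elevation_m=40):.2f}")
# → 0.97 (a swollen river after heavy rain on low ground scores high)
```

The real system replaces these hand-picked weights with parameters learned from historical flood events and runs physical flood simulations on top, but the structure, many heterogeneous signals reduced to a calibrated probability, is the same.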

The US tech giant announced its partnership with the Central Water Commission back in June. The two organizations agreed to share technical expertise and data to work on the predictions, with the Commission calling the collaboration a “milestone in flood management and in mitigating the flood losses.” Such warnings are particularly important in India, where 20 percent of the world’s flood-related fatalities are estimated to occur….(More)”.

Satellite Images and Shadow Analysis: How The Times Verifies Eyewitness Videos


Christoph Koettl at the New York Times: “Was a video of a chemical attack really filmed in Syria? What time of day did an airstrike happen? Which military unit was involved in a shooting in Afghanistan? Is this dramatic image of glowing clouds really showing wildfires in California?

These are some of the questions the video team at The New York Times has to answer when reviewing raw eyewitness videos, often posted to social media. It can be a highly challenging process, as misinformation shared through digital social networks is a serious problem for a modern-day newsroom. Visual information in the digital age is easy to manipulate, and even easier to spread.

Conducting visual investigations based on social media content thus requires a mix of traditional journalistic diligence and cutting-edge internet skills, as can be seen in our recent investigation into the chemical attack in Douma, Syria.

The following provides some insight into our video verification process. It is not a comprehensive overview, but highlights some of our most trusted techniques and tools….(More)”.
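Of the techniques named in the headline, shadow analysis is the most mechanical: on flat ground, an object and its shadow fix the sun’s elevation, which a solar calculator can then map to a time of day for the claimed location and date. A minimal sketch of the trigonometry (the lamppost and its measurements are hypothetical):

```python
# Shadow analysis in one function: infer solar elevation from an object's
# height and the length of its shadow, assuming flat ground.
import math

def sun_elevation_deg(object_height_m, shadow_length_m):
    """Solar elevation angle implied by an object and its shadow."""
    return math.degrees(math.atan2(object_height_m, shadow_length_m))

# A hypothetical 3 m lamppost casting a 5.2 m shadow implies a sun
# elevation of ~30 degrees; cross-checking that angle against a solar
# position calculator for the claimed location and date constrains
# when the footage could actually have been filmed.
print(f"{sun_elevation_deg(3.0, 5.2):.1f}")  # → 30.0
```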

Rohingya turn to blockchain to solve identity crisis


Skot Thayer and Alex Hern at the Guardian: “Rohingya refugees are turning to blockchain-type technology to help address one of their most existential threats: lack of officially recognised identity.

Denied citizenship in their home country of Myanmar for decades, the Muslim minority was the target of a brutal campaign of violence by the military which culminated a year ago this week. A “clearance operation” led by Buddhist militia sent more than 700,000 Rohingya pouring over the border into Bangladesh, without passports or official ID.

The Myanmar government has since agreed to take the Rohingya back, but is refusing to grant them citizenship. Many Rohingya do not want to return and face life without a home or an identity. This growing crisis prompted Muhammad Noor and his team at the Rohingya Project to try to find a digital solution.

“Why does a centralised entity like a bank or government own my identity,” says Noor, a Rohingya community leader based in Kuala Lumpur. “Who are they to say if I am who I am?”

Using blockchain-based technology, Noor is trialling the use of digital identity cards that aim to help Rohingya in Malaysia, Bangladesh and Saudi Arabia access services such as banking and education. The hope is that successful trials might lead to a system that can help the community across southeast Asia.

Under the scheme, a blockchain database is used to record individual digital IDs, which can then be issued to people once they have taken a test to verify that they are genuine Rohingya….
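The excerpt above describes the core mechanism: verified identity records written to a tamper-evident ledger. The sketch below illustrates the general idea with a simple hash chain in plain Python; it is illustrative only, and the field names are our assumptions, not reported details of the Rohingya Project’s design:

```python
# Minimal hash-chain sketch of a blockchain-backed ID registry: each
# verified record is chained to the previous one by hash, so earlier
# entries cannot be silently altered. Illustrative only.
import hashlib
import json

def add_record(chain, record):
    """Append a record, chaining it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    chain.append({"record": record, "prev": prev,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; any edit to an earlier record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps({"record": block["record"], "prev": prev},
                             sort_keys=True)
        expected = hashlib.sha256(payload.encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

ledger = []
add_record(ledger, {"id": "RP-0001", "verified": True})  # hypothetical IDs
add_record(ledger, {"id": "RP-0002", "verified": True})
print(verify(ledger))                    # True
ledger[0]["record"]["verified"] = False  # tamper with an earlier entry
print(verify(ledger))                    # False: the chain no longer checks out
```

A real deployment would distribute this ledger across many nodes so no single entity, bank or government, could rewrite it unilaterally, which is exactly the property Noor’s quote is after.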

Blockchain-based initiatives, such as the Rohingya Project, could eventually allow people to build the network of relationships necessary to participate in the modern global economy and prevent second and third generation “invisible” people from slipping into poverty. It could also allow refugees to send money across borders, bypassing high transaction fees.

In Jordan’s Azraq refugee camp, the United Nations World Food Programme (WFP) is using blockchain and biometrics to help Syrian refugees to purchase groceries using a voucher system. This use of the technology allows the WFP to bypass bank fees.

But Tey Al-Rjula, co-founder of Tykn, a startup building blockchain-based digital identity tools, says privacy is still an issue. “The technology is maturing, yet implementation by startups and emerging tech companies is still lacking,” he says.

The involvement of a trendy technology such as blockchains can often be enough to secure the funding, attention and support that start-ups – whether for-profit or charitable – need to thrive. But companies such as Tykn still have to tackle plenty of the same issues as their old-fashioned database-using counterparts, from convincing governments and NGOs to use their services in the first place to working out how to make enough overhead to pay staff, while also dealing with the fickle issues of building on a cutting-edge platform.

Blockchain-based humanitarian initiatives will also need to reckon with the problem of accountability in their efforts to aid refugees and those trapped in the limbo of statelessness.

Dilek Genc, a PhD candidate at the University of Edinburgh who studies blockchain-type applications in humanitarian aid and development, says that if the aid community continues to push innovation using Silicon Valley’s creed of “fail fast and often” and experiments on vulnerable peoples, it will be fundamentally at odds with humanitarian principles and will fail to address the political roots of the issues facing refugees…(More)”.

Palaces for the People: How Social Infrastructure Can Help Fight Inequality, Polarization, and the Decline of Civic Life


An Overview of National AI Strategies


Medium Article by Tim Dutton: “The race to become the global leader in artificial intelligence (AI) has officially begun. In the past fifteen months, Canada, China, Denmark, the EU Commission, Finland, France, India, Italy, Japan, Mexico, the Nordic-Baltic region, Singapore, South Korea, Sweden, Taiwan, the UAE, and the UK have all released strategies to promote the use and development of AI. No two strategies are alike, with each focusing on different aspects of AI policy: scientific research, talent development, skills and education, public and private sector adoption, ethics and inclusion, standards and regulations, and data and digital infrastructure.

This article summarizes the key policies and goals of each strategy, as well as related policies and initiatives that have been announced since the release of the initial strategies. It also includes countries that have announced their intention to develop a strategy or that have related AI policies in place….(More)”.