What Can Satellite Imagery Tell Us About Obesity in Cities?


Emily Matchar at Smithsonian: “About 40 percent of American adults are obese, defined as having a body mass index (BMI) over 30. But obesity is not evenly distributed around the country. Some cities and states have far more obese residents than others. Why? Genetics, stress, income levels and access to healthy foods all play a role. But increasingly researchers are looking at the built environment—our cities—to understand why people are fatter in some places than in others.

New research from the University of Washington attempts to take this approach one step further by using satellite data to examine cityscapes. By combining the satellite images with obesity data, the researchers hope to uncover which urban features might influence a city’s obesity rate.

The researchers used a deep learning network to analyze about 150,000 high-resolution satellite images of four cities: Los Angeles, Memphis, San Antonio and Seattle. The cities were selected for being from states with both high obesity rates (Texas and Tennessee) and low obesity rates (California and Washington). The network extracted features of the built environment: crosswalks, parks, gyms, bus stops, fast food restaurants—anything that might be relevant to health.

“If there’s no sidewalk you’re less likely to go out walking,” says Elaine Nsoesie, a professor of global health at the University of Washington who led the research.

The team’s algorithm could then see what features were more or less common in areas with greater and lesser rates of obesity. Some findings were predictable: more parks, gyms and green spaces were correlated with lower obesity rates. Others were surprising: more pet stores equaled thinner residents (“a high density of pet stores could indicate high pet ownership, which could influence how often people go to parks and take walks around the neighborhood,” the team hypothesized).
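The study itself ran a deep learning pipeline over 150,000 images; the sketch below illustrates only the final association step it describes, correlating per-area feature counts with obesity rates. All neighborhood counts and rates here are invented for illustration.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical built-environment feature counts for four neighborhoods.
features = {
    "parks": [12, 8, 3, 1],
    "pet_stores": [9, 6, 2, 1],
    "fast_food": [4, 7, 11, 15],
}
obesity_rate = [0.18, 0.22, 0.30, 0.36]  # invented per-neighborhood rates

# Correlate each feature with the obesity rate, most negative first.
corr = {name: pearson(counts, obesity_rate) for name, counts in features.items()}
for name, r in sorted(corr.items(), key=lambda kv: kv[1]):
    print(f"{name:>10}: r = {r:+.2f}")
```

With data shaped like the study’s findings, parks and pet stores come out negatively correlated with obesity and fast food positively, though correlation alone says nothing about causation.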

A paper on the results was recently published in the journal JAMA Network Open….(More)”.

Emerging Labour Market Data Sources towards Digital Technical and Vocational Education and Training (TVET)


Paper by Nikos Askitas, Rafik Mahjoubi, Pedro S. Martins, Koffi Zougbede for Paris21/OECD: “Experience from both technology and policy making shows that solutions for labour market improvements are simply choices of new, more tolerable problems. All data solutions supporting digital Technical and Vocational Education and Training (TVET) will have to incorporate a roadmap of changes rather than an unrealistic super-solution. The ideal situation is a world in which labour market participants engage in intelligent strategic behavior in an informed, fair and sophisticated manner.

Labour market data captures transactions within labour market processes. In order to successfully capture such data, we need to understand the specifics of these market processes. Designing an ecosystem of labour market matching facilitators and rules of engagement for contributing to a lean and streamlined Logistics Management and Information System (LMIS) is the best way to create Big Data with context relevance. This is in contrast with pre-existing Big Data captured by global job boards or social media for which relevance is limited by the technology access gap and its variations across the developing world.

Network effects occur in technology and job facilitation, as seen in the developed world. Managing and instigating the right network effects might be crucial to avoid fragmented stagnation and inefficiency. This is key to avoid throwing money behind wrong choices that do not gain traction.

A mixed mode approach is possibly the ideal approach for developing countries. Mixing offline and online elements correctly will be crucial in bridging the technology access gap and reaping the benefits of digitisation at the same time.

Properly incentivising the various entities is critical for progression, most of all the private sector, which is significantly more agile and inventive, has “skin in the game” and a long-term commitment to conditions in the field, has intimate knowledge of how to solve the technology gap and brings a better understanding of the particular ambient context it is operating in. To summarise: Big Data starts small.

Managing expectations and creating incentives for the various stakeholders will be crucial in establishing digitally supported TVET. Developing the right business models will be crucial in the short term and beyond, and it will be the result of creating the right mix of technological and policy expertise with good knowledge of the situation on the ground….(More)”.

Multistakeholder Governance and Democracy: A Global Challenge


Book by Harris Gleckman: “Multistakeholder governance is proposed as the way forward in global governance. For some leaders in civil society and government who are frustrated with the lack of power of the UN system and multilateralism it is seen as an attractive alternative; others, particularly in the corporate world, see multistakeholder governance as offering a more direct hand and potentially a legitimate role in national and global governance.

This book examines how the development of multistakeholderism poses a challenge to multilateralism and democracy. Using a theoretical, historical perspective it describes how the debate on global governance evolved and what working principles of multilateralism are under threat. From a sociological perspective, the book identifies the organizational beliefs of multistakeholder groups and the likely change in the roles that leaders in government, civil society, and the private sector will face as they evolve into potential global governors. From a practical perspective, the book addresses the governance issues which organizations and individuals should assess before deciding to participate in or support a particular multistakeholder group.

Given the current emphasis on the participation of multiple actors in the Sustainable Development Goals, this book will have wide appeal across policy-making and professional sectors involved in negotiations and governance at all levels. It will also be essential reading for students studying applied governance….(More)”.

Crowdsourced social media data for disaster management: Lessons from the PetaJakarta.org project


R. I. Ogie, R. J. Clarke, H. Forehead and P. Perez in Computers, Environment and Urban Systems: “The application of crowdsourced social media data in flood mapping and other disaster management initiatives is a burgeoning field of research, but not one that is without challenges. In identifying these challenges and in making appropriate recommendations for future direction, it is vital that we learn from the past by taking a constructively critical appraisal of highly-praised projects in this field, which through real-world implementations have pioneered the use of crowdsourced geospatial data in modern disaster management. These real-world applications represent natural experiments, each with myriad lessons that cannot be easily gained from computer-confined simulations.

This paper reports on lessons learnt from a 3-year implementation of a highly-praised project: the PetaJakarta.org project. The lessons presented derive from the key success factors and the challenges associated with the PetaJakarta.org project. To contribute in addressing some of the identified challenges, desirable characteristics of future social media-based disaster mapping systems are discussed. It is envisaged that the lessons and insights shared in this study will prove invaluable within the broader context of designing socio-technical systems for crowdsourcing and harnessing disaster-related information….(More)”.

To turn the open data revolution from idea to reality, we need more evidence


Stefaan Verhulst at apolitical: “The idea that we are living in a data age — one characterised by unprecedented amounts of information with unprecedented potential — has become mainstream. We regularly read “data is the new oil,” or “data is the most valuable commodity in the global economy.”

Doubtlessly, there is truth in these statements. But a major, often unacknowledged problem is how much data remains inaccessible, hidden in siloes and behind walls.

For close to a decade, the technology and public interest community has pushed the idea of open data. At its core, open data represents a new paradigm of information and information access.

Rooted in notions of an information commons — developed by scholars like Nobel Prize winner Elinor Ostrom — and borrowing from the language of open source, open data begins from the premise that data collected from the public, often using public funds or publicly funded infrastructure, should also belong to the public — or at least, be made broadly accessible to those pursuing public-interest goals.

The open data movement has reached significant milestones in its short history. An ever-increasing number of governments across both developed and developing economies have released large datasets for the public’s benefit….

Similarly, a growing number of private companies have created “Data Collaboratives”, leveraging their data — with varying degrees of limitation — to serve the public interest.

Despite such initiatives, many open data projects (and data collaboratives) remain fledgling. The field has trouble scaling projects beyond initial pilots. In addition, many potential stakeholders — private sector and government “owners” of data, as well as public beneficiaries — remain sceptical of open data’s value. Such limitations need to be overcome if open data and its benefits are to spread. We need hard evidence of its impact.

Ironically, the field is held back by an absence of good data on open data — that is, a lack of reliable empirical evidence that could guide new initiatives.

At the GovLab, a do-tank at New York University, we study the impact of open data. One of our overarching conclusions is that we need a far more solid evidence base to move open data from being a good idea to reality.

What do we know? Several initiatives undertaken at the GovLab offer insight. Our ODImpact website now includes more than 35 detailed case studies of open government data projects. These examples provide powerful evidence not only that open data can work but also about how it works….

We have also launched an Open Data Periodic Table to better understand what conditions predispose an open data project toward success or failure. For example, a clear problem definition, together with the capacity and culture to carry out open data projects, is vital. Successful projects also build cross-sector partnerships around open data and its potential uses, establish practices to assess and mitigate risks, and have transparent and responsive governance structures….(More)”.

Google is using AI to predict floods in India and warn users


James Vincent at The Verge: “For years Google has warned users about natural disasters by incorporating alerts from government agencies like FEMA into apps like Maps and Search. Now, the company is making predictions of its own. As part of a partnership with the Central Water Commission of India, Google will now alert users in the country about impending floods. The service is currently available only in the Patna region, with the first alert going out earlier this month.

As Google’s engineering VP Yossi Matias outlines in a blog post, these predictions are being made using a combination of machine learning, rainfall records, and flood simulations.

“A variety of elements — from historical events, to river level readings, to the terrain and elevation of a specific area — feed into our models,” writes Matias. “With this information, we’ve created river flood forecasting models that can more accurately predict not only when and where a flood might occur, but the severity of the event as well.”
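Google’s actual system combines machine learning models with physical flood simulations; as a purely illustrative stand-in for the idea of turning river readings and rainfall into a graded alert, here is a hand-written threshold rule. The function name, thresholds and units are all invented.

```python
def flood_severity(river_level_m, flood_stage_m, rainfall_mm_24h):
    """Toy classifier: grade flood risk from a river gauge reading,
    the local flood stage, and 24-hour rainfall. Illustrative only;
    not Google's model, which is learned from many more inputs."""
    margin = river_level_m - flood_stage_m  # how far above flood stage
    if margin >= 1.0 or (margin >= 0 and rainfall_mm_24h > 100):
        return "severe"    # well over flood stage, or over it and still raining hard
    if margin >= 0:
        return "moderate"  # at or just above flood stage
    if margin >= -0.5 and rainfall_mm_24h > 50:
        return "watch"     # close to flood stage with heavy recent rain
    return "normal"

print(flood_severity(6.2, 5.0, 10))   # far above flood stage
print(flood_severity(4.7, 5.0, 60))   # below stage but rising risk
```

A learned model replaces these hard-coded cutoffs with parameters fitted to historical events, which is what lets it predict severity as well as timing and location.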

The US tech giant announced its partnership with the Central Water Commission back in June. The two organizations agreed to share technical expertise and data to work on the predictions, with the Commission calling the collaboration a “milestone in flood management and in mitigating the flood losses.” Such warnings are particularly important in India, where 20 percent of the world’s flood-related fatalities are estimated to occur….(More)”.

Mission Failure


Matthew Sawh at Stanford Social Innovation Review: “Exposing the problems of policy schools can ignite new ways to realize the mission of educating public servants in the 21st century….

Public policy schools were founded with the aim of educating public servants with academic insights that could be applied to government administration. And while these programs have adapted the tools and vocabularies of the Reagan Revolution, such as the use of privatization and the rhetoric of competition, they have not come to terms with Reagan’s philosophical legacy, which shapes our contemporary political culture. To do so, public policy schools need to acknowledge that the public sees government not as the solution to society’s ills but as the problem. Today, these programs need to ask how decision makers should improve the design of their organizations, their decision-making processes, and their curricula in order to address the public’s skeptical mindset.

I recently attended a public policy school, Columbia University’s School of International and Public Affairs (SIPA), hoping to learn how to bridge the distrust between public servants and citizens, and to help forge bonds between bureaucracies and voters who feel ignored by their government officials. Instead of building bridges across these divides, the curriculum of my policy program reinforced them—training students to navigate bureaucratic silos in our democracy. Of course, public policy students go to work in the government we have, not the government we wish we had—but that’s the point. These schools should lead the national conversation and equip their graduates to think and act beyond the divides between the governing and the governed.

Most US public policy programs require a core set of courses, including macroeconomics, microeconomics, statistics, and organizational management. SIPA has broader requirements, including a financial management course, a client consulting workshop, and an internship. Both sets of core curricula undervalue the intrapersonal and interpersonal elements of leadership, particularly politics, which I define as persuasion within groups and institutions.

Public service is more than developing smart ideas; it entails the ability to marshal the financial, political, and organizational support to make those ideas resonate with the public and take effect in government policy. Unfortunately, by giving short shrift to the intrapersonal and institutional contexts of real changemaking, these programs aren’t adequately training early-career professionals to implement their ideas.

Within the core curriculum, the story of change is told as the product of processes wherein policymakers can know the rational expectations of the public. But the people themselves have concerns beyond those perceived by policymakers. As public servants, our success depends on our ability to meet people where they are, rather than where we suppose they should be.  …

Public policy schools must reach a consensus on core identity questions: Who is best placed to lead a policy school? What are their aims in crafting a professional class? What exactly should a policy degree mean in the wider world? The problem is that these programs are meant to teach students not only the science of good government but also the human art of good governance.

Curricula based on an outdated sense of both the political process and advocacy are a predominant feature of policy programs. Instead, core courses should cover how to advocate effectively in the new political world of the 21st century. Students should learn how to raise money for a political campaign; how to lobby; how to make an advertising budget; and how to purchase airtime in the digital age…(More)”

The Stoplight Battling to End Poverty


Nick Dall at OZY: “Over midafternoon coffees and Fantas, Robyn-Lee Abrahams and Joyce Paulse — employees at my local supermarket in Cape Town, South Africa — tell me how their lives have changed in the past 18 months. “I never dreamed my daughter would go to college,” says Paulse. “But yesterday we went online together and started filling in the forms.”

Abrahams notes how she used to live hand to mouth. “But now I’ve got a savings account, which I haven’t ever touched.” The sacrifice? “I eat less chocolate now.”

Paulse and Abrahams are just two of thousands of beneficiaries of the Poverty Stoplight, a self-evaluation tool that’s now redefining poverty in countries as diverse as Argentina and the U.K.; Mexico and Tanzania; Chile and Papua New Guinea. By getting families to rank their own economic condition red, yellow or green based upon 50 indicators, the Poverty Stoplight gives families the agency to pull themselves out of poverty and offers organizations insight into whether their programs are working.

Social entrepreneur Martín Burt, who founded Fundación Paraguaya 33 years ago to promote entrepreneurship and economic empowerment in Paraguay, developed the first, paper-based prototype of the Poverty Stoplight in 2010 to help the organization’s microfinance clients escape the poverty cycle….Because poverty is multidimensional, “you can have a family with a proper toilet but no savings,” points out Burt. The questionnaires span six different aspects of people’s lives, including softer indicators such as community involvement, self-confidence and family violence. The survey, a series of 50 multiple-choice questions with visual cues, is aimed at households, not individuals, because “you cannot get a 10-year-old girl out of poverty in isolation,” says Burt. Confidentiality is another critical component….(More)”.
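The red/yellow/green self-evaluation described above amounts to a simple tally per household. The sketch below assumes a made-up subset of dimensions and indicators (the real survey has 50 indicators across six dimensions) and flags the red items a family might prioritise.

```python
RED, YELLOW, GREEN = "red", "yellow", "green"

# Hypothetical answers for one household; indicator and dimension
# names are invented stand-ins, not the survey's actual wording.
responses = {
    "Income & Employment": {"savings": RED, "diverse income": YELLOW},
    "Health & Environment": {"proper toilet": GREEN, "clean water": GREEN},
    "Motivation & Community": {"self-confidence": YELLOW},
}

def stoplight_summary(responses):
    """Tally colours across all indicators and list the red ones."""
    tally = {RED: 0, YELLOW: 0, GREEN: 0}
    priorities = []
    for dimension, indicators in responses.items():
        for indicator, colour in indicators.items():
            tally[colour] += 1
            if colour == RED:
                priorities.append((dimension, indicator))
    return tally, priorities

tally, priorities = stoplight_summary(responses)
print(tally)       # colour counts for the household
print(priorities)  # red indicators to tackle first
```

Keeping the scoring at household level, as Burt insists, means the priorities list describes a family’s situation as a whole rather than ranking individuals.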

The rush for data risks growing the North-South divide


Laura Mann and Gianluca Lazzolino at SciDevNet: “Across the world, tech firms and software developers are embedding digital platforms into humanitarian and commercial infrastructures. There’s Jembi and Hello Doctor for the healthcare sector, for example; SASSA and Tamween for social policy; and M-farm, i-Cow and Esoko, among many others, for agriculture.

While such systems proliferate, it is time we asked some tough questions about who is controlling this data, and for whose benefit. There is a danger that ‘platformisation’ widens the knowledge gap between firms and scientists in poorer countries and those in more advanced economies.

Digital platforms serve three purposes. They improve interactions between service providers and users; gather transactional data about those users; and nudge them towards behaviours, activities and products considered ‘virtuous’, profitable, or valued — often because they generate more data. This data can be extremely valuable to policy-makers interested in developing interventions, to researchers exploring socio-economic trends and to businesses seeking new markets.

But the development and use of these platforms are not always benign.

Knowledge and power

Digital technologies are knowledge technologies because they record the personal information, assets, behaviour and networks of the people that use them.

Knowledge has a somewhat gentle image of a global good shared openly and evenly across the world. But in reality, it is competitive.

Simply put, knowledge shapes economic rivalry between rich and poor countries. It influences who has power over the rules of the economic game, and it does this in three key ways.

First, firms can use knowledge and technology to become more efficient and competitive in what they do. For example, a farmer can choose to buy technologically enhanced seeds, inputs such as fertilisers, and tools to process their crop.

This technology transfer is not automatic — the farmer must first invest time to learn how to use these tools.  In this sense, economic competition between nations is partly about how well-equipped their people are in using technology effectively.

The second key way in which knowledge impacts global economic competition depends on looking at development as a shift from cut-throat commodity production towards activities that bring higher profits and wages.

In farming, for example, development means moving out of crop production alone into a position of having more control over agricultural inputs, and more involvement in distributing or marketing agricultural goods and services….(More)”.

The New York City Business Atlas: Leveling the Playing Field for Small Businesses with Open Data


Chapter by Stefaan Verhulst and Andrew Young in Smarter New York City:How City Agencies Innovate. Edited by André Corrêa d’Almeida: “While retail entrepreneurs, particularly those operating in the small-business space, are experts in their respective trades, they often lack access to high-quality information about social, environmental, and economic conditions in the neighborhoods where they operate or are considering operating.

The New York City Business Atlas, conceived by the Mayor’s Office of Data Analytics (MODA) and the Department of Small Business Services, is designed to alleviate that information gap by providing a public web-based tool that gives small businesses access to high-quality data to help them decide where to establish a new business or expand an existing one. The tool brings together a diversity of data, including business-filing data from the Department of Consumer Affairs, sales-tax data from the Department of Finance, demographic data from the census, and traffic data from Placemeter, a New York City startup focusing on real-time traffic information.
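At its core, the kind of integration the Atlas performs is an outer join of per-neighborhood records from separate city datasets. The sketch below uses invented dataset names, neighborhoods and figures purely to illustrate that join, including how areas without coverage (e.g. no traffic sensor) surface as gaps.

```python
# Invented mini-datasets keyed by neighborhood; real Atlas sources
# include business filings, sales tax, census and traffic data.
business_filings = {"Astoria": 120, "Harlem": 95, "Red Hook": 40}
sales_tax_millions = {"Astoria": 1.8, "Harlem": 1.2, "Red Hook": 0.4}
daily_foot_traffic = {"Astoria": 5400, "Harlem": 7100}  # partial sensor coverage

def merge_by_neighborhood(**datasets):
    """Outer-join several {neighborhood: value} maps into one table;
    neighborhoods missing from a dataset get None for that column."""
    areas = set().union(*(d.keys() for d in datasets.values()))
    return {
        area: {name: d.get(area) for name, d in datasets.items()}
        for area in sorted(areas)
    }

atlas = merge_by_neighborhood(
    filings=business_filings,
    sales_tax=sales_tax_millions,
    traffic=daily_foot_traffic,
)
print(atlas["Red Hook"])  # missing sensor coverage shows up as None
```

An outer join matters here because a neighborhood absent from one agency’s dataset should still appear in the tool rather than silently disappear.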

The initial iteration of the Business Atlas made useful and previously inaccessible data available to small-business owners and entrepreneurs in an innovative manner. After a few years, however, it became clear that the tool was not experiencing the level of use or creating the level of demonstrable impact anticipated. Rather than continuing down the same path or abandoning the effort entirely, MODA pivoted to a new approach, moving from the Business Atlas as a single information-providing tool to the Business Atlas as a suite of capabilities aimed at bolstering New York’s small-business community.

Through problem- and user-centered efforts, the Business Atlas is now making important insights available to stakeholders who can put them to meaningful use—from how long it takes to open a restaurant in the city to which areas are most in need of education and outreach to improve their code compliance. This chapter considers the open data environment from which the Business Atlas was launched, details the initial version of the Business Atlas and the lessons it generated, and describes the pivot to this new approach….(More)”.