Nowcasting the Local Economy: Using Yelp Data to Measure Economic Activity


Paper by Edward L. Glaeser, Hyunjin Kim and Michael Luca: “Can new data sources from online platforms help to measure local economic activity? Government datasets from agencies such as the U.S. Census Bureau provide the standard measures of economic activity at the local level. However, these statistics typically appear only after multi-year lags, and the public-facing versions are aggregated to the county or ZIP code level. In contrast, crowdsourced data from online platforms such as Yelp are often contemporaneous and geographically finer than official government statistics. Glaeser, Kim, and Luca present evidence that Yelp data can complement government surveys by measuring economic activity in close to real time, at a granular level, and at almost any geographic scale. Changes in the number of businesses and restaurants reviewed on Yelp can predict changes in the number of overall establishments and restaurants in County Business Patterns. An algorithm using contemporaneous and lagged Yelp data can explain 29.2 percent of the residual variance after accounting for lagged CBP data, in a testing sample not used to generate the algorithm. The algorithm is more accurate for denser, wealthier, and more educated ZIP codes….(More)”.
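The paper's two-stage idea — first soak up what lagged CBP data explains, then ask how much of the leftover variance contemporaneous and lagged Yelp signals account for on a held-out sample — can be sketched in a few lines. The data below is simulated and the linear specification is purely illustrative, not the authors' actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated ZIP-code panel (illustrative, not the paper's data):
lagged_cbp = rng.normal(size=n)                    # lagged CBP establishment growth
yelp_now = 0.5 * lagged_cbp + rng.normal(size=n)   # contemporaneous Yelp counts
yelp_lag = rng.normal(size=n)                      # lagged Yelp counts
cbp_growth = 0.6 * lagged_cbp + 0.3 * yelp_now + rng.normal(scale=0.5, size=n)

train, test = slice(0, 800), slice(800, None)

def ols_fit(X, y):
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef

def ols_predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

# Stage 1: residualize CBP growth on its own lag.
b1 = ols_fit(lagged_cbp[train, None], cbp_growth[train])
resid_train = cbp_growth[train] - ols_predict(b1, lagged_cbp[train, None])
resid_test = cbp_growth[test] - ols_predict(b1, lagged_cbp[test, None])

# Stage 2: explain the residual with contemporaneous and lagged Yelp data,
# evaluated on a held-out testing sample (mirroring the paper's 29.2% figure).
X = np.column_stack([yelp_now, yelp_lag])
b2 = ols_fit(X[train], resid_train)
pred = ols_predict(b2, X[test])

r2 = 1 - np.sum((resid_test - pred) ** 2) / np.sum((resid_test - resid_test.mean()) ** 2)
print(f"share of residual variance explained by Yelp: {r2:.2f}")
```

The held-out evaluation matters: the 29.2 percent figure is credible precisely because it comes from a testing sample not used to fit the algorithm.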

See all papers presented at the NBER Conference on Big Data for 21st Century Economic Statistics here.

Data Pools: Wi-Fi Geolocation Spoofing


AH Projects: “DataPools is a Wi-Fi geolocation spoofing project that virtually relocates your phone to the latitudes and longitudes of Silicon Valley success. It includes a catalog and a SkyLift device with 12 pre-programmed locations. DataPools was produced for the Tropez summer art event in Berlin and in collaboration with Anastasia Kubrak.

DataPools catalog pool index

Weren’t invited to Jeff Bezos’s summer pool party? No problem. DataPools uses the SkyLift device to mimic the Wi-Fi network infrastructure at the homes of 12 top Silicon Valley CEOs, causing your phone to show up, approximately, at their pool. Because Wi-Fi spoofing affects the core geolocation services of iOS and Android smartphones, all apps on the phone, and the metadata they generate, will be located in the spoofed location…

Data Pools is a metaphor for a store of wealth that is private. The luxurious pools and mansions of Silicon Valley are financed by the mechanisms of economic surveillance and ownership of our personal information. Yet, the geographic locations of these premises are often concealed, hidden, and removed from open source databases. What if we could reverse this logic and plunge into the pools of ludicrous wealth, both virtually and physically? Could we apply the same methods of data extraction to highlight the ridiculous inequalities between CEOs and platform users?

Comparison of wealth distribution among top Silicon Valley CEOs

Data

Technically, DataPools uses a Wi-Fi microcontroller programmed with the BSSIDs and SSIDs from the target locations, all obtained from openly published information in web searches and wigle.net. This data is then programmed onto the firmware of the SkyLift device. One SkyLift device contains all 12 pool locations. However, improvements were made throughout the installation, and the updated firmware now uses one main location with multiple sub-locations to cover a larger area. This method was more effective at spoofing many phones in a large area and is ideal for installations….(More)”.
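The core of the trick is just data: geolocation services map the set of BSSIDs a phone can currently see to coordinates, so broadcasting the right beacon frames relocates the phone. A minimal sketch of the kind of location table the firmware might carry — all SSIDs and BSSIDs below are made up, not the actual harvested networks:

```python
# Hypothetical firmware location table: each spoofed location is a set of
# (SSID, BSSID) pairs observed near the target. Wi-Fi positioning services
# resolve a scan of nearby BSSIDs to coordinates, so beaconing these
# networks relocates any phone that trusts Wi-Fi geolocation.
LOCATIONS = {
    "pool_01": [
        ("HomeNetwork-5G", "aa:bb:cc:00:00:01"),
        ("Guest", "aa:bb:cc:00:00:02"),
    ],
    "pool_02": [
        ("BackyardWiFi", "aa:bb:cc:00:01:01"),
    ],
}

def beacon_schedule(main, sublocations):
    """Combine one main location with sub-locations, as in the updated
    firmware, to cover a larger area during installations."""
    frames = list(LOCATIONS[main])
    for sub in sublocations:
        frames.extend(LOCATIONS[sub])
    return frames

frames = beacon_schedule("pool_01", ["pool_02"])
print(len(frames))  # number of distinct beacon frames to broadcast
```

On the actual device this table is baked into microcontroller firmware that transmits 802.11 beacon frames; the sketch only shows the data shape, not the radio layer.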

Finding Ctrl: Visions for the Future Internet


Nesta: “In March 2019, the World Wide Web turned thirty, and October will mark the fiftieth anniversary of the internet itself. These anniversaries offer us an important opportunity to reflect on the internet’s history, but also a chance to ponder its future.

While early internet pioneers dreamed of an internet that would be open, free and decentralised, the story of the internet today is mostly a story of loss of control. Just a handful of companies determine what we read, see and buy, where we work and where we live, who we vote for, who we love, and who we are. Many of us feel increasingly uneasy about these developments. We live in a world where new technologies happen to us; the average person has very little agency to change things within the current political and economic parameters.

Yet things don’t have to be this way. At a time when the future of the internet is usually painted as bleak and uncertain, we need positive visions of where we go next.

As part of the Next Generation Internet (NGI) initiative – the European Commission’s new flagship programme working on building a more democratic, inclusive and resilient internet – we have created this “visions book”, a collection of essays, short stories, poetry and artworks from over 30 contributors from 15 countries and five continents. Each contributor has a unique background, as most were selected via an open call for submissions held last autumn. As such, the book collects both established and emerging voices, all reflecting on the same crucial questions: where did we come from, but more importantly, where do we go next?

The NGI hopes to empower everyone to take active control in shaping the future: the internet does not just belong to those who hold power today, but to all of us….(More)”.

Reconnecting citizens with EU decision-making is possible – and needs to happen now


Opinion piece by Anthony Zacharzewski: “Maybe it’s the Brexit effect, or perhaps the memories of the great recession are fading, but in poll after poll, Europe’s citizens are saying that they feel more European and strongly supportive of EU membership. …

While sighs of relief can be heard from Schuman to Strasbourg, after a decade where the EU has bounced from crisis to crisis, the new Parliament and Commission will inherit a fragile and fractious Europe this year. One of their most important tasks will immediately be to connect EU citizens more closely to the institutions and their decision making….

The new European Commission and Parliament have the chance to change that, by adopting an ambitious open government agenda that puts citizen participation in decision making at its heart.

There are three things on our wish list for doing this.

The first thing on our list is an EU-wide commitment to policy making “in the open.” Built on a renewed commitment to transparency, it would set a unified approach to consultation and identify major policy areas where citizen involvement is both valuable and wanted. This could include issues such as migration and climate change. Member states, particularly those in the Open Government Partnership, have already developed a lot of good practice that can help inform this, while the Open Government Network for Europe, which brings together civil society and government voices, is ready to help.

Secondly, the connection to civil society and citizens also needs to be made beyond the European level, supporting and making use of the rapidly growing networks of democratic innovation at local level. Citizen participation is increasingly shifting from one-off events to a standing part of the governing system, and the European institutions need to listen to local conversations and support them with better information. Public Square, our own project run in partnership with mySociety and funded by Luminate, is a good example. It is working with local government and citizens to understand how meaningful citizen participation can become an everyday part of the way all local decision-making happens.

The last item on our wish list would be greater coherence between the institutions in Brussels and Strasbourg to better involve citizens. While the European Parliament, Commission and Council all have their different roles and prerogatives, without a co-ordinated approach, the attention and resources they have will be dissipated across multiple conversations. Most importantly, it will be harder to demonstrate to citizens that their contributions have made a difference….(More)”.

Computational Social Science of Disasters: Opportunities and Challenges


Paper by Annetta Burger, Talha Oz, William G. Kennedy and Andrew T. Crooks: “Disaster events and their economic impacts are trending, and climate projection studies suggest that the risks of disaster will continue to increase in the near future. Despite the broad and increasing social effects of these events, the empirical basis of disaster research is often weak, partially due to the natural paucity of observed data. At the same time, some of the early research regarding social responses to disasters has become outdated as social, cultural, and political norms have changed. The digital revolution, the open data trend, and the advancements in data science provide new opportunities for social science disaster research.

We introduce the term computational social science of disasters (CSSD), which can be formally defined as the systematic study of the social behavioral dynamics of disasters utilizing computational methods. In this paper, we discuss and showcase the opportunities and the challenges in this new approach to disaster research.

Following a brief review of the fields that relate to CSSD, namely traditional social sciences of disasters, computational social science, and crisis informatics, we examine how advances in Internet technologies offer a new lens through which to study disasters. By identifying gaps in the literature, we show how this new field could address ways to advance our understanding of the social and behavioral aspects of disasters in a digitally connected world. In doing so, our goal is to bridge the gap between data science and the social sciences of disasters in rapidly changing environments….(More)”.

The Blockchain Game: A great new tool for your classroom


IBM Blockchain Blog: “Blockchain technology can be a game-changer for accounting, supply chain, banking, contract law, and many other fields. But it will only be useful if lots and lots of non-technical managers and leaders trust and adopt it. And right now, simply understanding what blockchain is can be difficult, even for the brightest in these fields. Enter The Blockchain Game, a hands-on exercise that explains blockchain’s core principles and serves as a launching pad for discussion of blockchain’s real-world applications.

In The Blockchain Game students act as nodes and miners on a blockchain network for storing student grades at a university. Participants record the grade and course information, and then “build the block” by calculating a unique identifier (a hash) to secure the grade ledger, and miners get rewarded for their work. As the game is played, the audience learns about hashes, private keys, and what uses are appropriate for a blockchain ledger.

Basics of the Game

  • A hands-on simulation centered on a blockchain for academic scores, including a discussion at the end regarding whether storing grades would be a good application for blockchain.
  • No computers. Participants are the computers and calculate blocks.
  • The game seeks to teach core concepts about a distributed ledger but can be modified to whichever use case the educator wishes — smart contracts, supply chain, and other applications.
  • Additional elements can be added if instructors want to facilitate the game on a computer….(More)”.
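The “build the block” step the students perform by hand corresponds to a few lines of hashing code — a minimal sketch using SHA-256, not the game's actual worksheet format:

```python
import hashlib
import json

def build_block(grades, prev_hash):
    """Hash a batch of grade records together with the previous block's
    hash. Chaining the hashes is what secures the ledger: altering any
    past grade changes every subsequent block's identifier."""
    payload = json.dumps({"grades": grades, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

genesis = build_block([("alice", "CS101", "A")], prev_hash="0" * 64)
block2 = build_block([("bob", "CS101", "B+")], prev_hash=genesis)

# Tampering with the first block produces a different hash, which would
# invalidate block2's stored prev_hash — exactly what players discover.
tampered = build_block([("alice", "CS101", "A+")], prev_hash="0" * 64)
print(tampered == genesis)  # False
```

In the classroom the miners compute a simplified version of this by hand and are rewarded for each block, which motivates the discussion of why anyone runs a node at all.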

A weather tech startup wants to do forecasts based on cell phone signals


Douglas Heaven at MIT Technology Review: “On 14 April, Chicago saw more snow than it had in nearly 40 years. Weather services didn’t see it coming: they forecast one or two inches at worst. But when the late-winter snowstorm came, it caused widespread disruption, dumping enough snow that airlines had to cancel more than 700 flights across all of the city’s airports.

One airline did better than most, however. Instead of relying on the usual weather forecasts, it listened to ClimaCell – a Boston-based “weather tech” start-up that claims it can predict the weather more accurately than anyone else. According to the company, its correct forecast of the severity of the coming snowstorm allowed the airline to better manage its schedules and minimize losses due to delays and diversions. 

Founded in 2015, ClimaCell has spent the last few years developing the technology and business relationships that allow it to tap into millions of signals from cell phones and other wireless devices around the world. It uses the quality of these signals as a proxy for local weather conditions, such as precipitation and air quality. It also analyzes images from street cameras. It is offering a weather forecasting service to subscribers that it claims is 60 percent more accurate than that of existing providers, such as NOAA.

The internet of weather

The approach makes sense, in principle. Other forecasters use proxies, such as radar signals. But by using information from millions of everyday wireless devices, ClimaCell claims it has a far more fine-grained view of most of the globe than other forecasters get from the existing network of weather sensors, which range from ground-based devices to satellites. (ClimaCell taps into these, too.)…(More)”.
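The underlying physics is well documented for commercial microwave links: rain attenuates the signal, and the standard power-law relation between specific attenuation and rain rate can be inverted. A toy version of that inversion — the coefficients are placeholders that in practice depend on frequency and polarization, and ClimaCell's actual method is proprietary:

```python
def rain_rate_from_attenuation(attenuation_db, link_km, k=0.12, alpha=1.1):
    """Invert the power-law A = k * R**alpha * L relating path-integrated
    attenuation A (dB) over a link of length L (km) to rain rate R (mm/h).
    k and alpha vary with frequency and polarization; the values here are
    illustrative placeholders, not calibrated constants."""
    specific = attenuation_db / link_km       # specific attenuation, dB/km
    return (specific / k) ** (1.0 / alpha)    # solve A/L = k * R**alpha for R

# A 2 km link losing 1.5 dB implies moderate rain under these coefficients.
print(rain_rate_from_attenuation(1.5, 2.0))
```

Real deployments must also separate rain-induced loss from other fading, which is where the scale of "millions of signals" plus street-camera imagery plausibly helps.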

How Technology Could Revolutionize Refugee Resettlement


Krishnadev Calamur in The Atlantic: “… For nearly 70 years, the process of interviewing, allocating, and accepting refugees has gone largely unchanged. In 1951, 145 countries came together in Geneva, Switzerland, to sign the Refugee Convention, the pact that defines who is a refugee, what refugees’ rights are, and what legal obligations states have to protect them.

This process was born of the idealism of the postwar years—an attempt to make certain that those fleeing war or persecution could find safety so that horrific moments in history, such as the Holocaust, didn’t recur. The pact may have been far from perfect, but in successive years, it was a lifeline to Afghans, Bosnians, Kurds, and others displaced by conflict.

The world is a much different place now, though. The rise of populism has brought with it a concomitant hostility toward immigrants in general and refugees in particular. Last October, a gunman who had previously posted anti-Semitic messages online against HIAS killed 11 worshippers in a Pittsburgh synagogue. Many of the policy arguments over resettlement have shifted focus from humanitarian relief to security threats and cost. The Trump administration has drastically cut the number of refugees the United States accepts, and large parts of Europe are following suit.

If it works, Annie could change that dynamic. Developed at Worcester Polytechnic Institute in Massachusetts, Lund University in Sweden, and the University of Oxford in Britain, the software uses what’s known as a matching algorithm to allocate refugees with no ties to the United States to their new homes. (Refugees with ties to the United States are resettled in places where they have family or community support; software isn’t involved in the process.)

Annie’s algorithm is based on a machine learning model in which a computer is fed huge piles of data from past placements, so that the program can refine its future recommendations. The system examines a series of variables—physical ailments, age, levels of education and languages spoken, for example—related to each refugee case. In other words, the software uses previous outcomes and current constraints to recommend where a refugee is most likely to succeed. Every city where HIAS has an office or an affiliate is given a score for each refugee. The higher the score, the better the match.
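The scoring-plus-placement logic described above can be sketched as a greedy assignment under capacity constraints. The cities, scores, and capacities below are invented; Annie's real model is trained on historical placement outcomes and HIAS's actual capacity rules:

```python
def place_refugees(scores, capacity):
    """Assign each case to its highest-scoring city that still has room.
    scores: {case: {city: predicted probability of success}}
    capacity: {city: remaining resettlement slots}"""
    remaining = dict(capacity)
    placements = {}
    for case, city_scores in scores.items():
        # Try cities in descending order of predicted success.
        for city in sorted(city_scores, key=city_scores.get, reverse=True):
            if remaining.get(city, 0) > 0:
                placements[case] = city
                remaining[city] -= 1
                break
    return placements

# Invented example: two cases, one slot per city.
scores = {
    "case_1": {"Pittsburgh": 0.8, "Boise": 0.6},
    "case_2": {"Pittsburgh": 0.9, "Boise": 0.4},
}
print(place_refugees(scores, {"Pittsburgh": 1, "Boise": 1}))
```

Note the sketch is order-dependent: case_1 takes Pittsburgh first even though case_2 scores higher there, which is why production matching systems typically use an optimization over all cases jointly rather than a simple greedy pass.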

This is a drastic departure from how refugees are typically resettled. Each week, HIAS and the eight other agencies that allocate refugees in the United States make their decisions based largely on local capacity, with limited emphasis on individual characteristics or needs….(More)”.

Policies as information carriers: How environmental policies may change beliefs and consequent behavior


Paper by Ann-Kathrin Koessler and Stefanie Engel: “This paper discusses how policy interventions not only alter the legal and financial framework in which an individual is operating, but can also lead to changes in relevant beliefs. We argue that such belief changes in how an individual perceives herself, relevant others, the regulator and/or the activity in question can lead to behavioral changes that were neither intended nor expected when the policy was designed.

In the environmental economics literature, these secondary impacts of conventional policy interventions have not been systematically reviewed. Hence, we intend to raise awareness of these effects. In this paper, we review relevant research from behavioral economics and psychology, and identify and discuss the domains for which beliefs can change. Lastly, we discuss design options with which an undesired change in beliefs can be avoided when a new policy is put into practice….(More)”

How to Argue with an Algorithm: Lessons from the COMPAS ProPublica Debate


Paper by Anne L. Washington: “The United States optimizes the efficiency of its growing criminal justice system with algorithms; however, legal scholars have overlooked how to frame courtroom debates about algorithmic predictions. In State v. Loomis, the defense argued that the court’s consideration of risk assessments during sentencing was a violation of due process because the accuracy of the algorithmic prediction could not be verified. The Wisconsin Supreme Court upheld the consideration of predictive risk at sentencing because the assessment was disclosed and the defendant could challenge the prediction by verifying the accuracy of data fed into the algorithm.

Was the court correct about how to argue with an algorithm?

The Loomis court ignored the computational procedures that processed the data within the algorithm. How algorithms calculate data is just as important as the quality of the data calculated. The arguments in Loomis revealed a need for new forms of reasoning to justify the logic of evidence-based tools. A “data science reasoning” could provide ways to dispute the integrity of predictive algorithms with arguments grounded in how the technology works.

This article’s contribution is a series of arguments that could support due process claims concerning predictive algorithms, specifically the Correctional Offender Management Profiling for Alternative Sanctions (“COMPAS”) risk assessment. As a comprehensive treatment, this article outlines the due process arguments in Loomis, analyzes arguments in an ongoing academic debate about COMPAS, and proposes alternative arguments based on the algorithm’s organizational context….(More)”