Analytics Tools Could Be the Key to Effective Message-Driven Nudging


In Government Technology: “Appealing to the nuances of the human mind has been a feature of effective governance for as long as governance has existed, appearing prominently in the prescriptions of every great political theorist from Plato to Machiavelli. The most recent and informed iteration of this practice is nudging: leveraging insights about how humans think from behavioral science to create initiatives that encourage desirable behaviors.

Public officials nudge in many ways. Some seek to modify people’s behavior by changing the environments in which they make decisions, for instance moving vegetables to the front of a grocery store to promote healthy eating. Others try to make desirable behaviors easier, like streamlining a city website to make it simpler to sign up for a service. Still others use prompts like email reminders of a deadline to receive a free checkup to nudge people to act wisely by providing useful information.

Thus far, examples of the third type of nudging — direct messaging that prompts behavior — have been decidedly low tech. Typical initiatives have included sending behaviorally informed letters to residents who have not complied with a city code or mailing out postcard reminders to renew license plates. Governments have been attracted to these initiatives for their low cost and proven effectiveness.

While these low-tech nudges should certainly continue, cities’ recent adoption of tools that can mine and analyze data instantaneously has the potential to greatly increase the scope and effectiveness of message-driven nudging.

For one, using Internet of Things (IoT) ecosystems, cities can provide residents with real-time information so that they may make better-informed decisions. For example, cities could connect traffic sensors to messaging systems and send subscribers text messages at times of high congestion, encouraging them to take public transportation. This real-time information, paired with other nudges, could increase transit use, easing traffic and bettering the environment…
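A congestion-triggered alert of this kind reduces to a threshold check wired to a messaging gateway. The sketch below is illustrative only: the threshold value and the `send_sms` callable are placeholders, not details from the article.

```python
CONGESTION_THRESHOLD = 25  # vehicles/min/lane; in practice tuned per corridor

def congestion_level(readings):
    """Average flow across a corridor's sensors (vehicles/min/lane)."""
    return sum(readings) / len(readings)

def nudge_subscribers(readings, subscribers, send_sms):
    """Text a transit suggestion only when the corridor is congested.

    `send_sms` is a stand-in for whatever messaging gateway a city uses.
    Returns the number of messages sent.
    """
    if congestion_level(readings) < CONGESTION_THRESHOLD:
        return 0
    msg = ("Heavy traffic on your route. The Blue Line is running on time "
           "and may be faster.")
    for number in subscribers:
        send_sms(number, msg)
    return len(subscribers)
```

Pairing the alert with a concrete alternative ("the Blue Line is running on time") is itself a nudge: it lowers the effort of acting on the information.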
Instantaneous data-mining tools may also prove useful for nudging citizens in real time, at the moments they are most likely to partake in detrimental behavior. Tools like machine learning can analyze users’ behavior and determine if they are likely to make a suboptimal choice, like leaving the website for a city service without enrolling. Using clickstream data, the site could determine if a user is likely to leave and deliver a nudge, for example sending a message explaining that most residents enroll in the service. This strategy provides another layer of nudging, catching residents who may have been influenced by an initial nudge — like a reminder to sign up for a service or a streamlined website — but may need an extra prod to follow through…(More)”
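The clickstream idea can be sketched as a simple logistic score over session features. Everything here is hypothetical: the feature names, the hand-set weights (standing in for coefficients a real system would learn from labeled sessions that did or did not end in enrollment), and the nudge threshold.

```python
import math
from typing import Optional

# Hand-set weights stand in for learned coefficients (illustrative only).
WEIGHTS = {
    "idle_seconds": 0.04,     # long inactivity suggests imminent exit
    "cursor_left_page": 1.5,  # cursor moving toward browser chrome
    "form_started": -2.0,     # users who start the form usually finish
    "pages_viewed": -0.1,
}
BIAS = -1.0
NUDGE_THRESHOLD = 0.6

def leave_probability(session: dict) -> float:
    """Logistic score: probability the visitor abandons without enrolling."""
    z = BIAS + sum(WEIGHTS[k] * session.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def maybe_nudge(session: dict) -> Optional[str]:
    """Return a social-norm message only when abandonment looks likely."""
    if leave_probability(session) >= NUDGE_THRESHOLD:
        return "9 in 10 residents who visit this page enroll. It takes 2 minutes."
    return None
```

The threshold matters: nudging everyone would dilute the message, while nudging only high-risk sessions targets exactly the residents the excerpt describes as needing "an extra prod."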

Closing the Loop


Chris Anderson: “If we could measure the world, how would we manage it differently? This is a question we’ve been asking ourselves in the digital realm since the birth of the Internet. Our digital lives—clicks, histories, and cookies—can now be measured beautifully. The feedback loop is complete; it’s called closing the loop. As you know, we can only manage what we can measure. We’re now measuring on-screen activity beautifully, but most of the world is not on screens.

As we get better and better at measuring the world—wearables, Internet of Things, cars, satellites, drones, sensors—we are going to be able to close the loop in industry, agriculture, and the environment. We’re going to start to find out what the consequences of our actions are and, presumably, we’ll take smarter actions as a result. This journey with the Internet that we started more than twenty years ago is now extending to the physical world. Every industry is going to have to ask the same questions: What do we want to measure? What do we do with that data? How can we manage things differently once we have that data? This notion of closing the loop everywhere is perhaps the biggest endeavor of our age.

Closing the loop is a phrase used in robotics. Open-loop systems are when you take an action and you can’t measure the results—there’s no feedback. Closed-loop systems are when you take an action, you measure the results, and you change your action accordingly. Systems with closed loops have feedback loops; they self-adjust and quickly stabilize in optimal conditions. Systems with open loops overshoot; they miss it entirely…(More)”
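Anderson's open- vs closed-loop distinction can be made concrete with a toy thermostat simulation (all numbers are illustrative). The open-loop controller applies a fixed, pre-calibrated action; the closed-loop controller measures the result and adjusts. When conditions change, only the closed-loop system stays near its target.

```python
SETPOINT = 20.0  # target room temperature, degrees C

def simulate(controller, outside=10.0, steps=50):
    """Toy room: heat leaks toward the outside; a controller adds heat each step."""
    temp = outside
    for _ in range(steps):
        temp += controller(temp) - 0.1 * (temp - outside)  # heating minus loss
    return temp

def open_loop(_temp):
    """No feedback: a fixed heat output, calibrated once for 10 C outside."""
    return 1.0

def closed_loop(temp, gain=1.0):
    """Feedback: measure the result and adjust the action accordingly."""
    return gain * (SETPOINT - temp)
```

Run with the calibrated conditions (10 C outside) and both controllers land near 20 C. Drop the outside temperature to 0 C and the open-loop room settles around 10 C, missing the target entirely, while the closed-loop room self-adjusts to within a couple of degrees: the behavior the excerpt describes.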

What is the Spectrum of Public Participation?


Spectrum of Public Participation

Using the Spectrum of Public Participation

Many practitioners and organisations find the Spectrum very helpful. The IAP2 claims that the Spectrum is “quickly becoming an international standard” and, while this claim is partly marketing, it certainly has some validity in some sectors. In Australia, the Spectrum forms a basis for many state and federal government guides to community engagement (e.g., the Department of Environment, Land, Water and Planning and the Department of Primary Industries), local government community engagement plans (e.g., the City of Newcastle, Latrobe City and the Local Government Association of South Australia) and a range of other organisations (e.g., the Australian Water Recycling Centre of Excellence and Trinity Grammar School).

While not as widely used in other parts of the world, it is still relevant and has been used in a range of contexts (e.g., The United States Environmental Protection Agency, the British Forestry Commission and Vancouver’s Engage City Task Force).

….

Selecting a level

The Spectrum is not a flow chart. Its levels are not steps in a process – starting on the left and working to the right – so selecting a level needs to be based on the specific context.

Higher levels are not necessarily “better”. If an issue is not controversial and does not provoke passionate feelings, a lower level may be more appropriate, but for issues which are complex and controversial, it can save time in the long run to choose a higher level ….

Selecting a level of participation does not mean that the level cannot change (e.g., it might be discovered that an issue was more controversial than thought, and so a higher level might be adopted), nor is the selected level the only one that can be used. It can be quite appropriate to provide ways of engaging the community at lower levels than the level selected. For example, some people may not have the time and energy to participate in a day-long workshop held at the Collaborate level, but might still want to have the opportunity to contribute their ideas.

The level is only part of the picture

Community engagement needs to have a strong ethical base. Selecting appropriate levels is important, but the way we engage the community and who we engage are also vitally important.

The Spectrum of Public Participation is underpinned by seven values….

The Spectrum is a useful tool in thinking about, and planning, community engagement that has helped many practitioners in a wide range of contexts. Although there are examples where it has been used poorly, it provides a valuable starting place and can, in fact, be used to challenge poor community engagement practice….(More)”

Fighting Illegal Fishing With Big Data


Emily Matchar in Smithsonian: “In many ways, the ocean is the Wild West. The distances are vast, the law enforcement agents few and far between, and the legal jurisdiction often unclear. In this environment, illegal activity flourishes. Illegal fishing is so common that experts estimate as much as a third of fish sold in the U.S. was fished illegally. This illegal fishing decimates the ocean’s already dwindling fish populations and gives rise to modern slavery, where fishermen are tricked onto vessels and forced to work, sometimes for years.

A new use of data technology aims to help curb these abuses by shining a light on the high seas. The technology uses ships’ satellite signals to detect instances of transshipment, when two vessels meet at sea to exchange cargo. As transshipment is a major way illegally caught fish makes it into the legal supply chain, tracking it could potentially help stop the practice.

“[Transshipment] really allows people to do something out of sight,” says David Kroodsma, the research program director at Global Fishing Watch, an online data platform launched by Google in partnership with the nonprofits Oceana and SkyTruth. “It’s something that obscures supply chains. It’s basically being able to do things without any oversight. And that’s a problem when you’re using a shared resource like the oceans.”

Global Fishing Watch analyzed some 21 billion satellite signals broadcast by ships, which are required to carry transceivers for collision avoidance, between 2012 and 2016. It then used an artificial intelligence system it created to identify which ships were refrigerated cargo vessels (known in the industry as “reefers”). The researchers then verified this information with fishery registries and other sources, eventually identifying 794 reefers—90 percent of the world’s total number of such vessels. They tracked instances where a reefer and a fishing vessel were moving at similar speeds in close proximity, labeling these instances as “likely transshipments,” and also traced instances where reefers were traveling in a way that indicated a rendezvous with a fishing vessel, even if no fishing vessel was visible in the data—fishing vessels often turn off their satellite systems when they don’t want to be seen. All in all, there were more than 90,000 likely or potential transshipments recorded.
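The rendezvous heuristic described here (two vessels in close proximity at matched speed, sustained over consecutive reports) can be sketched on AIS-style position reports. The thresholds below are placeholders for illustration, not Global Fishing Watch's actual parameters, which would be tuned against verified transshipment cases.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two positions, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

# Placeholder thresholds (illustrative only).
MAX_DISTANCE_KM = 0.5  # vessels effectively alongside each other
MAX_SPEED_DIFF = 0.5   # knots: drifting together, not passing
MIN_MATCHES = 3        # sustained over several consecutive reports

def likely_transshipment(reefer_track, fisher_track):
    """Flag a likely rendezvous: close proximity at matched speed, sustained.

    Each track is a list of (lat, lon, speed_knots) tuples at matching
    timestamps.
    """
    run = 0
    for (la1, lo1, s1), (la2, lo2, s2) in zip(reefer_track, fisher_track):
        close = haversine_km(la1, lo1, la2, lo2) <= MAX_DISTANCE_KM
        matched = abs(s1 - s2) <= MAX_SPEED_DIFF
        run = run + 1 if (close and matched) else 0
        if run >= MIN_MATCHES:
            return True
    return False
```

Requiring a sustained run of matching reports, rather than a single close pass, is what separates a cargo exchange from two ships merely crossing paths.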

Even if these encounters were in fact transshipments, they would not all have been for nefarious purposes. They may have taken place to refuel or load up on supplies. But looking at the patterns of where the potential transshipments happen is revealing. Very few are seen close to the coasts of the U.S., Canada and much of Europe, all places with tight fishery regulations. There are hotspots off the coast of Peru and Argentina, all over Africa, and off the coast of Russia. Some 40 percent of encounters happen in international waters, far enough off the coast that no country has jurisdiction.

The tracked reefers were flying flags from some 40 different countries. But that doesn’t necessarily tell us much about where they really come from. Nearly half of the reefers tracked were flying “flags of convenience,” meaning they’re registered in countries other than where the ship’s owners are from to take advantage of those countries’ lax regulations….(More)”

Read more: http://www.smithsonianmag.com/innovation/fighting-illegal-fishing-big-data-180962321/

In Beta: Is policymaking stuck in the 19th century?


Global Partners Digital: “Today we’re launching a new series of podcasts – titled In beta – with the aim of critically examining the big questions facing human rights in the digital environment.

The series will be hosted by GPD’s executive director, Charles Bradley, who will interview a different guest – or guests – for each episode.

But before we go into details, a little more on the concept. We’ve created In beta because we felt that there weren’t enough forums for genuine debate and discussion within the digital rights community. We felt that we needed a space where we could host interesting conversations with interesting people in our field, outside of the conventions of traditional policy discourse, which can sometimes confine people in silos and discourage more open, experimental thinking.

The series is called In beta because these conversations will be speculative, not definitive. The questions we examine won’t be easy – or even possible – to answer. They may sometimes be provocative. They may themselves raise new questions, and perhaps lay the groundwork for future work.

In the first episode, we talk to the co-founder of GovLab, Stefaan Verhulst, asking – ‘Is policymaking stuck in the 19th century?’…(More)”

Why Big Data Is a Big Deal for Cities


John M. Kamensky in Governing: “We hear a lot about “big data” and its potential value to government. But is it really fulfilling the high expectations that advocates have assigned to it? Is it really producing better public-sector decisions? It may be years before we have definitive answers to those questions, but new research suggests that it’s worth paying a lot of attention to.

University of Kansas Prof. Alfred Ho recently surveyed 65 mid-size and large cities to learn what is going on, on the front line, with the use of big data in making decisions. He found that big data has made it possible to “change the time span of a decision-making cycle by allowing real-time analysis of data to instantly inform decision-making.” This decision-making occurs in areas as diverse as program management, strategic planning, budgeting, performance reporting and citizen engagement.

Cities are natural repositories of big data that can be integrated and analyzed for policy- and program-management purposes. These repositories include data from public safety, education, health and social services, environment and energy, culture and recreation, and community and business development. They include both structured data, such as financial and tax transactions, and unstructured data, such as recorded sounds from gunshots and videos of pedestrian movement patterns. And they include data supplied by the public, such as the Boston residents who use a phone app to measure road quality and report problems.

These data repositories, Ho writes, are “fundamental building blocks,” but the challenge is to shift the ownership of data from separate departments to an integrated platform where the data can be shared.

There’s plenty of evidence that cities are moving in that direction and that they already are systematically using big data to make operational decisions. Among the 65 cities that Ho examined, he found that 49 have “some form of data analytics initiatives or projects” and that 30 have established “a multi-departmental team structure to do strategic planning for these data initiatives.”…The effective use of big data can lead to dialogs that cut across school-district, city, county, business and nonprofit-sector boundaries. But more importantly, it provides city leaders with the capacity to respond to citizens’ concerns more quickly and effectively…(More)”

DataRefuge


DataRefuge is a public, collaborative project designed to address the following concerns about federal climate and environmental data:

  • What are the best ways to safeguard data?
  • How do federal agencies play crucial roles in data collection, management, and distribution?
  • How do government priorities impact data’s accessibility?
  • Which projects and research fields depend on federal data?
  • Which data sets are of value to research and local communities, and why?

DataRefuge is also an initiative committed to identifying, assessing, prioritizing, securing, and distributing reliable copies of federal climate and environmental data so that it remains available to researchers. Data collected as part of the #DataRefuge initiative will be stored in multiple, trusted locations to help ensure continued accessibility.

DataRefuge acknowledges, and in fact draws attention to, the fact that there are no guarantees of perfectly safe information. But there are ways that we can create safe and trustworthy copies. DataRefuge is thus also a project to develop the best methods, practices, and protocols to do so.

DataRefuge depends on local communities. We welcome new collaborators who want to organize DataRescue Events or build DataRefuge in other ways.

There are many ways to be involved with building DataRefuge. They’re not mutually exclusive!…(More)”

Corporate Social Responsibility for a Data Age


Stefaan G. Verhulst in the Stanford Social Innovation Review: “Proprietary data can help improve and save lives, but fully harnessing its potential will require a cultural transformation in the way companies, governments, and other organizations treat and act on data….

We live, as it is now common to point out, in an era of big data. The proliferation of apps, social media, and e-commerce platforms, as well as sensor-rich consumer devices like mobile phones, wearable devices, commercial cameras, and even cars generate zettabytes of data about the environment and about us.

Yet much of the most valuable data resides with the private sector—for example, in the form of click histories, online purchases, sensor data, and call data records. This limits its potential to benefit the public and to turn data into a social asset. Consider how data held by business could help improve policy interventions (such as better urban planning) or resiliency at a time of climate change, or help design better public services to increase food security.

Data responsibility suggests steps that organizations can take to break down these private barriers and foster so-called data collaboratives, or ways to share their proprietary data for the public good. For the private sector, data responsibility represents a new type of corporate social responsibility for the 21st century.

While Nepal’s Ncell belongs to a relatively small group of corporations that have shared their data, there are a few encouraging signs that the practice is gaining momentum. In Jakarta, for example, Twitter exchanged some of its data with researchers who used it to gather and display real-time information about massive floods. The resulting website, PetaJakarta.org, enabled better flood assessment and management processes. And in Senegal, the Data for Development project has brought together leading cellular operators to share anonymous data to identify patterns that could help improve health, agriculture, urban planning, energy, and national statistics.

Examples like this suggest that proprietary data can help improve and save lives. But to fully harness the potential of data, data holders need to fulfill at least three conditions. I call these “the three pillars of data responsibility.”…

The difficulty of translating insights into results points to some of the larger social, political, and institutional shifts required to achieve the vision of data responsibility in the 21st century. The move from data shielding to data sharing will require that we make a cultural transformation in the way companies, governments, and other organizations treat and act on data. We must incorporate new levels of pro-activeness, and make often-unfamiliar commitments to transparency and accountability.

By way of conclusion, here are four immediate steps—essential but not exhaustive—we can take to move forward:

  1. Data holders should issue a public commitment to data responsibility so that it becomes the default—an expected, standard behavior within organizations.
  2. Organizations should hire data stewards to determine what and when to share, and how to protect and act on data.
  3. We must develop a data responsibility decision tree to assess the value and risk of corporate data along the data lifecycle.
  4. Above all, we need a data responsibility movement; it is time to demand data responsibility to ensure data improves and safeguards people’s lives…(More)”

RideComfort: A Development of Crowdsourcing Smartphones in Measuring Train Ride Quality


Adam Azzoug and Sakdirat Kaewunruen in Frontiers in Built Environment: “Among the many million train journeys taking place every day, not all of them are being measured or monitored for ride comfort. Improving ride comfort is important for railway companies to attract more passengers to their train services. Giving passengers the ability to measure ride comfort themselves using their smartphones allows railway companies to receive instant feedback from passengers regarding the ride quality on their trains. The purpose of this development is to investigate the feasibility of using smartphones to measure vibration-based ride comfort on trains. This can be accomplished by developing a smartphone application, analyzing the data recorded by the application, and verifying the data by comparing it to data from a track inspection vehicle or an accelerometer. A literature review was undertaken to examine the commonly used standards to evaluate ride comfort, such as the BS ISO 2631-1:1997 standard and Sperling’s ride index as proposed by Sperling and Betzhold (1956). The literature review has also revealed some physical causes of ride discomfort such as vibrations induced by roughness and irregularities present at the wheel/rail interface. We are the first to use artificial neural networks to map data derived from smartphones in order to evaluate ride quality. Our work demonstrates the merits of using smartphones to measure ride comfort aboard trains and suggests recommendations for future technological improvement. Our data argue that the accelerometers found in modern smartphones are of sufficient quality to be used in evaluating ride comfort. The ride comfort levels predicted both by BS ISO 2631-1 and Sperling’s index exhibit excellent agreement…(More)”
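As a rough illustration of the kind of evaluation the paper describes, not its actual method, one can compute the RMS of an acceleration record and map it to comfort labels adapted from ISO 2631-1's likely-reaction guidance. Note that the standard applies frequency weighting (Wk/Wd filters) before this step and its comfort bands overlap; the thresholds below are a simplified, non-overlapping reading for illustration.

```python
import math

def rms(accel):
    """Root-mean-square of an acceleration record (m/s^2)."""
    return math.sqrt(sum(a * a for a in accel) / len(accel))

def comfort_label(aw_rms):
    """Map a (nominally frequency-weighted) RMS acceleration to a comfort
    label. Bands are a simplified, non-overlapping adaptation of ISO 2631-1's
    likely-reaction guidance, not the standard's exact (overlapping) ranges."""
    if aw_rms < 0.315:
        return "not uncomfortable"
    if aw_rms < 0.63:
        return "a little uncomfortable"
    if aw_rms < 1.0:
        return "fairly uncomfortable"
    if aw_rms < 1.6:
        return "uncomfortable"
    if aw_rms < 2.5:
        return "very uncomfortable"
    return "extremely uncomfortable"
```

A smartphone app would feed windowed accelerometer samples through a weighting filter, then through `rms`, and report the label for each window; the paper's neural-network mapping would replace the fixed thresholds with a learned one.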

Rules for a Flat World – Why Humans Invented Law and How to Reinvent It for a Complex Global Economy


Book by Gillian Hadfield: “… picks up where New York Times columnist Thomas Friedman left off in his influential 2005 book, The World is Flat. Friedman was focused on the infrastructure of communications and technology: the new web-based platform that allows business to follow the hunt for lower costs, higher value and greater efficiency around the planet seemingly oblivious to the boundaries of nation states. Hadfield peels back this technological platform to look at the ‘structure that lies beneath’—our legal infrastructure, the platform of rules about who can do what, when and how. Often taken for granted, economic growth throughout human history has depended at least as much on the evolution of new systems of rules to support ever-more complex modes of cooperation and trade as it has on technological innovation. When Google rolled out YouTube in over one hundred countries around the globe simultaneously, for example, it faced not only the challenges of technology but also the staggering problem of how to build success in the context of a bewildering and often conflicting patchwork of nation-state-based laws and legal systems affecting every aspect of the business: contract, copyright, encryption, censorship, advertising and more. Google is not alone. A study presented at the World Economic Forum in Davos in 2011 found that for global firms, the number one challenge of the modern economy is increasing complexity, and the number one source of complexity is law. Today, even our startups, the engines of economic growth, are global from Day One.

Put simply, the law and legal methods on which we currently rely have failed to evolve along with technology. They are increasingly unable to cope with the speed, complexity, and constant border-crossing of our new globally inter-connected environment. Our current legal systems are still rooted in the politics-based nation state platform on which the industrial revolution was built. Hadfield argues that even though these systems supported fantastic growth over the past two centuries, today they are too slow, costly, cumbersome and localized to support the exponential rise in economic complexity they fostered. …

The answer to our troubles with law, however, is not the one critics usually reach for—to have less of it. Recognizing that law provides critical infrastructure for the cooperation and collaboration on which economic growth is built is the first step, Hadfield argues, to building a legal environment that does more of what we need it to do and less of what we don’t. …(More)”