The 4M Roadmap: A Higher Road to Profitability by Using Big Data for Social Good


Report by Brennan Lake: “As the private sector faces conflicting pressures to either embrace or shun socially responsible practices, companies with privately held big-data assets must decide whether to share access to their data for the public good. While some managers object to data sharing over concerns of privacy and product cannibalization, others launch well-intentioned yet short-lived CSR projects that fail to deliver on lofty goals.

By embedding Shared-Value principles into ‘Data-for-Good’ programs, data-rich firms can launch responsible data-sharing initiatives that minimize risk, deliver sustained impact, and improve overall competitiveness in the process.

The 4M Roadmap by Brennan Lake, a Big-Data and Social Impact professional, guides managers to adopt a ‘Data-for-Good’ model that emphasizes four key pillars of value-creation: Mission, Messaging, Methods, and Monetization. Through deep analysis and private-sector case studies, The 4M Roadmap demonstrates how companies can engage in responsible data sharing to benefit society and business alike…(More)”.

Preparing Researchers for an Era of Freer Information


Article by Peter W.B. Phillips: “If you Google my name along with “Monsanto,” you will find a series of allegations from 2013 that my scholarly work at the University of Saskatchewan, focused on technological change in the global food system, had been unduly influenced by corporations. The allegations made use of seven freedom of information (FOI) requests. Although leadership at my university determined that my publications were consistent with university policy, the ensuing media attention, I feel, has led some colleagues, students, and partners to distance themselves to avoid being implicated by association.

In the years since, I’ve realized that my experience is not unique. I have communicated with other academics who have experienced similar FOI requests related to genetically modified organisms in the United States, Canada, England, the Netherlands, and Brazil. And my field is not the only one affected: a 2015 Union of Concerned Scientists report documented requests in multiple states and disciplines—from history to climate science to epidemiology—as well as across ideologies. In the University of California system alone, researchers have received open records requests related to research on the health effects of toxic chemicals, the safety of abortions performed by clinicians rather than doctors, and green energy production infrastructure. These requests are made possible by laws that permit anyone, for any reason, to gain access to public agencies’ records.

These open records campaigns, which are conducted by individuals and groups across the political spectrum, arise in part from the confluence of two unrelated phenomena: the changing nature of academic research toward more translational, interdisciplinary, and/or team-based investigations and the push for more transparency in taxpayer-funded institutions. Neither phenomenon is inherently negative; in fact, there are strong advantages for science and society in both trends. But problems arise when scholars are caught between them—affecting the individuals involved and potentially influencing the ongoing conduct of research…(More)”

Exploring Visitor Density Trends in Rest Areas Through Google Maps Data and Data Mining


Paper by Marita Prasetyani, R. Rizal Isnanto and Catur Edi Widodo: “Rest areas play a vital role in ensuring the safety and comfort of travelers. This study examines visitor density at toll and non-toll rest areas using data mining techniques applied to Google Maps Places data. By utilizing extensive information from Google Maps, the research aims to uncover patterns and trends in visitor behavior and pinpoint peak usage times. The findings can guide improved planning and management of rest areas, thereby enhancing the overall travel experience for road users, and can inform further research on siting new rest areas.

Understanding patterns or trends in visitor density at rest areas involves analyzing the time of day, location, and other factors influencing the density level. Understanding these trends can provide essential insights for rest area management, infrastructure planning, and the establishment of new rest areas. Data from Google Maps provides an invaluable source of real-time and historical information, enabling accurate and in-depth analysis of visitor behavior. Data mining helps identify relationships not immediately apparent in the data, providing a deeper understanding and supporting data-driven decision-making…(More)”.
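To make the mining step concrete, here is a minimal sketch of how hourly visitor-density profiles could be clustered to surface peak-time patterns. It is an illustration under stated assumptions rather than the authors' pipeline: the "popular times"-style hourly busyness matrix is stand-in data, and the choice of k-means is an assumption made for illustration.

```python
# Minimal sketch (not the paper's pipeline): cluster rest areas by their
# hourly visitor-density profiles, assuming "popular times"-style data
# (busyness score per hour of day) has already been collected per location.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical input: one row per rest area, 24 columns = mean busyness
# for each hour of the day (0-100, as Google's popular-times UI displays).
rng = np.random.default_rng(0)
profiles = rng.integers(0, 100, size=(40, 24)).astype(float)  # stand-in data

# Normalize so clusters reflect the *shape* of daily demand, not its scale.
scaled = StandardScaler().fit_transform(profiles)

# Group rest areas with similar peak-time patterns (k chosen arbitrarily here).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaled)

for label in range(3):
    members = profiles[kmeans.labels_ == label]
    peak_hour = members.mean(axis=0).argmax()
    print(f"Cluster {label}: {len(members)} rest areas, typical peak ~{peak_hour}:00")
```

With real popular-times data, the resulting cluster centroids are the "patterns and trends in visitor behavior" the abstract refers to, and per-cluster peak hours can feed directly into staffing or siting decisions.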

The Essential Principle for Appropriate Data Policy of Citizen Science Projects


Chapter by Takeshi Osawa: “Citizen science is one of the new paradigms of science. This concept features various project forms, participants, and motivations, and implies the need for attention to ethical issues for every participant, which frequently includes nonacademics. In this chapter, I address ethical issues associated with citizen science projects that focus on data treatment rules and demonstrate a concept for an appropriate data policy for these projects. First, I demonstrate that citizen science projects tend to include different types of collaboration, which may lead to certain conflicts among participants in terms of data sharing. Second, I propose an idea that could integrate different types of collaboration according to the ‘transcend’ theory. Third, I take the case of a citizen science project through which ‘transcend’ occurred and elucidate the difference between ordinary research and citizen science projects, specifically in terms of the goals of these projects and the goals and motivations of participants, which may change. Finally, I propose one conceptual idea on how the principal investigator (PI) of a citizen science project can establish a data policy after assessing the rights of participants. The basic idea is the division and organization of the data policy into a hierarchy for the project and for the participants. Data policy is one of the important items for establishing appropriate methods for citizen science as a new style of science. As such, practice and framing related to data policy must be carefully monitored and reflected on…(More)”.

Mission Driven Bureaucrats: Empowering People To Help Government Do Better


Book by Dan Honig: “…argues that the performance of our governments can be transformed by managing bureaucrats for their empowerment rather than for compliance. Aimed at public sector workers, leaders, academics, and citizens alike, it contends that public sectors too often rely on a managerial approach that seeks to tightly monitor and control employees, and thus demotivates and repels the mission-motivated. The book suggests that better performance can in many cases come from a more empowerment-oriented managerial approach—one that allows autonomy, cultivates feelings of competence, and creates connection to peers and purpose—which allows the mission-motivated to thrive. Arguing against conventional wisdom, the volume contends that compliance often thwarts, rather than enhances, public value—and that we can often get less corruption and malfeasance with less monitoring. It provides a handbook of strategies managers can use to introduce empowerment-oriented practices into their agencies. It also describes what everyday citizens can do to support the empowerment of bureaucrats in their governments. Interspersed throughout the book are profiles of real-life Mission Driven Bureaucrats, who exemplify the dedication and motivation typical of many civil servants. Drawing on original empirical data from a number of countries and the prior work of other scholars from around the globe, the volume argues that empowerment-oriented management, and the question of how to cultivate, support, attract, and retain Mission Driven Bureaucrats, should have a larger place in our thinking and practice…(More)”.

Connecting the dots: AI is eating the web that enabled it


Article by Tom Wheeler: “The large language models (LLMs) of generative AI that scraped their training data from websites are now using that data to eliminate the need to go to many of those same websites. Respected digital commentator Casey Newton concluded, “the web is entering a state of managed decline.” The Washington Post headline was more dire: “Web publishers brace for carnage as Google adds AI answers.”…

Created by Sir Tim Berners-Lee in 1989, the World Wide Web redefined the nature of the internet into a user-friendly linkage of diverse information repositories. “The first decade of the web…was decentralized with a long-tail of content and options,” Berners-Lee wrote this year on the occasion of its 35th anniversary. Over the intervening decades, that vision of distributed sources of information has faced multiple challenges. The dilution of decentralization began with powerful centralized hubs such as Facebook and Google that directed user traffic. Now comes the ultimate disintegration of Berners-Lee’s vision as generative AI reduces traffic to websites by recasting their information.

The web’s open access to the world’s information trained the large language models of generative AI. Now, those generative AI models are coming for their progenitor.

The web allowed users to discover diverse sources of information from which to draw conclusions. AI cuts out the intellectual middleman to go directly to conclusions from a centralized source.

The AI paradigm of cutting out the middleman appears to have been further advanced by Apple’s recent announcement that it will incorporate OpenAI’s technology to enable its Siri app to provide ChatGPT-like answers. With this new deal, Apple becomes an AI-based disintermediator, not only eliminating the need to go to websites, but also potentially disintermediating the Google search engine, for which Apple has been paying $20 billion annually.

Studies from The Atlantic, the University of Toronto, and Gartner suggest the Pew research on website mortality could be just the beginning. Generative AI’s ability to deliver conclusions cannibalizes traffic to individual websites, threatening the raison d’être of all websites, especially those that are commercially supported…(More)”

This free app is the experts’ choice for wildfire information


Article by Shira Ovide: “One of the most trusted sources of information about wildfires is an app that’s mostly run by volunteers and on a shoestring budget.

It’s called Watch Duty, and it started in 2021 as a passion project of a Silicon Valley start-up founder, John Mills. He moved to a wildfire-prone area in Northern California and felt terrified by how difficult it was to find reliable information about fire dangers.

One expert after another said Watch Duty is their go-to resource for information, including maps of wildfires, the activities of firefighting crews, air-quality alerts and official evacuation orders…

More than a decade ago, Mills started a software company that helped chain restaurants with tasks such as food safety checklists. In 2019, Mills bought property north of San Francisco that he expected to be a future home. He stayed there when the pandemic hit in 2020.

During wildfires that year, Mills said he didn’t have enough information about what was happening and what to do. He found himself glued to social media posts from hobbyists who compiled wildfire information from public safety communications that are streamed online.

Mills said the idea for Watch Duty came from his experiences, his discussions with community groups and local officials — and watching an emergency services center struggle with clunky software for dispatching help.

He put in $1 million of his money to start Watch Duty and persuaded people he knew in Silicon Valley to help him write the app’s computer code. Mills also recruited some of the people who had built social media followings for their wildfire posts.

In the first week that Watch Duty was available in three California counties, Mills said, the app had tens of thousands of users. In the past month, he said, Watch Duty has had roughly 1.1 million users.

Watch Duty is a nonprofit. Members who pay $25 a year have access to extra features such as flight tracking for firefighting aircraft.

Mills wants to expand Watch Duty to cover other types of natural disasters. “I can’t think of anything better I can do with my life than this,” he said…(More)”.

Using AI to Inform Policymaking


Paper for the AI4Democracy series at The Center for the Governance of Change at IE University: “Good policymaking requires a multifaceted approach, incorporating diverse tools and processes to address the varied needs and expectations of constituents. The paper by Turan and McKenzie focuses on an LLM-based tool, “Talk to the City” (TttC), developed to facilitate collective decision-making by soliciting, analyzing, and organizing public opinion. This tool has been tested in three distinct applications:

1. Finding Shared Principles within Constituencies: Through large-scale citizen consultations, TttC helps identify common values and priorities.

2. Compiling Shared Experiences in Community Organizing: The tool aggregates and synthesizes the experiences of community members, providing a cohesive overview.

3. Action-Oriented Decision Making in Decentralized Governance: TttC supports decision-making processes in decentralized governance structures by providing actionable insights from diverse inputs.

CAPABILITIES AND BENEFITS OF LLM TOOLS

LLMs, when applied to democratic decision-making, offer significant advantages:

  • Processing Large Volumes of Qualitative Inputs: LLMs can handle extensive qualitative data, summarizing discussions and identifying overarching themes with high accuracy.
  • Producing Aggregate Descriptions in Natural Language: The ability to generate clear, comprehensible summaries from complex data makes these tools invaluable for communicating nuanced topics.
  • Facilitating Understanding of Constituents’ Needs: By organizing public input, LLM tools help leaders gain a better understanding of their constituents’ needs and priorities.

CASE STUDIES AND TOOL EFFICACY

The paper presents case studies using TttC, demonstrating its effectiveness in improving collective deliberation and decision-making. Key functionalities include:

  • Aggregating Responses and Clustering Ideas: TttC identifies common themes and divergences within a population’s opinions.
  • Interactive Interface for Exploration: The tool provides an interactive platform for exploring the diversity of opinions at both individual and group scales, revealing complexity, common ground, and polarization…(More)”
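As a rough illustration of the “aggregating responses and clustering ideas” step above, the sketch below embeds free-text responses and groups them before an LLM would summarize each group. It is a hedged stand-in, not TttC’s actual implementation: the sentence-transformers model name and the clustering method are illustrative choices.

```python
# Illustrative sketch of response aggregation (not TttC's implementation):
# embed free-text responses, cluster them, then hand each cluster to an LLM
# for a natural-language theme summary.
from sklearn.cluster import AgglomerativeClustering
from sentence_transformers import SentenceTransformer  # assumed dependency

responses = [
    "Buses should run later at night.",
    "We need more frequent evening bus service.",
    "Bike lanes on Main Street feel unsafe.",
    "Protected cycling infrastructure is a priority.",
]

# Embed each response into a vector space where similar opinions sit close together.
model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice
embeddings = model.encode(responses)

# Cluster nearby embeddings; distance_threshold controls how fine-grained the themes are.
clusters = AgglomerativeClustering(
    n_clusters=None, distance_threshold=1.0
).fit_predict(embeddings)

for label in sorted(set(clusters)):
    group = [r for r, c in zip(responses, clusters) if c == label]
    # In a real system, each group would be passed to an LLM with a prompt like
    # "Summarize the shared theme of these responses in one sentence."
    print(f"Theme {label}: {group}")
```

The resulting clusters are the raw material for the “common themes and divergences” that a TttC-style interactive interface then lets constituents and leaders explore.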

Is Software Eating the World?


Paper by Sangmin Aum & Yongseok Shin: “When explaining the declining labor income share in advanced economies, the macro literature finds that the elasticity of substitution between capital and labor is greater than one. However, the vast majority of micro-level estimates show that capital and labor are complements (elasticity less than one). Using firm- and establishment-level data from Korea, we divide capital into equipment and software, as they may interact with labor in different ways. Our estimation shows that equipment and labor are complements (elasticity 0.6), consistent with other micro-level estimates, but software and labor are substitutes (1.6), a novel finding that helps reconcile the discord between the macro- and micro-level elasticity estimates. As the quality of software improves, labor shares fall within firms because of factor substitution and endogenously rising markups. In addition, production reallocates toward firms that use software more intensively, as they become effectively more productive. Because in the data these firms have higher markups and lower labor shares, the reallocation further raises the aggregate markup and reduces the aggregate labor share. The rise of software accounts for two-thirds of the labor share decline in Korea between 1990 and 2018. The factor substitution and markup channels are equally important. On the other hand, the falling equipment price plays a minor role, because the factor substitution and markup channels offset each other…(More)”.
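For readers unfamiliar with the terminology, the elasticity of substitution σ is the parameter of a CES production function that determines whether two inputs are complements (σ < 1) or substitutes (σ > 1). The textbook two-input form is sketched below; the paper’s own specification presumably nests three factors (equipment, software, labor), so this is the standard building block, not the authors’ exact model.

```latex
% Textbook two-input CES production function; \sigma is the elasticity
% of substitution (the paper's three-factor specification will differ).
Y = A\left(\alpha\,K^{\frac{\sigma-1}{\sigma}}
      + (1-\alpha)\,L^{\frac{\sigma-1}{\sigma}}\right)^{\frac{\sigma}{\sigma-1}}
% \sigma > 1: inputs substitute for each other (cf. software--labor, estimated 1.6)
% \sigma < 1: inputs complement each other (cf. equipment--labor, estimated 0.6)
% \sigma \to 1 recovers Cobb--Douglas: Y = A\,K^{\alpha}L^{1-\alpha}
```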

The use of AI for improving energy security


Rand Report: “Electricity systems around the world are under pressure due to aging infrastructure, rising demand for electricity and the need to decarbonise energy supplies at pace. Artificial intelligence (AI) applications have potential to help address these pressures and increase overall energy security. For example, AI applications can reduce peak demand through demand response, improve the efficiency of wind farms and facilitate the integration of large numbers of electric vehicles into the power grid. However, the widespread deployment of AI applications could also come with heightened cybersecurity risks, the risk of unexplained or unexpected actions, or supplier dependency and vendor lock-in. The speed at which AI is developing means many of these opportunities and risks are not yet well understood.

The aim of this study was to provide insight into the state of AI applications for the power grid and the associated risks and opportunities. Researchers conducted a focused scan of the scientific literature to find examples of relevant AI applications in the United States, the European Union, China and the United Kingdom…(More)”.
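To make “reducing peak demand through demand response” concrete, here is a toy sketch of peak shaving: flexible load is deferred away from the highest-demand hours so that total demand stays under a grid cap. It is a hedged illustration of the general idea, not any system described in the report; the function, its inputs, and the numbers are all hypothetical.

```python
# Toy peak-shaving sketch (illustrative only): defer flexible load away from
# peak hours so total served demand stays under a grid cap.
def shave_peaks(demand, flexible, cap):
    """demand: hourly base load; flexible: hourly deferrable load; cap: max total."""
    deferred = 0.0
    served = []
    for base, flex in zip(demand, flexible):
        headroom = cap - base
        # Serve as much of this hour's flexible load, plus any backlog, as the cap allows.
        usable = max(0.0, min(headroom, flex + deferred))
        deferred += flex - usable
        served.append(base + usable)
    return served, deferred  # deferred > 0 means some load spills past the horizon

# 6-hour example: the evening peak at hours 2-3 would exceed the cap without shifting.
base = [50, 60, 95, 90, 55, 40]
flex = [10, 10, 10, 10, 10, 10]
print(shave_peaks(base, flex, cap=100.0))  # peak-hour flex load is served later
```

In practice, AI-based demand response replaces the fixed cap and greedy deferral above with forecasts and optimization, but the underlying objective, flattening the demand curve without dropping load, is the same.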