The many perks of using critical consumer user data for social benefit


Sushant Kumar at LiveMint: “Business models that thrive on user data have created profitable global technology companies. For comparison, the combined market capitalization of just three tech companies, Google (Alphabet), Facebook and Amazon, is higher than the total market capitalization of all listed firms in India. Almost 98% of Facebook’s revenue and 84% of Alphabet’s come from serving targeted advertising powered by data collected from users. No doubt, these tech companies provide valuable services to consumers. It is also true that profits are concentrated with private corporations, while the societal value for the contributors of that data, that is, the users, could be much more significant….

In the existing economic construct, private firms are able to deploy top scientists and sophisticated analytical tools to collect data, derive value and monetize the insights.

Imagine if personalization at this scale were available for more meaningful outcomes, such as administering personalized treatment for diabetes, recommending crop patterns, optimizing water management and providing access to credit for the unbanked. These socially beneficial applications of data can generate undisputedly massive value.

However, handling critical data with accountability to prevent misuse is a complex and expensive task. What’s more, private sector players have no incentive to share the data they collect. These challenges can be resolved by setting up specialized entities that manage data—collecting and analysing it, providing insights, and managing consent and access rights. These entities would function as trusted intermediaries with a public purpose, and may be named “data stewards”….(More)”.

See also: http://datastewards.net/ and https://datacollaboratives.org/
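
The data steward described above is, at bottom, an architectural idea: an intermediary that holds consent grants and access rights rather than monetizing the data itself. Purely as an illustrative sketch (the class, field names and expiry rule below are hypothetical and are not drawn from the LiveMint piece, datastewards.net or datacollaboratives.org), a minimal consent registry for such an intermediary might look like this:

```python
# Hypothetical sketch of a "data steward" consent registry: a trusted
# intermediary that records what each user has consented to and checks
# every access request against those grants before any data is released.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                # e.g. "diabetes-care", "crop-advice", "credit-access"
    granted_at: datetime
    expires_in_days: int = 90   # hypothetical default expiry

    def is_active(self, now: datetime) -> bool:
        return now < self.granted_at + timedelta(days=self.expires_in_days)


class DataSteward:
    """Illustrative trusted intermediary: it manages consent, not the raw data."""

    def __init__(self) -> None:
        self._consents: list[ConsentRecord] = []

    def grant(self, user_id: str, purpose: str) -> None:
        self._consents.append(ConsentRecord(user_id, purpose, datetime.utcnow()))

    def revoke(self, user_id: str, purpose: str) -> None:
        self._consents = [
            c for c in self._consents
            if not (c.user_id == user_id and c.purpose == purpose)
        ]

    def may_access(self, user_id: str, purpose: str) -> bool:
        """Allow analysis of a user's data only under a live, purpose-specific consent."""
        now = datetime.utcnow()
        return any(
            c.user_id == user_id and c.purpose == purpose and c.is_active(now)
            for c in self._consents
        )


steward = DataSteward()
steward.grant("user-42", "diabetes-care")
print(steward.may_access("user-42", "diabetes-care"))   # True
print(steward.may_access("user-42", "credit-access"))   # False
```

A real steward would also need audit logging, purpose limitation and legal accountability; the sketch only shows where a consent check would sit before any analysis is run.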

An Algorithm That Grants Freedom, or Takes It Away


Cade Metz and Adam Satariano at The New York Times: “…In Philadelphia, an algorithm created by a professor at the University of Pennsylvania has helped dictate the experience of probationers for at least five years.

The algorithm is one of many making decisions about people’s lives in the United States and Europe. Local authorities use so-called predictive algorithms to set police patrols, prison sentences and probation rules. In the Netherlands, an algorithm flagged welfare fraud risks. A British city rates which teenagers are most likely to become criminals.

Nearly every state in America has turned to this new sort of governance algorithm, according to the Electronic Privacy Information Center, a nonprofit dedicated to digital rights. Algorithm Watch, a watchdog in Berlin, has identified similar programs in at least 16 European countries.

As the practice spreads into new places and new parts of government, United Nations investigators, civil rights lawyers, labor unions and community organizers have been pushing back.

They are angered by a growing dependence on automated systems that are taking humans and transparency out of the process. It is often not clear how the systems are making their decisions. Is gender a factor? Age? ZIP code? It’s hard to say, since many states and countries have few rules requiring that algorithm-makers disclose their formulas.
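
To make concrete what questions such as “Is gender a factor? Age? ZIP code?” are asking, here is a deliberately invented toy score (not the Pennsylvania tool or any real instrument) showing how an undisclosed weight on a geographic feature can quietly tilt a recommendation:

```python
# Toy, invented risk score (not any jurisdiction's actual tool): it shows how
# an undisclosed weight on features like age or ZIP code can drive an opaque
# detain/release recommendation.
WEIGHTS = {                  # hypothetical weights a vendor might never disclose
    "prior_arrests": 0.9,
    "age_under_25": 0.6,
    "zip_code_flag": 0.7,    # geographic feature that can proxy for race and class
    "employed": -0.5,
}
THRESHOLD = 1.5              # hypothetical cutoff


def risk_score(defendant: dict) -> float:
    """Weighted sum of features; above the threshold, detention is recommended."""
    return sum(weight * float(defendant.get(feature, 0))
               for feature, weight in WEIGHTS.items())


defendant = {"prior_arrests": 1, "age_under_25": 1, "zip_code_flag": 1, "employed": 0}
score = risk_score(defendant)
print(f"score={score:.2f} -> {'detain' if score > THRESHOLD else 'release'}")
```

Because weights like these are rarely published, neither defendants nor judges may ever learn that a ZIP-code flag, which can proxy for race and class, is doing much of the work.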

They also worry that the biases — involving race, class and geography — of the people who create the algorithms are being baked into these systems, as ProPublica has reported. In San Jose, Calif., where an algorithm is used during arraignment hearings, an organization called Silicon Valley De-Bug interviews the family of each defendant, takes this personal information to each hearing and shares it with defenders as a kind of counterbalance to algorithms.

Two community organizations, the Media Mobilizing Project in Philadelphia and MediaJustice in Oakland, Calif., recently compiled a nationwide database of prediction algorithms. And Community Justice Exchange, a national organization that supports community organizers, is distributing a 50-page guide that advises organizers on how to confront the use of algorithms.

The algorithms are supposed to reduce the burden on understaffed agencies, cut government costs and — ideally — remove human bias. Opponents say governments haven’t shown much interest in learning what it means to take humans out of the decision making. A recent United Nations report warned that governments risked “stumbling zombie-like into a digital-welfare dystopia.”…(More)”.

Reconsidering Policy: Complexity, Governance and the State


Book by Kate Crowley, Jenny Stewart, Adrian Kay and Brian Head: “For nation-states, the contexts for developing and implementing policy have become more complex and demanding. Yet policy studies have not fully responded to the challenges and opportunities represented by these developments. Governance literature has drawn attention to a globalising and network-based policy world, but politics and the role of the state have been de-emphasised.

This book addresses this imbalance by reconsidering traditional policy-analytic concepts, and by redeveloping and extending new ones, in a melded approach defined as systemic institutionalism. This links policy with governance and the state and suggests how real-world issues might be substantively addressed….(More)”.

Transparent Lobbying and Democracy


Book by Šárka Laboutková, Vít Šimral and Petr Vymětal: “This book deals with the current, as yet unsolved, problem of the transparency of lobbying. Current theories and prevalent models of lobbying do not reflect the degree of transparency of lobbying activities, mainly because of the unclear distinction between corruption, lobbying in general and transparent lobbying. This book provides a comprehensive and structured perspective on transparency in lobbying. It delivers an interdisciplinary approach to the topic, creating a methodology for assessing the transparency of lobbying and its role in the democratization process, as well as a methodology for evaluating the main consequences of transparency. The new approach is applied to assess lobbying regulations in the countries of Central and Eastern Europe, and shows how lobbying in other regions of the world may also be assessed….(More)”.

Federal Agencies Use Cellphone Location Data for Immigration Enforcement


Byron Tau and Michelle Hackman at the Wall Street Journal: “The Trump administration has bought access to a commercial database that maps the movements of millions of cellphones in America and is using it for immigration and border enforcement, according to people familiar with the matter and documents reviewed by The Wall Street Journal.

The location data is drawn from ordinary cellphone apps, including those for games, weather and e-commerce, for which the user has granted permission to log the phone’s location.

The Department of Homeland Security has used the information to detect undocumented immigrants and others who may be entering the U.S. unlawfully, according to these people and documents.

U.S. Immigration and Customs Enforcement, a division of DHS, has used the data to help identify immigrants who were later arrested, these people said. U.S. Customs and Border Protection, another agency under DHS, uses the information to look for cellphone activity in unusual places, such as remote stretches of desert that straddle the Mexican border, the people said.

The federal government’s use of such data for law enforcement purposes hasn’t previously been reported.

Experts say the information amounts to one of the largest known troves of bulk data being deployed by law enforcement in the U.S.—and that the use appears to be on firm legal footing because the government buys access to it from a commercial vendor, just as a private company could, though its use hasn’t been tested in court.

“This is a classic situation where creeping commercial surveillance in the private sector is now bleeding directly over into government,” said Alan Butler, general counsel of the Electronic Privacy Information Center, a think tank that pushes for stronger privacy laws.

According to federal spending contracts, a division of DHS that creates experimental products began buying location data in 2017 from Venntel Inc. of Herndon, Va., a small company that shares several executives and patents with Gravy Analytics, a major player in the mobile-advertising world.

In 2018, ICE bought $190,000 worth of Venntel licenses. Last September, CBP bought $1.1 million in licenses for three kinds of software, including Venntel subscriptions for location data. 

The Department of Homeland Security and its components acknowledged buying access to the data, but wouldn’t discuss details about how they are using it in law-enforcement operations. People familiar with some of the efforts say it is used to generate investigative leads about possible illegal border crossings and for detecting or tracking migrant groups.

CBP has said it has privacy protections and limits on how it uses the location information. The agency says that it accesses only a small amount of the location data and that the data it does use is anonymized to protect the privacy of Americans….(More)”

If China valued free speech, there would be no coronavirus crisis


Verna Yu in The Guardian: “…Despite the flourishing of social media, information is more tightly controlled in China than ever. In 2013, an internal Communist party edict known as Document No 9 ordered cadres to tackle seven supposedly subversive influences on society. These included western-inspired notions of press freedom, “universal values” of human rights, civil rights and civic participation. Even within the Communist party, cadres are threatened with disciplinary action for expressing opinions that differ from the leadership.

Compared with 17 years ago, Chinese citizens enjoy even fewer rights of speech and expression. A few days after 34-year-old Li Wenliang posted a note in his medical school alumni social media group on 30 December, stating that seven workers from a local live-animal market had been diagnosed with an illness similar to Sars and were quarantined in his hospital, he was summoned by police. He was made to sign a humiliating statement saying he understood that if he “stayed stubborn and failed to repent and continue illegal activities, (he) will be disciplined by the law”….

Unless Chinese citizens’ freedom of speech and other basic rights are respected, such crises will only happen again. With a more globalised world, the magnitude may become even greater – the death toll from the coronavirus outbreak is already comparable to the total Sars death toll.

Human rights in China may appear to have little to do with the rest of the world, but as we have seen in this crisis, disaster could occur when China thwarts the freedoms of its citizens. Surely it is time the international community took this issue more seriously….(More)”.

Housing Search in the Age of Big Data: Smarter Cities or the Same Old Blind Spots?


Paper by Geoff Boeing et al: “Housing scholars stress the importance of the information environment in shaping housing search behavior and outcomes. Rental listings have increasingly moved online over the past two decades and, in turn, online platforms like Craigslist are now central to the search process. Do these technology platforms serve as information equalizers or do they reflect traditional information inequalities that correlate with neighborhood sociodemographics? We synthesize and extend analyses of millions of US Craigslist rental listings and find they supply significantly different volumes, quality, and types of information in different communities.

Technology platforms have the potential to broaden, diversify, and equalize housing search information, but they rely on landlord behavior and, in turn, likely will not reach this potential without a significant redesign or policy intervention. Smart cities advocates hoping to build better cities through technology must critically interrogate technology platforms and big data for systematic biases….(More)”.
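
The measurement question at the heart of the paper, whether platforms supply different volumes and quality of listing information across neighborhoods, can be illustrated with a toy comparison. The field names and the simple “completeness” metric below are invented for illustration and are not the authors’ methodology:

```python
# Toy illustration (not the paper's methodology): compare how much information
# rental listings carry in different neighborhoods by counting listings and
# measuring how many optional fields each listing actually fills in.
from collections import defaultdict

OPTIONAL_FIELDS = ["sqft", "photos", "pet_policy", "parking", "laundry"]

listings = [  # hypothetical scraped records
    {"neighborhood": "A", "rent": 2400, "sqft": 700, "photos": 12, "pet_policy": "cats ok"},
    {"neighborhood": "A", "rent": 2600, "sqft": 850, "photos": 8, "parking": "garage", "laundry": "in unit"},
    {"neighborhood": "B", "rent": 1100},
    {"neighborhood": "B", "rent": 950, "photos": 1},
]

by_neighborhood = defaultdict(list)
for listing in listings:
    by_neighborhood[listing["neighborhood"]].append(listing)

for hood, rows in sorted(by_neighborhood.items()):
    # completeness = average share of optional fields a listing provides
    completeness = sum(
        sum(field in row for field in OPTIONAL_FIELDS) / len(OPTIONAL_FIELDS)
        for row in rows
    ) / len(rows)
    print(f"{hood}: {len(rows)} listings, average completeness {completeness:.0%}")
```

At the scale of millions of listings, disparities in measures like these are the kind of systematic bias the authors argue smart-city advocates must interrogate before treating platform data as a neutral picture of the housing market.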

Re-thinking Public Innovation, Beyond Innovation in Government


Jocelyne Bourgon at Dubai Policy Review: “The situation faced by public servants and public sector leaders today may not be more challenging in absolute terms than in previous generations, but it is certainly different. The problems societies face today stem from a world characterised by increasing complexity, hyper-connectivity and a high level of uncertainty. In this context, the public sector’s role in developing innovative solutions is critical. Despite the need for public innovation, public servants (when asked to discuss the challenges they face in New Synthesis labs and workshops) tend to present a narrow perspective, rarely going beyond the boundary of their respective units. While recent public sector reforms have encouraged a drive for efficiency and productivity, they have also generated a narrow and sometimes distorted view of the scale of the role of government in society. Ideas and principles matter. The way one thinks has a direct impact on the solutions that will be found and the results that will be achieved.

Innovation in government has received much attention over the years. For the most part, the focus has been introspective, giving special attention to the modernisation of public sector systems and practices as well as the service delivery functions of government. Because the focus of attention in these conversations is on innovation in government, they may have missed the most important contributions of government to public innovation….

I define public innovation as “innovative solutions serving a public purpose that require the use of public means”. What distinguishes public innovation from social innovation is the intimate link to government actions and the use of instruments of the State. From this perspective, far from being risk averse, the State is the ultimate risk taker in society. Government takes risks on a scale that no other sector or agent in society could take on and intervenes in areas where the forces of the market or the capacity of civil society would be unable to go. This broader perspective reveals some of the distinctive characteristics of public innovation….(More)”

Re-imagining “Action Research” as a Tool for Social Innovation and Public Entrepreneurship


Stefaan G. Verhulst at The GovLab: “We live in challenging times. From climate change to economic inequality and forced migration, the difficulties confronting decision-makers are unprecedented in their variety, as well as in their complexity and urgency. Our standard policy toolkit seems stale and ineffective while existing governance institutions are increasingly outdated and distrusted.

To tackle today’s challenges, we need not only new solutions but new ways of arriving at solutions. In particular, we need fresh research methodologies that can provide actionable insights on 21st century conditions. Such methodologies would allow us to redesign how decisions are made, how public services are offered, and how complex problems are solved around the world. 

Rethinking research is a vast project, with multiple components. This new essay focuses on one particular area of research: action research. In the essay, I first explain what we mean by action research, and also explore some of its potential. I subsequently argue that, despite that potential, action research is often limited as a method because it remains embedded in past methodologies; I attempt to update both its theory and practice for the 21st century.

Although this article represents only a beginning, my broader goal is to re-imagine the role of action research for social innovation, and to develop an agenda that could provide for what Amar Bhide calls “practical knowledge” at all levels of decision making in a systematic, sustainable, and responsible manner.  (Full Essay Here).”

Astroturfing Is Bad But It's Not the Whole Problem


Beth Noveck at NextGov: “In November 2019, Securities and Exchange Commission Chairman Jay Clayton boasted that draft regulations requiring proxy advisors to run their recommendations past the companies they are evaluating before giving that advice to their clients received dozens of letters of support from ordinary Americans. But the letters he cited turned out to be fakes, sent by corporate advocacy groups and signed with the names of people who never saw the comments or who do not exist at all.

When interest groups manufacture the appearance that comments come from the “ordinary public,” it’s known as astroturfing. The practice is the subject of today’s House Committee on Financial Services Subcommittee on Oversight and Investigations hearing, entitled “Fake It till They Make It: How Bad Actors Use Astroturfing to Manipulate Regulators, Disenfranchise Consumers, and Subvert the Rulemaking Process.” 

Of course, commissioners who cherry-pick from among the public comments looking for information to prove themselves right should be called out, and it is tempting to use the occasion to embarrass those who do, especially when they are from the other party. But focusing on astroturfing distracts attention from the more salient and urgent problem: the failure to obtain the best possible evidence by creating effective public participation opportunities in federal rulemaking. 

Thousands of federal regulations are enacted every year that touch every aspect of our lives, and under the 1946 Administrative Procedure Act, the public has a right to participate.

Participation in rulemaking advances both the legitimacy and the quality of regulations by enabling agencies—and the congressional committees that oversee them—to obtain information from a wider audience of stakeholders, interest groups, businesses, nonprofits, academics and interested individuals. Participation also provides a check on the rulemaking process, helping to ensure public scrutiny.

But the shift over the last two decades to a digital process, where people submit comments via regulations.gov, has made commenting easier yet has also inadvertently opened the floodgates to voluminous, duplicative and, yes, even “fake” comments, making it harder for agencies to extract the information needed to inform the rulemaking process.

Although many agencies receive only a handful of comments, some receive voluminous responses, thanks to this ease of digital commenting. In 2017, when the Federal Communications Commission sought to repeal an earlier Obama-era rule requiring internet service providers to observe net neutrality, the agency received 22 million comments in response. 

There is a remedy. Tools have evolved to make quick work of large data stores….(More)”. See also https://congress.crowd.law/
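
One family of such tools is near-duplicate detection, which collapses mass-mailed form-letter comments into clusters so reviewers can read one representative per campaign instead of every copy. The normalization rule and grouping below are an illustrative assumption, not a description of any agency’s actual pipeline:

```python
# Illustrative sketch: cluster near-identical public comments so reviewers can
# read one representative per form-letter campaign instead of every copy.
import re
from collections import defaultdict


def fingerprint(comment: str) -> str:
    """Normalize a comment so trivially edited copies produce the same key."""
    text = re.sub(r"[^a-z0-9\s]", " ", comment.lower())  # drop punctuation/case
    return " ".join(text.split())                        # collapse whitespace


comments = [  # hypothetical submissions to a docket
    "I support net neutrality. Please keep the open internet rules!",
    "I SUPPORT net neutrality -- please keep the open internet rules.",
    "Repeal the rules; they burden small providers.",
]

clusters = defaultdict(list)
for index, comment in enumerate(comments):
    clusters[fingerprint(comment)].append(index)

for members in clusters.values():
    representative = comments[members[0]]
    print(f"{len(members)} comment(s) in cluster: {representative[:55]!r}")
```

In practice, analysts would reach for more robust techniques such as shingling or MinHash to catch paraphrased copies; the point here is only the triage pattern that makes a docket of 22 million comments tractable.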