7 things we’ve learned about computer algorithms


Aaron Smith at Pew Research Center: “Algorithms are all around us, using massive stores of data and complex analytics to make decisions with often significant impacts on humans – from choosing the content people see on social media to judging whether a person is a good credit risk or job candidate. Pew Research Center released several reports in 2018 that explored the role and meaning of algorithms in people’s lives today. Here are some of the key themes that emerged from that research.

  1. Algorithmically generated content platforms play a prominent role in Americans’ information diets. Sizable shares of U.S. adults now get news on sites like Facebook or YouTube that use algorithms to curate the content they show to their users. A study by the Center found that 81% of YouTube users say they at least occasionally watch the videos suggested by the platform’s recommendation algorithm, and that these recommendations encourage users to watch progressively longer content as they click through the videos suggested by the site.
  2. The inner workings of even the most common algorithms can be confusing to users. Facebook is among the most popular social media platforms, but roughly half of Facebook users – including six-in-ten users ages 50 and older – say they do not understand how the site’s algorithmically generated news feed selects which posts to show them. And around three-quarters of Facebook users are not aware that the site automatically estimates their interests and preferences based on their online behaviors in order to deliver them targeted advertisements and other content.
  3. The public is wary of computer algorithms being used to make decisions with real-world consequences. The public expresses widespread concern about companies and other institutions using computer algorithms in situations with potential impacts on people’s lives. More than half (56%) of U.S. adults think it is unacceptable to use automated criminal risk scores when evaluating people who are up for parole. And 68% think it is unacceptable for companies to collect large quantities of data about individuals for the purposes of offering them deals or other financial incentives. When asked to elaborate about their worries, many feel that these programs violate people’s privacy, are unfair, or simply will not work as well as decisions made by humans….(More)”.

Technology and National Security


Book from the Aspen Strategy Group: “This edition is a collection of papers commissioned for the 2018 Aspen Strategy Group Summer Workshop, a bipartisan meeting of national security experts, academics, private sector leaders, and technologists. The chapters in this volume evaluate the disruptive effects of technological change on the US military, economic power, and democratic governance. They highlight possible avenues for US defense modernization, the impact of disinformation tactics and hybrid warfare on democratic institutions, and the need for a reinvigorated innovation triangle composed of the US government, academia, and private corporations. The executive summary offers practical recommendations to meet the daunting challenges this technological era imposes….(More)”.

Congress needs your input (but don’t call it crowdsourcing)


Lorelei Kelly at TechCrunch: “As it stands, Congress does not have the technical infrastructure to ingest all this new input in any systematic way. Individual members lack a method to filter signal from noise, or trusted, credible knowledge from malicious falsehood and hype.

What Congress needs is curation, not just more information

Curation means discovering, gathering and presenting content. This word is commonly thought of as the job of librarians and museums, places we go to find authentic and authoritative knowledge. Similarly, Congress needs methods to sort and filter information as required within the workflow of lawmaking. From personal offices to committees, members and their staff need context and informed judgement based on broadly defined expertise. The input can come from individuals or institutions. It can come from the wisdom of colleagues in Congress or across the federal government. Most importantly, it needs to be rooted in local constituents and it needs to be trusted.

This is not to say that crowdsourcing is unimportant to our governing system. But input methods, digital ones included, must demonstrate informed and accountable deliberation over time. Governing is the curation part of democracy. Governing requires public review, understanding of context, explanation and measurement of value for the nation as a whole. We are already thinking about how to create an ethical blockchain. Why not give the same attention to our most important democratic institution?

Governing requires trade-offs that elicit emotion and sometimes anger. But as in life, emotions require self-regulation. In Congress, this means compromise and negotiation. In fact, one of the reasons Congress is so stuck is that its own deliberative process has declined at every level. Besides the official committee process stalling out, members have few opportunities to be together as colleagues, and public space is increasingly antagonistic and dangerous.

With so few options, members are left with blunt communications objects like clunky mail management systems and partisan talking points. This means that lawmakers don’t use public input for policy formation as much as to surveil public opinion.

Any path forward into the 21st century must include new methods to (1) curate and hear from the public in a way that informs policy AND (2) incorporate real data into a results-driven process.

While our democracy is facing unprecedented stress, there are bright spots. Congress is again dedicating resources to an in-house technology assessment capacity. Earlier this month, the new 116th Congress created a Select Committee on the Modernization of Congress. It will be chaired by Rep. Derek Kilmer (D-WA). Then the Open Government Data Act became law. This law could scale access to government data to unprecedented levels. It will require that all public-facing federal data be machine-readable and reusable. This is a move in the right direction, and now comes the hard part.

Marci Harris, the CEO of civic startup Popvox, put it well, “The Foundations for Evidence-Based Policymaking (FEBP) Act, which includes the OPEN Government Data Act, lays groundwork for a more effective, accountable government. To realize the potential of these new resources, Congress will need to hire tech literate staff and incorporate real data and evidence into its oversight and legislative functions.”

In forsaking its own capacity for complex problem solving, Congress has become non-competitive in the creative process that moves society forward. During this same time period, all eyes turned toward Silicon Valley to fill the vacuum. With mass connection platforms and unlimited personal freedom, it seemed direct democracy had arrived. But that’s proved a bust. If we go by current trends, entrusting democracy to Silicon Valley will give us perfect laundry and fewer voting rights. Fixing democracy is a whole-of-nation challenge that Congress must lead.

Finally, we “the crowd” want a more effective governing body that incorporates our experience and perspective into the lawmaking process, not just feel-good form letters thanking us for our input. We also want a political discourse grounded in facts. A “modern” Congress will provide both, and now we have the institutional foundation in place to make it happen….(More)”.

From Human Rights Aspirations to Enforceable Obligations by Non-State Actors in the Digital Age: The Example of Internet Governance and ICANN


Paper by Monika Zalnieriute: “As the global policy-making capacity and influence of non-state actors in the digital age rapidly increase, the protection of fundamental human rights by private actors has become one of the most pressing issues in Global Governance. This article combines business and human rights and digital constitutionalism discourses and uses the changing institutional context of Internet Governance and the Internet Corporation for Assigned Names and Numbers (‘ICANN’) as an example to argue that economic incentives act against the voluntary protection of human rights by informal actors and regulatory structures in the digital era. It further contends that the global policy-making role and increasing regulatory power of informal actors such as ICANN necessitate a reframing of their legal duties by subjecting them to directly binding human rights obligations in international law.

The article argues that such reframing is particularly important in the information age for three reasons. Firstly, it is needed to rectify an imbalance between hard legal commercial obligations and human rights soft law. This imbalance is well reflected in ICANN’s policies. Secondly, binding obligations would ensure that individuals whose human rights have been affected can access an effective remedy. This is not envisaged under the new ICANN Bylaw on human rights precisely because of the fuzziness around the nature of ICANN’s obligations to respect internationally recognized human rights in its policies. Finally, the article suggests that because private actors such as ICANN are themselves engaging in the balancing exercise around such rights, an explicit recognition of their human rights obligations is crucial for the future development of access to justice in the digital age….(More)”.

The Discrete Charm of the Machine: Why the World Became Digital


Book by Ken Steiglitz: “A few short decades ago, we were informed by the smooth signals of analog television and radio; we communicated using our analog telephones; and we even computed with analog computers. Today our world is digital, built with zeros and ones. Why did this revolution occur? The Discrete Charm of the Machine explains, in an engaging and accessible manner, the varied physical and logical reasons behind this radical transformation.

The spark of individual genius shines through this story of innovation: the stored program of Jacquard’s loom; Charles Babbage’s logical branching; Alan Turing’s brilliant abstraction of the discrete machine; Harry Nyquist’s foundation for digital signal processing; Claude Shannon’s breakthrough insights into the meaning of information and bandwidth; and Richard Feynman’s prescient proposals for nanotechnology and quantum computing. Ken Steiglitz follows the progression of these ideas in the building of our digital world, from the internet and artificial intelligence to the edge of the unknown. Are questions like the famous traveling salesman problem truly beyond the reach of ordinary digital computers? Can quantum computers transcend these barriers? Does a mysterious magical power reside in the analog mechanisms of the brain? Steiglitz concludes by confronting the moral and aesthetic questions raised by the development of artificial intelligence and autonomous robots.

The Discrete Charm of the Machine examines why our information technology, the lifeblood of our civilization, became digital, and challenges us to think about where its future trajectory may lead….(More)”.

Achieving Digital Permanence


Raymond Blum with Betsy Beyer at ACM Queue: “Digital permanence has become a prevalent issue in society. This article focuses on the forces behind it and some of the techniques to achieve a desired state in which “what you read is what was written.” While techniques that can be imposed as layers above basic data stores—blockchains, for example—are valid approaches to achieving a system’s information assurance guarantees, this article won’t discuss them.

First, let’s define digital permanence and the more basic concept of data integrity.

Data integrity is the maintenance of the accuracy and consistency of stored information. Accuracy means that the data is stored as the set of values that were intended. Consistency means that these stored values remain the same over time—they do not unintentionally waver or morph as time passes.

Digital permanence refers to the techniques used to anticipate and then meet the expected lifetime of data stored in digital media. Digital permanence not only considers data integrity, but also targets guarantees of relevance and accessibility: the ability to recall stored data and to recall it with predicted latency and at a rate acceptable to the applications that require that information.

To illustrate the aspects of relevance and accessibility, consider two counterexamples: journals that were safely stored redundantly on Zip drives or punch cards may as well not exist if the hardware required to read the media into a current computing system isn’t available. Nor is it very useful to have receipts and ledgers stored on a tape medium that will take eight days to read in when you need the information for an audit on Thursday.
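
To make these definitions concrete, here is a minimal sketch of the checksum idea behind “what you read is what was written.” This is our illustration, not code from the article; the function names and the in-memory dict standing in for a data store are assumptions for the example.

```python
import hashlib

def write_record(store: dict, key: str, data: bytes) -> None:
    # Record a SHA-256 digest alongside the bytes at write time.
    store[key] = (data, hashlib.sha256(data).hexdigest())

def read_record(store: dict, key: str) -> bytes:
    # Return the bytes only if they still match the write-time digest.
    data, expected = store[key]
    if hashlib.sha256(data).hexdigest() != expected:
        raise IOError(f"integrity check failed for {key!r}")
    return data

store = {}
write_record(store, "ledger-2019-01", b"opening balance: 1000")
assert read_record(store, "ledger-2019-01") == b"opening balance: 1000"
```

Accuracy is checked against the digest captured when the values were intended; consistency over time is exactly what a failed check reveals has been lost.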

The Multiple Facets of Digital Permanence

Human memory is the most subjective record imaginable. Common adages and clichés such as “He said, she said,” “IIRC (If I remember correctly),” and “You might recall” recognize the truth of memories—that they are based only on fragments of the one-time subjective perception of any objective state of affairs. What’s more, research indicates that people alter their memories over time. Over the years, as the need to provide a common ground for actions based on past transactions arises, so does the need for an objective record of fact—an independent “true” past. These records must be both immutable to a reasonable degree and durable. Media such as clay tablets, parchment, photographic prints, and microfiche became popular because they satisfied the “write once, read many” requirement of society’s record keepers.

Information storage in the digital age has evolved to fit the scale of access (frequent) and volume (high) by moving to storage media that record and deliver information in an almost intangible state. Such media have distinct advantages: electrical impulses and the polarity of magnetized ferric compounds can be moved around at great speed and density. These media, unfortunately, also score higher in another measure: fragility. Paper and clay can survive large amounts of neglect and punishment, but a stray electromagnetic discharge or microscopic rupture can render a digital library inaccessible or unrecognizable.

It stands to reason that storing permanent records in some immutable and indestructible medium would be ideal—something that, once altered to encode information, could never be altered again, either by an overwrite or destruction. Experience shows that such ideals are rarely realized; with enough force and will, the hardest stone can be broken and the most permanent markings defaced.

In considering and ensuring digital permanence, you want to guard against two different failures: the destruction of the storage medium, and a loss of the integrity or “truthfulness” of the records….(More)”.
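
As a sketch of how one might guard against both failures at once (again our illustration under stated assumptions, not the authors’ code), combine redundancy across independent media with write-time digests: replicas protect against destruction of any one medium, and the digest identifies which surviving copy is still truthful, so damaged copies can be repaired.

```python
import hashlib

def replicate(replicas: list, data: bytes) -> str:
    # Write the same bytes to every replica; return the write-time digest.
    for replica in replicas:
        replica["blob"] = data
    return hashlib.sha256(data).hexdigest()

def read_with_repair(replicas: list, digest: str) -> bytes:
    # Serve from the first replica whose contents still match the digest.
    good = next((r["blob"] for r in replicas
                 if hashlib.sha256(r["blob"]).hexdigest() == digest), None)
    if good is None:
        raise IOError("all replicas corrupted: record unrecoverable")
    for replica in replicas:
        # Repair any replica that has drifted from the recorded digest.
        if hashlib.sha256(replica["blob"]).hexdigest() != digest:
            replica["blob"] = good
    return good

disks = [{}, {}, {}]                       # three independent media
d = replicate(disks, b"receipt #42")
disks[0]["blob"] = b"receipt #43"          # simulate a stray discharge
assert read_with_repair(disks, d) == b"receipt #42"
assert disks[0]["blob"] == b"receipt #42"  # the damaged copy was repaired
```

Real systems spread replicas across machines and sites rather than dicts, but the division of labor is the same: redundancy answers medium loss, digests answer integrity loss.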

Decoding Algorithms


Macalester University: “Ada Lovelace probably didn’t foresee the impact of the mathematical formula she published in 1843, now considered the first computer algorithm.

Nor could she have anticipated today’s widespread use of algorithms, in applications as different as the 2016 U.S. presidential campaign and Mac’s first-year seminar registration. “Over the last decade algorithms have become embedded in every aspect of our lives,” says Shilad Sen, professor in Macalester’s Math, Statistics, and Computer Science (MSCS) Department.

How do algorithms shape our society? Why is it important to be aware of them? And for readers who don’t know, what is an algorithm, anyway?…(More)”.
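
For readers in that last camp, an algorithm is nothing more exotic than a finite, unambiguous recipe of steps. A deliberately tiny illustration of our own (not from the Macalester piece):

```python
def largest(numbers):
    # Scan once, keeping the biggest value seen so far.
    best = numbers[0]
    for n in numbers[1:]:
        if n > best:
            best = n
    return best

print(largest([3, 41, 8, 2]))  # 41: the same fixed steps work on any input
```

Newsfeed and recommendation algorithms differ from this one in scale and stakes, not in kind.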

Leveraging and Sharing Data for Urban Flourishing


Testimony by Stefaan Verhulst before New York City Council Committee on Technology and the Commission on Public Information and Communication (COPIC): “We live in challenging times. From climate change to economic inequality, the difficulties confronting New York City, its citizens, and decision-makers are unprecedented in their variety, and also in their complexity and urgency. Our standard policy toolkit increasingly seems stale and ineffective. Existing governance institutions and mechanisms seem outdated and distrusted by large sections of the population.

To tackle today’s problems we need not only new solutions but also new methods for arriving at solutions. Data can play a central role in this task. Access to and the use of data in a trusted and responsible manner is central to meeting the challenges we face and enabling public innovation.

This hearing, called by the Technology Committee and the Commission on Public Information and Communication, is therefore timely and very important. It is my firm belief that rapid progress on developing an effective data-sharing framework is among the most important steps our New York City leaders can take to tackle the myriad of 21st-century challenges….

I am joined today by some of my distinguished NYU colleagues, Prof. Julia Lane and Prof. Julia Stoyanovich, who have worked extensively on the technical and privacy challenges associated with data sharing. To avoid duplicating our testimonies, I won’t focus on issues of privacy, trust, and how to establish a responsible data-sharing infrastructure, even though these are central considerations for the type of data-driven approaches I will discuss. I am, of course, happy to elaborate on these topics during the question and answer session.

Instead, I want to focus on four core issues associated with data collaboration. I phrase these issues as answers to four questions. For each of these questions, I also provide a set of recommended actions that this Committee could consider undertaking or studying.

The four core questions are:

  • First, why should NYC care about data and data sharing?
  • Second, if you build a data-sharing framework, will they come?
  • Third, how can we best engage the private sector when it comes to sharing and using their data?
  • And fourth, is technology the main (or best) answer?…(More)”.

Digital mile-markers provide navigation in cities


Springwise: “UK-based Maynard Design Consultancy has developed a system to help people navigate the changing landscape of city neighbourhoods. A prototype of a wayfinding solution for districts in London combines smart physical markers and navigational apps. The physical markers, inspired by traditional mile markers, include a digital screen. They provide real-time information, including daily news and messages from local businesses. The markers also track how people use the park, providing valuable information to the city and urban planners. The partnering apps provide up-to-date information about the changing environment in the city, such as on-going construction and delays due to large-scale events.

Unlike traditional, smartphone-based navigational apps, this concept uses technology to help us reconnect with our surroundings, Maynard Design said.

The proposal won the Smart London District Challenge competition set by the Institute for Sustainability. Maynard is currently looking for partner companies to pilot its concept.

Takeaway: The Maynard design represents the latest efforts to use smartphones to amplify public safety announcements, general information and local businesses. The concept moves past traditional wayfinding markers to link people to a smart-city grid. By tracking how people use parks and other urban spaces, the markers will provide valuable insight for city officials. We expect more innovations like this as cities increasingly move toward seamless communication between services and city residents, aided by smart technologies. Over the past several months, we have seen technology to connect drivers to parking spaces and a prototype pavement that can change functions based on people’s needs….(More)”

The Future of FOIA in an Open Government World: Implications of the Open Government Agenda for Freedom of Information Policy and Implementation


Paper by Daniel Berliner, Alex Ingrams and Suzanne J. Piotrowski: “July 4, 2016 marked the fiftieth anniversary of the 1966 Freedom of Information Act of the United States. Freedom of Information (FOI) has become a vital element of the American political process, gained recognition as a core value of democracy, and helped to inspire similar laws and movements around the world. FOI has always faced myriad challenges, including resistance, evasion, and poor implementation and enforcement. Yet the last decade has brought a change of a very different form to the evolution of FOI policy—the emergence of another approach to transparency that is in some ways similar to FOI, and in other ways distinct: open government. The open government agenda, driven by technological developments and motivated by a broader conception of transparency, today rivals, or by some measures even eclipses, FOI in terms of political attention and momentum. What have been the consequences of these trends? How does the advent of new technologies and new agendas shape the transparency landscape?

The political and policy contexts for FOI have fundamentally shifted due to the rise of the open government reform agenda. FOI was at one point the primary tool used to promote governance transparency. FOI is now just one good governance tool in an increasingly crowded field of transparency policy areas. Focus is increasingly shifting toward technology-enabled open data reforms. While many open government reformers see these as positive developments, many traditional FOI proponents have raised concerns. With a few notable exceptions, the academic literature has been silent on this issue. We offer a systematic framework for understanding the potential consequences—both positive and negative—of the open government agenda for FOI policy and implementation….(More)”.