Stefaan Verhulst
Paper by Monika Zalnieriute: “As the global policy-making capacity and influence of non-state actors in the digital age is rapidly increasing, the protection of fundamental human rights by private actors becomes one of the most pressing issues in Global Governance. This article combines business and human rights and digital constitutionalism discourses and uses the changing institutional context of Internet Governance and Internet Corporation for Assigned Names and Numbers (‘ICANN’) as an example to argue that economic incentives act against the voluntary protection of human rights by informal actors and regulatory structures in the digital era. It further contends that the global policy-making role and increasing regulatory power of informal actors such as ICANN necessitates reframing of their legal duties by subjecting them to directly binding human rights obligations in international law.
The article argues that such reframing is particularly important in the information age for three reasons. Firstly, it is needed to rectify an imbalance between hard legal commercial obligations and human rights soft law. This imbalance is well reflected in…(More)”.
Book by Ken Steiglitz: “A few short decades ago, we were informed by the smooth signals of analog television and radio; we communicated using our analog telephones
The spark of individual genius shines through this story of innovation: the stored program of Jacquard’s loom; Charles Babbage’s logical branching; Alan Turing’s brilliant abstraction of the discrete machine; Harry Nyquist’s foundation for digital signal processing; Claude Shannon’s breakthrough insights into the meaning of information and bandwidth; and Richard Feynman’s prescient proposals for nanotechnology and quantum computing. Ken Steiglitz follows the progression of these ideas in the building of our digital world, from the internet and artificial intelligence to the edge of the unknown. Are questions like the famous traveling salesman problem truly beyond the reach of ordinary digital computers? Can quantum computers transcend these barriers? Does a mysterious magical power reside in the analog mechanisms of the brain? Steiglitz concludes by confronting the moral and aesthetic questions raised by the development of artificial intelligence and autonomous robots.
The Discrete Charm of the Machine examines why our information technology, the lifeblood of our civilization, became…(More)”.
Article by Raymond Blum with Betsy Beyer at ACM:
First, let’s define digital permanence and the more basic concept of data integrity.
Data integrity is the maintenance of the accuracy and consistency of stored information. Accuracy means that the data is stored as the set of values that were intended. Consistency means that these stored values remain the same over time—they do not unintentionally waver or morph as time passes.
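A common way to operationalize both properties is to record a cryptographic digest alongside the data at write time and re-verify it on every read. A minimal sketch in Python (the record and function names are illustrative, not from the article):

```python
import hashlib

def digest(data: bytes) -> str:
    # Accuracy: the digest fingerprints the exact set of values that were intended.
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected: str) -> bool:
    # Consistency: re-hashing later detects any unintended drift in the stored bytes.
    return digest(data) == expected

record = b"ledger entry 42: credit 100.00"
fingerprint = digest(record)                    # stored alongside the record
assert verify(record, fingerprint)              # values intact
assert not verify(record + b"!", fingerprint)   # any morphing over time is detected
```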
Digital permanence refers to the techniques used to anticipate and then meet the expected lifetime of data stored in digital media. Digital permanence not only considers data integrity, but also targets guarantees of relevance and accessibility: the ability to recall stored data and to recall it with predicted latency and at a rate acceptable to the applications that require that information.
To illustrate the aspects of relevance and accessibility, consider two counterexamples: journals that were safely stored redundantly on Zip drives or punch cards may as well not exist if the hardware required to read the media into a current computing system isn’t available. Nor is it very useful to have receipts and ledgers stored on a tape medium that will take eight days to read in when you need the information for an audit on Thursday.
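The accessibility requirement can be made concrete with simple arithmetic: the medium's sustained read rate, multiplied by the time available, must cover the stored volume. A back-of-the-envelope sketch (the figures are illustrative, not from the article):

```python
def restore_hours(volume_gb: float, read_rate_mb_s: float) -> float:
    # Hours needed to read the full archive back at the medium's sustained rate.
    return (volume_gb * 1024 / read_rate_mb_s) / 3600

# A 10 TB archive on a medium sustaining 2 MB/s blows past a 48-hour audit
# deadline, while the same data at 300 MB/s is back within a working day.
slow = restore_hours(10_000, 2.0)
fast = restore_hours(10_000, 300.0)
assert slow > 48 and fast < 48
```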
The Multiple Facets of Digital Permanence
Human memory is the most subjective record imaginable. Common adages and clichés such as “He said, she said,” “IIRC (If I remember correctly),” and “You might recall” recognize the truth about memories—that they are based only on fragments of the one-time subjective perception of any objective state of affairs. What’s more, research indicates that people alter their memories over time. Over the years, as the need to provide a common ground for actions based on past transactions arises, so does the need for an objective record of fact—an independent “true” past. These records must be both immutable to a reasonable degree and durable. Media such as clay tablets, parchment, photographic prints, and microfiche became popular because they satisfied the “write once, read many” requirement of society’s record keepers.
Information storage in the digital age has evolved to fit the scale of access (frequent) and volume (high) by moving to storage media that record and deliver information in an almost intangible state. Such media have distinct advantages: electrical impulses and the polarity of magnetized ferric compounds can be moved around at great speed and density. These media, unfortunately, also score higher in another measure: fragility. Paper and clay can survive large amounts of neglect and punishment, but a stray electromagnetic discharge or microscopic rupture can render a digital library inaccessible or unrecognizable.
It stands to reason that storing permanent records in some immutable and indestructible medium would be ideal—something that, once altered to encode information, could never be altered again, either by an overwrite or destruction. Experience shows that such ideals are rarely realized; with enough force and will, the hardest stone can be broken and the most permanent markings defaced.
In considering and ensuring digital permanence, you want to guard against two different failures: the destruction of the storage medium, and a loss of the integrity or “truthfulness” of the records.
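These two failure modes suggest two complementary defenses: redundant copies guard against destruction of any single medium, and digests guard against silent corruption of a record's content. A toy sketch combining both, assuming replicas on independent media (all names here are illustrative, not the authors' design):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def store(data: bytes, copies: int = 3):
    # Defense 1: survive loss of any single medium by keeping several replicas.
    return [bytearray(data) for _ in range(copies)], fingerprint(data)

def recover(replicas, expected: str) -> bytes:
    # Defense 2: integrity — return only a replica whose digest still matches.
    for r in replicas:
        if fingerprint(bytes(r)) == expected:
            return bytes(r)
    raise IOError("all replicas corrupted")

replicas, fp = store(b"deed of sale, 1843")
replicas[0][0] ^= 0xFF                                 # a stray bit flip destroys one copy...
assert recover(replicas, fp) == b"deed of sale, 1843"  # ...but the record survives
```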
Macalester College: “Ada Lovelace probably didn’t foresee the impact of the mathematical formula she published in 1843, now considered the first computer algorithm.
Nor could she have anticipated today’s widespread use of algorithms, in applications as different as the 2016 U.S. presidential campaign and Mac’s first-year seminar registration. “Over the last decade algorithms have become embedded in every aspect of our lives,” says Shilad Sen, professor in Macalester’s Math, Statistics, and Computer Science (MSCS) Department.
How do algorithms shape our society? Why is it important to be aware of them? And for readers who don’t know, what is an algorithm, anyway?…(More)”.
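For readers wondering what an algorithm actually looks like: Lovelace's 1843 Note G described a step-by-step procedure for computing Bernoulli numbers on Babbage's Analytical Engine. The same idea expressed today, as a short Python sketch using the standard recurrence (not a transcription of her diagram):

```python
from fractions import Fraction
from math import comb

def bernoulli(m: int) -> Fraction:
    # An algorithm is a finite, unambiguous recipe. Here: the recurrence
    # B_n = -1/(n+1) * sum_{k<n} C(n+1, k) * B_k, computed with exact fractions.
    B = [Fraction(1)]
    for n in range(1, m + 1):
        B.append(-sum(comb(n + 1, k) * B[k] for k in range(n)) / (n + 1))
    return B[m]

assert bernoulli(2) == Fraction(1, 6)
assert bernoulli(4) == Fraction(-1, 30)
```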
Testimony by Stefaan Verhulst before New York City Council Committee on Technology and the Commission on Public Information and Communication (COPIC): “We live in challenging times. From climate change to economic inequality, the difficulties confronting New York City, its citizens, and decision-makers are unprecedented in their variety, and also in their complexity and urgency. Our standard policy toolkit increasingly seems stale and ineffective. Existing governance institutions and mechanisms seem outdated and distrusted by large sections of the population.
To tackle today’s problems we need not only new solutions but also new methods for arriving at solutions. Data can play a central role in this task. Access to and the use of data in a trusted and responsible manner is central to meeting the challenges we face and enabling public innovation.
This hearing, called by the Technology Committee and the Commission on Public Information and Communication, is therefore timely and very important. It is my firm belief that rapid progress on developing an effective data sharing framework is among the most important steps our New York City leaders can take to tackle the myriad of 21st-century challenges.
I am joined today by some of my distinguished NYU colleagues, Prof. Julia Lane and Prof. Julia Stoyanovich, who have worked extensively on the technical and privacy challenges associated with data sharing. I will, therefore, avoid duplicating our testimonies and won’t focus on issues of privacy, trust and how to establish a responsible data sharing infrastructure, while these are central considerations for the type of data-driven approaches I will discuss. I am, of course, happy to elaborate on these topics during the question and answer session.
Instead, I want to focus on four core issues associated with data collaboration. I phrase these issues as answers to four questions. For each of these questions, I also provide a set of recommended actions that this Committee could consider undertaking or studying.
The four core questions are:
- First, why should NYC care about data and data sharing?
- Second, if you build a data-sharing framework, will they come?
- Third, how can we best engage the private sector when it comes to sharing and using their data?
- And fourth, is technology the main (or best) answer?…(More)”.
Springwise: “UK-based Maynard Design Consultancy has developed a system to help people navigate the changing landscape of cities.
Unlike traditional, smartphone-based navigation apps, this concept uses technology to help us reconnect with our surroundings, Maynard Design said.
The proposal won the Smart London District Challenge competition set by the Institute for Sustainability. Maynard is currently looking for partner companies to pilot its concept.
Takeaway: The Maynard design represents the latest efforts to use smartphones to amplify public safety announcements, general information and local businesses. The concept moves past traditional wayfinding markers to link people to a smart-city grid. By tracking how people use parks and other urban spaces, the markers will provide valuable insight for city officials. We expect more innovations like this as cities increasingly move toward seamless communication between services and city residents, aided by smart technologies. Over the past several months, we have seen technology to connect drivers to parking spaces and a prototype pavement that can change functions based on people’s needs…(More)”.
The political and policy contexts for FOI have fundamentally shifted with the rise of the open government reform agenda. Once the primary tool for promoting transparency in governance, FOI is now just one good-governance instrument in an increasingly crowded field of transparency policy areas.
Report by James G. McGann: “The Think Tanks and Civil Societies Program (TTCSP) of the Lauder Institute at the University of Pennsylvania conducts research on the role policy institutes play in governments and civil societies around the world. Often referred to as the “think tanks’ think tank,” TTCSP examines the evolving role and character of public policy research organizations. Over the last 27 years, the TTCSP has developed and led a series of global initiatives that have helped bridge the gap between knowledge and policy in critical policy areas such as international peace and security, globalization and governance, international economics, environmental issues, information and society, poverty alleviation, and healthcare and global health. These international collaborative efforts are designed to establish regional and international networks of policy institutes and communities that improve…
The TTCSP works with leading scholars and practitioners from think tanks and universities in a variety of collaborative efforts and programs, and produces the annual Global Go To Think Tank Index Report…(More)”.
Our model demonstrates high diagnostic accuracy across multiple organ systems and is comparable to experienced pediatricians in diagnosing common childhood diseases. Our study provides a proof of concept for implementing an AI-based system as a means to aid physicians in tackling large amounts of data, augmenting diagnostic evaluations, and providing clinical decision support in cases of diagnostic uncertainty or complexity. Although this impact may be most evident in areas where healthcare providers are in relative shortage, the benefits of such an AI system are likely to be universal….(More)”.