Stefaan Verhulst
Book by Alexis Wichowski: “…considers the unchecked rise of tech giants like Facebook, Google, Amazon, Apple, Microsoft, and Tesla—what she calls ‘net states’—and their unavoidable influence in our lives. Rivaling nation states in power and capital, today’s net states are reaching into our physical world, inserting digital services into our lived environments in ways both unseen and, at times, unknown to us. They are transforming the way the world works, putting our rights up for grabs, from personal privacy to national security.
Combining original reporting and insights drawn from more than 100 interviews with technology and government insiders, including Microsoft president Brad Smith, former Google CEO Eric Schmidt, the former Federal Trade Commission chair under President Obama, and the managing director of Jigsaw—Google’s unit for countering extremism and cyber-attacks—The Information Trade explores what happens when we give up our personal freedom and individual autonomy in exchange for an easy, plugged-in existence, and shows what we can do to control our relationship with net states before they irreversibly change our future….(More)”.
Paper by Barteld Braaksma and Kees Zeelenberg: “In this paper, we describe and discuss opportunities for big data in official statistics. Big data come in high volume, high velocity and high variety. Their high volume may lead to better accuracy and more details, their high velocity may lead to more frequent and more timely statistical estimates, and their high variety may give opportunities for statistics in new areas. But there are also many challenges: there are uncontrolled changes in sources that threaten continuity and comparability, and data that refer only indirectly to phenomena of statistical interest.
Furthermore, big data may be highly volatile and selective: the coverage of the population to which they refer may change from day to day, leading to inexplicable jumps in time series. And very often, the individual observations in these big data sets lack variables that would allow them to be linked to other datasets or population frames. This severely limits the possibilities for correcting selectivity and volatility. Also, with the advance of big data and open data, there is much more scope for disclosure of individual data, and this poses new problems for statistical institutes. So, big data may be regarded as so-called nonprobability samples. The use of such sources in official statistics requires other approaches than the traditional one based on surveys and censuses.
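To make the volatility problem concrete, here is a minimal sketch (our illustration, not code from the paper) of how an abrupt shift in the coverage of a nonprobability source produces a jump in the observed series even though the underlying phenomenon is stable:

```python
import numpy as np

rng = np.random.default_rng(42)

# True daily signal: a stable, slowly rising quantity of interest.
days = np.arange(365)
true_level = 100 + 0.05 * days

# Coverage of the nonprobability source shifts abruptly on day 180,
# e.g. because a platform gains or loses a large group of users.
coverage = np.where(days < 180, 0.60, 0.35)

# The observed series confounds signal and coverage: a large "jump"
# appears although nothing changed in the underlying phenomenon.
observed = true_level * coverage + rng.normal(0.0, 0.5, size=days.size)

print("around the break:", observed[177:183].round(1))
```

The break at day 180 comes entirely from the coverage change, which is exactly the kind of jump that cannot be explained, or corrected, without auxiliary information about who is in the source.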
A first approach is to accept the big data just for what they are: an imperfect, yet very timely, indicator of developments in society. In a sense, this is what national statistical institutes (NSIs) often do: we collect data that respondents have already assembled, and the very reason they were assembled (indeed, the mere fact that they exist) is also what makes them interesting for society, and thus for an NSI to collect. In short, we might argue: these data exist, and that is why they are interesting.
A second approach is to use formal models and extract information from these data. In recent years, many new methods for dealing with big data have been developed by mathematical and applied statisticians. New methods like machine-learning techniques can be considered alongside more traditional methods like Bayesian techniques. National statistical institutes have always been reluctant to use models, apart from specific cases like small-area estimates. Based on experience at Statistics Netherlands, we argue that NSIs should not be afraid to use models, provided that their use is documented and made transparent to users. On the other hand, in official statistics, models should not be used for all kinds of purposes….(More)”.
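For a flavor of the model-based methods the authors have in mind, the sketch below implements a simple composite (“shrinkage”) estimator of the kind used in small-area estimation; all figures are invented for illustration, and the code is our gloss rather than anything from the paper:

```python
import numpy as np

# Direct survey estimates for four small areas, with their sampling
# variances (small samples -> large variance). All numbers invented.
direct = np.array([52.0, 47.5, 60.2, 55.1])
sampling_var = np.array([25.0, 4.0, 16.0, 1.0])

# Synthetic estimates from an area-level model (e.g. a regression on
# register variables), with an assumed model-error variance.
synthetic = np.array([50.0, 49.0, 57.0, 54.0])
model_var = 9.0

# Composite estimator: shrink each direct estimate toward the model
# prediction, the more so the noisier the direct estimate is.
gamma = model_var / (model_var + sampling_var)
composite = gamma * direct + (1.0 - gamma) * synthetic

print(composite.round(2))  # noisy areas move toward the model prediction
```

The transparency the authors call for corresponds here to documenting the assumed model variance and the resulting weights, so users can see how far each published figure leans on the model.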
Inês Prates at apolitical: “…Evidence should feed into policymaking; there is no doubt about that. However, the truth is that using evidence in policy is often a very complex process and the stumbling blocks along the way are numerous.
The world has never had a larger wealth of data and information, and that is a great opportunity to open up public debate and democratise access to knowledge. At the same time, however, we are currently living in a “post-truth” era, where personal beliefs can trump scientific knowledge.
Technology and digital platforms have given populists room to question well-established facts and evidence, and to dangerously spread misinformation, while accusing scientists and policymakers of elitism for their own political gain.
Another challenge is that political interests can strategically manipulate or select (“cherry-pick”) evidence that justifies prearranged positions. A stark example of this is the evidence “cherry-picking” done by climate change sceptics who choose restricted time periods (for example of 8 to 12 years) that may not show a global temperature increase.
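The window-picking trick is easy to demonstrate. The sketch below (our illustration, on synthetic data) fits linear trends to a series that warms steadily and then searches for the short window a sceptic would quote:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic temperature anomalies: a steady +0.02 C/yr warming trend
# plus year-to-year noise (illustrative values only).
years = np.arange(1970, 2020)
anomaly = 0.02 * (years - years[0]) + rng.normal(0.0, 0.12, size=years.size)

def trend(y0, y1):
    """Least-squares slope (C/yr) of the anomaly between years y0 and y1."""
    mask = (years >= y0) & (years <= y1)
    return np.polyfit(years[mask], anomaly[mask], 1)[0]

print(f"full record 1970-2019: {trend(1970, 2019):+.4f} C/yr")

# The cherry-picker's move: scan every 10-year window and quote the one
# with the smallest slope, which is typically flat or even negative.
windows = {(y, y + 9): trend(y, y + 9) for y in range(1970, 2011)}
(w0, w1), s = min(windows.items(), key=lambda kv: kv[1])
print(f"cherry-picked {w0}-{w1}: {s:+.4f} C/yr")
```

Because short windows have noisy slope estimates, scanning enough of them almost always turns up one that looks flat or negative, even when every part of the underlying process is warming.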
In addition, to unlock the benefits of evidence-informed policy, we need to bridge the “policy-research gap”. Policymakers are not always aware of the latest evidence on an issue. Very often, critical decisions are made under great pressure, and the very nature of democracy makes policy complex and messy, making it hard to systematically integrate evidence into the process.
At the same time, researchers may be oblivious to what the most pressing policy challenges are, or how to communicate actionable insights to a non-expert audience. This constructive guide provides tips on how scientists can handle the most challenging aspects of engaging with policymakers.
Institutions like the European Commission’s in-house science service, the Joint Research Centre (JRC) sit precisely at the intersection between science and policy. Researchers from the JRC work together with policymakers on several key policy challenges. A nice example is their work on the scarcity of critical raw materials needed for the EU’s energy transition, using a storytelling tool to raise the awareness of non-experts on an extremely complex issue.
Lastly, we cannot forget the importance of buy-in from the public. Although policymakers can willingly ignore or manipulate evidence, they have little incentive to ignore the will of a critical mass. Let us go back to the climate movement: it is hard to dismiss the influence of the youth-led worldwide protests on world leaders and their climate policy efforts.
Using evidence in policymaking is key to solving the world’s most pressing climate and environmental challenges. To do so effectively, we need to connect and establish trust between government, researchers and the public…(More)”.
Future of Privacy Forum: “Today, FPF is publishing a white paper co-authored by CEO Jules Polonetsky and hackylawyER Founder Elizabeth Renieris to help corporate officers, nonprofit leaders, and policymakers better understand privacy risks that will grow in prominence during the 2020s, as well as rising technologies that will be used to help manage privacy through the decade. Leaders must understand the basics of technologies like biometric scanning, collaborative robotics, and spatial computing in order to assess how existing and proposed policies, systems, and laws will address them, and to support appropriate guidance for the implementation of new digital products and services.
The white paper, Privacy 2020: 10 Privacy Risks and 10 Privacy Enhancing Technologies to Watch in the Next Decade, identifies ten technologies that are likely to create increasingly complex data protection challenges. Over the next decade, privacy considerations will be driven by innovations in tech linked to human bodies, health, and social networks; infrastructure; and computing power. The white paper also highlights ten developments that can enhance privacy – providing cause for optimism that organizations will be able to manage data responsibly. Some of these technologies are already in general use, some will soon be widely deployed, and others are nascent….(More)”.
Article by Matthew S. Williams: “Bill Joy, the famed computer engineer who co-founded Sun Microsystems in 1982, once said, “No matter who you are, most of the smartest people work for someone else.” This has come to be known as “Joy’s Law” and is one of the inspirations for concepts such as “crowdsourcing”.
Increasingly, government agencies, research institutions, and private companies are looking to the power of the crowd to find solutions to problems. Challenges are created and prizes offered – that, in basic terms, is an “incentive competition.”
The basic idea of an incentive competition is pretty straightforward. When confronted with a particularly daunting problem, you appeal to the general public to provide possible solutions and offer a reward for the best one. Sounds simple, doesn’t it?
But in fact, this concept flies in the face of conventional problem-solving, which holds that companies should recruit people with the relevant knowledge and expertise and solve all problems in-house. This kind of thinking underlies most of our government and business models, but it has some significant limitations….
Another benefit of crowdsourcing is the way it takes advantage of the exponential growth in human population over the past few centuries. Between 1650 and 1800, the global population doubled, reaching about 1 billion. It took another 127 years (until 1927) before it doubled again to reach 2 billion.
However, it took only forty-seven years for the population to double again and reach 4 billion (1974), and another twenty-five years for it to reach 6 billion (1999). As of 2020, the global population has reached 7.8 billion, and the growth trend is expected to continue for some time.
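A quick back-of-the-envelope check (ours, using the standard milestones cited above) makes the acceleration visible as implied average annual growth rates:

```python
from math import log

# Standard world-population milestones cited above: (year, billions).
milestones = [(1800, 1.0), (1927, 2.0), (1974, 4.0), (1999, 6.0), (2020, 7.8)]

for (y0, p0), (y1, p1) in zip(milestones, milestones[1:]):
    span = y1 - y0
    rate = 100 * log(p1 / p0) / span  # implied average annual growth, in %
    print(f"{y0}->{y1}: {p0:.1f}B -> {p1:.1f}B in {span:3d} yrs (~{rate:.2f}%/yr)")
```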
This growth has paralleled another trend, the rapid development of new ideas in science and technology. Between 1650 and 2020, humanity has experienced multiple technological revolutions, in what is a comparatively very short space of time….(More)”.
Paper by David Jensen, Karen Bakker and Christopher Reimer: “As outlined in our recent article, The promise and peril of a digital ecosystem for the planet, we propose that the ongoing digital revolution needs to be harnessed to drive a transformation towards global sustainability, environmental stewardship, and human well-being. Public, private and civil society actors must take deliberate action and collaborate to build a global digital ecosystem for the planet. Such an ecosystem would mobilize hardware, software and digital infrastructures, together with data analytics, to generate the dynamic, real-time insights needed to power the structural transformations required for collective sustainability.
The digital revolution must also be used to abolish extreme poverty and reduce inequalities that jeopardize social cohesion and stability. Often, these social inequalities are tied to and overlap with ecological challenges. Ultimately, then, we must do nothing less than direct the digital revolution for planet, people, prosperity and peace.
To achieve this goal, we must embed the vision of a fair digital ecosystem for the planet into all of the key multi-stakeholder processes that are currently unfolding. We aim to do this through two new articles on Medium: a companion article on Building a digital ecosystem for the planet: 20 substantive priorities for 2020, and this one. In the companion article, we identify three primary engagement tracks: system architecture, applications, and governance. Within these three tracks, we outline 20 priorities for the new decade. Building from these priorities, our focus for this article is to identify a preliminary list of the top 20 most important multi-stakeholder processes that we must engage and influence in 2020….(More)”.
Sharon Moshavi at Columbia Journalism Review: “News has migrated from print to the web to social platforms to mobile. Now, at the dawn of a new decade, it is heading to a place that presents a whole new set of challenges: the private, hidden spaces of instant messaging apps.
WhatsApp, Facebook Messenger, Telegram, and their ilk are platforms that journalists cannot ignore — even in the US, where chat-app usage is low. “I believe a privacy-focused communications platform will become even more important than today’s open platforms,” Mark Zuckerberg, Facebook’s CEO, wrote in March 2019. By 2022, three billion people will be using them on a regular basis, according to Statista.
But fewer journalists worldwide are using these platforms to disseminate news than they were two years ago, as ICFJ discovered in its 2019 “State of Technology in Global Newsrooms” survey. That’s a particularly dangerous trend during an election year, because messaging apps are potential minefields of misinformation.
American journalists should take stock of recent elections in India and Brazil, ahead of which misinformation flooded WhatsApp. ICFJ’s “TruthBuzz” projects found coordinated and widespread disinformation efforts using text, videos, and photos on that platform.
This is particularly troubling given that more people now use these apps as a primary source of information. In Brazil, one in four internet users consults WhatsApp weekly as a news source. A recent report from New York University’s Center for Business and Human Rights warned that WhatsApp “could become a troubling source of false content in the US, as it has been during elections in Brazil and India.” It’s imperative that news media figure out how to map the contours of these opaque, unruly spaces, and deliver fact-based news to those who congregate there….(More)”.
Report by The Stanley Center for Peace and Security: “Geospatial and open source analysts face decisions in their work that can directly or indirectly cause harm to individuals, organizations, institutions, and society. Though analysts may try to do the right thing, such ethically informed decisions can be complex. This is particularly true for analysts working on issues related to nuclear nonproliferation or international security, whose decisions on whether to publish certain findings could have far-reaching consequences.
The Stanley Center for Peace and Security and the Open Nuclear Network (ONN) program of One Earth Future Foundation convened a workshop to explore these ethical challenges, identify resources, and consider options for enhancing the ethical practices of geospatial and open source analysis communities.
This Readout & Recommendations brings forward observations from that workshop. It describes ethical challenges that stakeholders from relevant communities face. It concludes with a list of needs participants identified, along with possible strategies for promoting sustaining behaviors that could enhance the ethical conduct of the community of nonproliferation analysts working with geospatial and open source data.
Some Key Findings
- A code of ethics could serve important functions for the community, including giving moral guidance to practitioners, enhancing public trust in their work, and deterring unethical behavior. Participants in the workshop saw a significant value in such a code and offered ideas for developing one.
- Awareness of ethical dilemmas and strong ethical reasoning skills are essential for sustaining ethical practices, yet professionals in this field might not have easy access to such training. Several approaches could improve ethics education for the field overall, including starting a body of literature, developing model curricula, and offering training for students and professionals.
- Other stakeholders—governments, commercial providers, funders, organizations, management teams, etc.—should contribute to the discussion on ethics in the community and reinforce sustaining behaviors….(More)”.
Paper by Andrew Doss, Jonas Bedford-Strohm and Leanne Erdberg Steadman: “This paper identifies three structural vacuums in catastrophe governance today that allow the greatest risks humanity faces to be externalized from decision-making. To mitigate the impact of these risks, The Rheomesa (“fluid table”) provides (1) a deliberative decision-making process between currently siloed entities in various sectors managing the outcome of catastrophes, including government, the private sector, NGOs, IGOs, and hybrid entities, with (2) a prospective, long-term accountability and incentive mechanism that (3) comprehensively addresses the three interdependent tasks societies face surrounding catastrophes – prevention, response, and recovery….(More)”.
Book edited by Todd Davies and Seeta Peña Gangadharan: “Can new technology enhance local, national, and global democracy? Online Deliberation is the first book that attempts to sample the full range of work on online deliberation, forging new connections between academic researchers, web designers, and practitioners.
Since the most exciting innovations in deliberation have occurred outside of traditional institutions, and those involved have often worked in relative isolation from each other, research on this growing field has until now lacked a full picture of online participation. This volume, an essential read for those working at the crossroads of computer and social science, illuminates the collaborative world of deliberation by examining diverse clusters of Internet communities….(More)”.