Tech for disabled people is booming around the world. So where’s the funding?


Article by Devi Lockwood: “Erick Ponce works in a government communications department in northern Ecuador. The 26-year-old happens to be deaf — a disability he has had since childhood. Communicating fluidly with his non-signing colleagues at work, and in public spaces like the supermarket, has been a lifelong challenge. 

In 2017, Ponce became one of the first users of an experimental app called SpeakLiz, developed by an Ecuadorian startup called Talov. It transforms written text to sound, transcribes spoken words, and can alert a deaf or hard-of-hearing person to sounds such as an ambulance siren, a motorcycle, music, or a crying baby. 

Once he began using SpeakLiz, Ponce’s coworkers — and his family — were able to understand him more easily. “You cannot imagine what it feels like to speak with your son after 20 years,” his father told the app’s engineers. Now a part of the Talov team, Ponce demos new products to make them better before they hit the market. 

The startup has launched two subscription apps on iOS and Android: SpeakLiz, in 2017, for the hearing impaired, and Vision, in 2019, for the visually impaired. Talov’s founders, Hugo Jácome and Carlos Obando, have been working on the apps for over five years. 

SpeakLiz and Vision are, by many measures, successful. Their software is used by more than 7,000 people in 81 countries and is available in 35 languages. The founders won an award from MIT Technology Review and a contest organized by the History Channel. Talov was named among the top 100 most innovative startups in Latin America in 2019. 

But the startup is still struggling. Venture capitalists aren’t knocking on its door. Jácome and Obando sold some of their possessions to raise enough money to launch, and the team has next to no funding to continue expanding.

Although the last few years have seen significant advances in technology and innovation for disabled people, critics say the market is undervalued….(More)”.

Lobbying in the 21st Century: Transparency, Integrity and Access


OECD Report: “Lobbying, as a way to influence and inform governments, has been part of democracy for at least two centuries, and remains a legitimate tool for influencing public policies. However, it carries risks of undue influence. Lobbying in the 21st century has also become increasingly complex, including new tools for influencing government, such as social media, and a wide range of actors, such as NGOs, think tanks and foreign governments. This report takes stock of the progress that countries have made in implementing the OECD Principles for Transparency and Integrity in Lobbying. It reflects on new challenges and risks related to the many ways special interest groups attempt to influence public policies, and reviews tools adopted by governments to effectively safeguard impartiality and fairness in the public decision-making process….(More)”.

Do You See What I See? Capabilities and Limits of Automated Multimedia Content Analysis



Report by Dhanaraj Thakur and Emma Llansó: “The ever-increasing amount of user-generated content online has led, in recent years, to an expansion in research and investment in automated content analysis tools. Scrutiny of automated content analysis has accelerated during the COVID-19 pandemic, as social networking services have placed a greater reliance on these tools due to concerns about health risks to their moderation staff from in-person work. At the same time, there are important policy debates around the world about how to improve content moderation while protecting free expression and privacy. In order to advance these debates, we need to understand the potential role of automated content analysis tools.

This paper explains the capabilities and limitations of tools for analyzing online multimedia content and highlights the potential risks of using these tools at scale without accounting for their limitations. It focuses on two main categories of tools: matching models and computer prediction models. Matching models include cryptographic and perceptual hashing, which compare user-generated content with existing and known content. Predictive models (including computer vision and computer audition) are machine learning techniques that aim to identify characteristics of new or previously unknown content….(More)”.
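The distinction the report draws between matching models can be illustrated with a minimal sketch. A cryptographic hash changes completely under any alteration to the content, while a perceptual hash is designed to stay stable under small changes. The toy "average hash" below is an illustrative assumption for this digest, not the production tools the report surveys:

```python
import hashlib

def average_hash(pixels):
    # Toy perceptual hash: 1 bit per pixel, set if above mean brightness.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200, 30], [220, 40, 250], [15, 180, 25]]
tweaked = [row[:] for row in img]
tweaked[0][0] = 12  # a tiny edit, e.g. re-encoding noise

# Cryptographic hashes diverge completely on any change...
h1 = hashlib.sha256(bytes(p for row in img for p in row)).hexdigest()
h2 = hashlib.sha256(bytes(p for row in tweaked for p in row)).hexdigest()
assert h1 != h2

# ...while the perceptual hash is unchanged by the small perturbation.
assert hamming(average_hash(img), average_hash(tweaked)) == 0
```

This is why, as the report notes, perceptual hashing is used to match user uploads against known content even after minor edits, whereas cryptographic hashing only catches exact copies.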

The Tragedy of Climate Change


Essay by Bryan Doerries: “How terrible it is to know when, in the end, knowing gains you nothing,” laments the blind prophet Tiresias in Sophocles’ Oedipus the King. Oedipus had summoned him to reveal the source of the pestilence and ecological disaster ravaging Thebes. But Tiresias knew that the king would reject the truth. Today’s climate scientists and epidemiologists can relate.

Like Tiresias, modern-day scientists know where the planet is headed and why. They found out not through prophecies, but through countless double-blind experiments, randomized trials, and rigorous peer review. Their evidence is unimpeachable, and the consensus among them is overwhelming. But their secular augury cannot seem to overcome the willful indifference of politicians or the public. Knowing gains them nothing, because so few are listening.

If there is a way for scientists to get through to people and their leaders, the key will be to change not what they say, but how they say it. The language of science is dispassionate by design. By contrast, the manifold crises our planet faces are urgent and intense, and the individual and collective decisions that are fueling those crises have high emotional and ethical stakes. A virulent pandemic has taken the lives of three million people. The Earth is in the throes of a sixth mass extinction. And the problems are set to escalate.

We need a language to convey the gravity and complexity of the global tragedy that is unfolding, and the ancient Greeks supply it. Their tragedies are stories of people learning too late (usually by milliseconds). Their characters doggedly pursue what they believe to be right, barely comprehending the forces they face – chance, fate, habits, governments, gods, the weather. By the time they do, the characters have unwittingly made an irreversible – and devastating – mistake.

For centuries, Greek tragedies have been viewed as pessimistic expressions of a fatalistic society, which depict the futility of fighting destiny. But, for the Greeks, the effect of these stories may have been counterintuitive. By showing people just how narrow and fleeting their power to determine their own future was, the tragedies discouraged apathy. Highlighting how devastating self-delusion can be encouraged awareness. And providing the language for describing difficult experiences enhanced agency….(More)”

Open data for improved land governance


Guide by the Land Portal: “This Open Up Guide on Land Governance is a resource for governments in developing countries to collect and release land-related data, improving data quality, availability, accessibility and use for better citizen engagement, decision making and innovation. It sets out:

  1. Key datasets for land management accountability, and how they should be collected, stored, shared and published for improving land governance and transparency;
  2. Good data policies and frameworks, including metadata, standards and governance frameworks if available;
  3. Existing gaps or challenges in the policies and frameworks; and
  4. Use cases from real-life examples to illustrate the potential impact and transformation this type of data can provide in local contexts.

The Open Up Guide has been prepared for use by national and local government agencies with a mandate for or an interest in making their land governance data open and available for others to re-use. Land governance data generally comprises the data and information that agencies collect as they carry out their core land administration functions of land tenure, use, development and value. Some countries already collect and manage their land governance data in open and re-usable formats. Others may be seeking advice on how to start, how to expand their activities or how to test what they do against best practice.

Open land governance data, published in accordance with a government’s law and regulations, provides efficient and transparent government services and enables individuals, communities and businesses to run their lives ethically and with integrity.

The Guide is also intended to assist communities monitoring whether environmental protections are being upheld, and to support rights claims over geographical areas inhabited for generations; and for civil society organisations that can make use of land governance data to understand patterns of land deals, support environmental and social advocacy, and investigate and address corruption….(More)”.

The value of data matching for public poverty initiatives: a local voucher program example


Paper by Sarah Giest, Jose M. Miotto and Wessel Kraaij: “The recent surge of data-driven methods in social policy has created new opportunities to assess existing poverty programs. The expectation is that the combination of advanced methods and more data can calculate the effectiveness of public interventions more accurately and tailor local initiatives accordingly. Specifically, nonmonetary indicators are increasingly being measured at micro levels in order to target social exclusion in combination with poverty. However, the multidimensional character of poverty, local context, and data matching pose challenges to data-driven analyses. By linking Dutch household-level data with policy-initiative-specific data at local level, we present an explorative study on the uptake of a local poverty pass. The goal is to unravel pass usage in terms of household income and location as well as the age of users. We find that income and age play a role in whether the pass is used, and usage differs per neighborhood. With this, the paper feeds into the discourse on how to operationalize and design data matching work in the multidimensional space of poverty and nonmonetary government initiatives….(More)”.
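The kind of data matching the paper describes can be sketched in miniature: household records and pass-usage logs are linked on a shared pseudonymous identifier, then aggregated by neighborhood. The records and field names below are synthetic assumptions for illustration, not the authors' actual Dutch data:

```python
from collections import defaultdict

# Hypothetical household-level records and pass-usage logs,
# linked on a shared pseudonymous household id.
households = {
    "h1": {"income": 14_000, "neighborhood": "A"},
    "h2": {"income": 21_000, "neighborhood": "A"},
    "h3": {"income": 13_500, "neighborhood": "B"},
    "h4": {"income": 30_000, "neighborhood": "B"},
    "h5": {"income": 19_000, "neighborhood": "B"},
}
pass_usage = [{"household": "h1"}, {"household": "h3"}, {"household": "h3"}]

used = {u["household"] for u in pass_usage}

# Usage rate per neighborhood after the match.
by_hood = defaultdict(lambda: [0, 0])  # [users, households]
for hid, rec in households.items():
    by_hood[rec["neighborhood"]][1] += 1
    by_hood[rec["neighborhood"]][0] += hid in used

rates = {hood: users / total for hood, (users, total) in by_hood.items()}
# e.g. neighborhood A: 1 of 2 households used the pass; B: 1 of 3
```

In practice such linkage raises exactly the challenges the paper flags: identifiers rarely match cleanly, and household composition and local context complicate any single usage rate.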

How COVID broke the evidence pipeline


Article by Helen Pearson: “It wasn’t long into the pandemic before Simon Carley realized we had an evidence problem. It was early 2020, and COVID-19 infections were starting to lap at the shores of the United Kingdom, where Carley is an emergency-medicine doctor at hospitals in Manchester. Carley is also a specialist in evidence-based medicine — the transformative idea that physicians should decide how to treat people by referring to rigorous evidence, such as clinical trials.

As cases of COVID-19 climbed in February, Carley thought that clinicians were suddenly abandoning evidence and reaching for drugs just because they sounded biologically plausible. Early studies Carley saw being published often lacked control groups or enrolled too few people to draw firm conclusions. “We were starting to treat patients with these drugs initially just on what seemed like a good idea,” he says. He understood the desire to do whatever is possible for someone gravely ill, but he also knew how dangerous it is to assume a drug works when so many promising treatments prove to be ineffective — or even harmful — in trials. “The COVID-19 pandemic has arguably been one of the greatest challenges to evidence-based medicine since the term was coined in the last century,” Carley and his colleagues wrote of the problems they were seeing [1].

Other medical experts echo these concerns. With the pandemic now deep into its second year, it’s clear the crisis has exposed major weaknesses in the production and use of research-based evidence — failures that have inevitably cost lives. Researchers have registered more than 2,900 clinical trials related to COVID-19, but the majority are too small or poorly designed to be of much use (see ‘Small samples’). Organizations worldwide have scrambled to synthesize the available evidence on drugs, masks and other key issues, but can’t keep up with the outpouring of new research, and often repeat others’ work. There’s been “research waste at an unprecedented scale”, says Huseyin Naci, who studies health policy at the London School of Economics….(More)”.

To Map Billions of Cicadas, It Takes Thousands of Citizen Scientists


Article by Linda Poon and Marie Patino: “At the end of May, Dan Mozgai will spend his vacation from his day job chasing cicadas. The bugs won’t be hard to find; in about a week, billions of the beady-eyed crawlers from Brood X will start coming up after 17 years underground, blanketing parts of 15 states in the Northeast, Mid-Atlantic and Midwest with their cacophony of shrill mating calls. 

Mozgai isn’t an entomologist — he does online marketing for DirecTV. But since 2007, he’s worked closely with academic researchers to track various broods of periodical cicadas, as part of one of the oldest citizen science efforts in the U.S. 

He’ll be joined by tens of thousands of other volunteers across the Brood X territory who will use the mobile app Cicada Safari, where users can add geotagged photos and videos onto a live map, as dozens of student researchers behind the scenes verify each submission. Videos will be especially helpful this year, as they provide audio data for the researchers, says Gene Kritsky, an entomologist at Mount St. Joseph University in Cincinnati, and the creator behind Cicada Safari. He’s been testing the new app with smaller broods for two years in anticipation of this moment.

Brood X is one of the largest, and most broadly distributed geographically, of the periodical cicada broods, which emerge every 13 or 17 years. They’ll stick around for just a few weeks, through June, to mate and lay eggs.

“With the smartphone technology and the GPS location services, it was just a perfect way to do citizen science,” Kritsky says. Some 87,000 people have signed up as of the beginning of May, and they’ve already documented several early risers, especially around Cincinnati and Washington, D.C. — two of the expected hotspots…(More)”.

The Filing Cabinet


Essay by Craig Robertson: “The filing cabinet was critical to the information infrastructure of the 20th century. Like most infrastructure, it was usually overlooked….The subject of this essay emerged by chance. I was researching the history of the U.S. passport, and had spent weeks at the National Archives, struggling through thousands of reels of unindexed microfilm records of 19th-century diplomatic correspondence; then I arrived at the records for 1906. That year, the State Department adopted a numerical filing system. Suddenly, every American diplomatic office began using the same number for passport correspondence, with decimal numbers subdividing issues and cases. Rather than scrolling through microfilm images of bound pages organized chronologically, I could go straight to passport-relevant information that had been gathered in one place.

I soon discovered that I had Elihu Root to thank for making my research easier. A lawyer whose clients included Andrew Carnegie, Root became secretary of state in 1905. But not long after he arrived, the prominent corporate lawyer described himself as “a man trying to conduct the business of a large metropolitan law-firm in the office of a village squire.”  The department’s record-keeping practices contributed to his frustration. As was then common in American offices, clerks used press books or copybooks to store incoming and outgoing correspondence in chronologically ordered bound volumes with limited indexing. For Root, the breaking point came when a request for a handful of letters resulted in several bulky volumes appearing on his desk. His response was swift: he demanded that a vertical filing system be adopted; soon the department was using a numerical subject-based filing system housed in filing cabinets. 

The shift from bound volumes to filing systems is a milestone in the history of classification; the contemporaneous shift to vertical filing cabinets is a milestone in the history of storage….(More)”.

Side-Stepping Safeguards, Data Journalists Are Doing Science Now


Article by Irineo Cabreros: “News stories are increasingly told through data. Witness the Covid-19 time series that decorate the homepages of every major news outlet; the red and blue heat maps of polling predictions that dominate the runup to elections; the splashy, interactive plots that dance across the screen.

As a statistician who handles data for a living, I welcome this change. News now speaks my favorite language, and the general public is developing a healthy appetite for data, too.

But many major news outlets are no longer just visualizing data, they are analyzing it in ever more sophisticated ways. For example, at the height of the second wave of Covid-19 cases in the United States, The New York Times ran a piece declaring that surging case numbers were not attributable to increased testing rates, despite President Trump’s claims to the contrary. The thrust of The Times’ argument was summarized by a series of plots that showed the actual rise in Covid-19 cases far outpacing what would be expected from increased testing alone. These weren’t simple visualizations; they involved assumptions and mathematical computations, and they provided the cornerstone for the article’s conclusion. The plots themselves weren’t sourced from an academic study (although the author on the byline of the piece is a computer science Ph.D. student); they were produced through “an analysis by The New York Times.”
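The logic behind that kind of analysis can be sketched simply: hold the first week's test-positivity rate fixed, compute how many cases expanded testing alone would find, and compare against the actual counts. The weekly figures below are made up for illustration; they are not The Times's data or method:

```python
# Illustrative weekly figures (synthetic): tests doubled,
# but confirmed cases grew far faster.
tests = [100_000, 150_000, 200_000]
cases = [5_000, 12_000, 24_000]

positivity = [c / t for c, t in zip(cases, tests)]

# Expected cases if only testing had expanded
# (i.e. holding week-1 positivity constant).
expected = [t * positivity[0] for t in tests]
excess = [c - e for c, e in zip(cases, expected)]

# Rising positivity means the surge outpaces the growth in testing.
assert positivity[-1] > positivity[0]
assert all(e >= 0 for e in excess)
```

Even this toy version involves a modeling assumption (constant positivity under expanded testing), which is precisely the kind of analytical choice that distinguishes such journalism from straightforward visualization.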

The Times article was by no means an anomaly. News outlets have asserted, on the basis of in-house data analyses, that Covid-19 has killed nearly half a million more people than official records report; that Black and minority populations are overrepresented in the Covid-19 death toll; and that social distancing will usually outperform attempted quarantine. That last item, produced by The Washington Post and buoyed by in-house computer simulations, was the most read article in the history of the publication’s website, according to Washington Post media reporter Paul Farhi.

In my mind, a fine line has been crossed. Gone are the days when science journalism was like sports journalism, where the action was watched from the press box and simply conveyed. News outlets have stepped onto the field. They are doing the science themselves….(More)”.