Incentivising open ecological data using blockchain technology


Paper by Robert John Lewis, Kjell-Erik Marstein & John-Arvid Grytnes: “Mindsets that treat data as proprietary are common, especially where data production is resource intensive. Fears of competing research, together with fears of losing exclusivity to hard-earned data, are pervasive. This is for good reason, given that current reward structures in academia focus overwhelmingly on journal prestige and high publication counts, not on accredited publication of open datasets. Then there is the reluctance of researchers to cede control to centralised repositories, citing a lack of trust and transparency in the way complex data are used and interpreted.

To begin to resolve these cultural and sociological constraints on open data sharing, we as a community must recognise that top-down pressure from policy alone is unlikely to improve the state of ecological data availability and accessibility. Open data policy is almost ubiquitous (e.g. the Joint Data Archiving Policy (JDAP), http://datadryad.org/pages/jdap), and while cyber-infrastructures are becoming increasingly extensive, most have coevolved with sub-disciplines utilising high-velocity, born-digital data (e.g. remote sensing, automated sensor networks and citizen science). Consequently, they do not always offer technological solutions that ease data collation, standardisation, management and analytics, nor provide a good cultural fit for research communities working in the long tail of ecological science, i.e. science conducted by many individual researchers and teams over limited spatial and temporal scales. Given that the majority of scientific funding is spent on this type of dispersed research, there is a surprisingly large disconnect between the vast majority of ecological science and the cyber-infrastructures meant to support open data mandates, offering a possible explanation for why primary ecological data are reportedly difficult to find…(More)”.

Scaling deep through transformative learning in public sector innovation labs – experiences from Vancouver and Auckland


Article by Lindsay Cole & Penny Hagen: “…explores scaling deep through transformative learning in Public Sector Innovation Labs (PSI labs) as a pathway to increase the impacts of their work. Using literature review and participatory action research with two PSI labs in Vancouver and Auckland, we provide descriptions of how they enact transformative learning and scaling deep. A shared ambition for transformative innovation towards social and ecological wellbeing sparked independent moves towards scaling deep and transformative learning which, when compared, offer fruitful insights to researchers and practitioners. The article includes a PSI lab typology and six moves to practice transformative learning and scaling deep…(More)”.

Wartime Digital Resilience


Article by Gulsanna Mamediieva: “Prior to Russia’s invasion of Ukraine, technology was already a growing part of the Ukrainian economy and was central to the government’s vision to reimagine the way citizens and businesses interact with the state in the digital era: paperless, cashless, and without bureaucracy. Even before the conflict, we in government believed that technology holds the promise of making government more transparent, efficient, and accountable, empowering citizens, increasing participation, and combating corruption.

However, technology has become even more central to helping the country defend itself and mitigate the effects of Russian attacks on civilians. As a result, Ukraine has emerged as a leading example of digital innovation and resilience in the face of challenges, particularly through its gov-tech solutions: using digital governance capacities to maintain basic governance functions in crisis situations and making a strong case for digital public innovation to support its people. Digital government plays a central role in Ukraine’s ability to continue to fight for its very existence and respond to the aggressor…(More)”.

Protests


Paper by Davide Cantoni, Andrew Kao, David Y. Yang & Noam Yuchtman: “Citizens have long taken to the streets to demand change, expressing political views that may otherwise be suppressed. Protests have produced change at local, national, and international scales, including spectacular moments of political and social transformation. We document five new empirical patterns describing 1.2 million protest events across 218 countries between 1980 and 2020. First, autocracies and weak democracies experienced a trend break in protests during the Arab Spring. Second, protest movements also rose in importance following the Arab Spring. Third, protest movements geographically diffuse over time, spiking to their peak, before falling off. Fourth, a country’s year-to-year economic performance is not strongly correlated with protests; individual values are predictive of protest participation. Fifth, the US, China, and Russia are the most over-represented countries by their share of academic studies. We discuss each pattern’s connections to the existing literature and anticipate paths for future work…(More)”.

Designing Research For Impact


Blog by Duncan Green: “The vast majority of proposals seem to conflate impact with research dissemination (a heroic leap of faith – changing the world one seminar at a time), or to outsource impact to partners such as NGOs and thinktanks.

Of the two, the latter looks more promising, but then the funder should ask to see both evidence of genuine buy-in from the partners, and appropriate budget for the work. Bringing in a couple of NGOs as ‘bid candy’ with little money attached is unlikely to produce much impact.

There is plenty written on how to genuinely design research for impact, e.g. this chapter from a number of Oxfam colleagues on its experience, or How to Engage Policy Makers with your Research (an excellent book I reviewed recently, including on the LSE Review of Books). In brief, proposals should:

  • Identify the kind(s) of impact being sought: policy change, attitudinal shifts (among the public or decision makers), implementation of existing laws and policies, etc.
  • Provide a stakeholder mapping of the positions of key players around those impacts – supporters, waverers and opponents.
  • Explain how the research plans to target some or all of these individuals/groups, including during the research process itself (not just ‘who do we send the papers to once they’re published?’).
  • Specify which messengers/intermediaries will be recruited to convey the research to the relevant targets (researchers themselves are not always best placed to persuade them).
  • Identify potential ‘critical junctures’, such as crises or changes of political leadership, that could open windows of opportunity for uptake, and explain how the research team is set up to spot and respond to them.
  • Anticipate attacks/backlash against research on sensitive issues and explain how the researchers plan to respond.
  • Include plans for review and adaptation of the influencing strategy.

I am not arguing for proposals to indicate specific impact outcomes – most systems are way too complex for that. But an intentional plan built around the questions above would probably improve researchers’ chances of impact.

Based on the conversations I’ve been having, I also have some thoughts on what is blocking progress.

Impact is still too often seen as an annoying hoop to jump through at the funding stage (and then largely forgotten, at least until reporting at the end of the project). The incentives for impact are largely personal/moral (‘I want to make a difference’), whereas the weight of professional incentives lies in accumulating academic publications and earning the approval of peers (hence the focus on seminars).

The timeline of advocacy, with its focus on ‘dancing with the system’, jumping on unexpected windows of opportunity etc, does not mesh with the relentless but slow pressure to write and publish. An academic is likely to pay a price if they drop their current research plans to rehash prior work to take advantage of a brief policy ‘window of opportunity’.

There is still some residual snobbery, at least in some disciplines. You still hear terms like ‘media don’, which is not meant as a compliment. My friend Ha-Joon Chang, for instance, is now an economics professor at SOAS – but what on earth was Cambridge University thinking in not making a global public intellectual and brilliant mind a professor while he was there?

True, there is also a more justified concern that designing research for impact can damage its objectivity/credibility – hence the desire to pull in NGOs and thinktanks as intermediaries. But this conversation still feels messy and unresolved, at least in the UK…(More)”.

Advancing Environmental Justice with AI


Article by Justina Nixon-Saintil: “Given its capacity to develop innovative climate solutions, the technology sector could provide the tools we need to understand, mitigate, and even reverse the damaging effects of global warming. In fact, addressing longstanding environmental injustices requires these companies to put the newest and most effective technologies into the hands of those on the front lines of the climate crisis.

Tools that harness the power of artificial intelligence, in particular, could offer unprecedented access to accurate information and prediction, enabling communities to learn from and adapt to climate challenges in real time. The IBM Sustainability Accelerator, which we launched in 2022, is at the forefront of this effort, supporting the development and scaling of projects such as the Deltares Aquality App, an AI-powered tool that helps farmers assess and improve water quality. As a result, farmers can grow crops more sustainably, prevent runoff pollution, and protect biodiversity.

Consider also the challenges that smallholder farmers face, such as rising costs, the difficulty of competing with larger producers that have better tools and technology, and, of course, the devastating effects of climate change on biodiversity and weather patterns. Accurate information, especially about soil conditions and water availability, can help them address these issues, but has historically been hard to obtain…(More)”.

Experts: 90% of Online Content Will Be AI-Generated by 2026


Article by Maggie Harrison: “Don’t believe everything you see on the Internet” has been pretty standard advice for quite some time now. And according to a new report from European law enforcement group Europol, we have all the reason in the world to step up that vigilance.

“Experts estimate that as much as 90 percent of online content may be synthetically generated by 2026,” the report warned, adding that synthetic media “refers to media generated or manipulated using artificial intelligence.”

“In most cases, synthetic media is generated for gaming, to improve services or to improve the quality of life,” the report continued, “but the increase in synthetic media and improved technology has given rise to disinformation possibilities.”…

The report focused pretty heavily on disinformation, notably that driven by deepfake technology. But that 90 percent figure raises other questions, too — what do AI systems like DALL-E and GPT-3 mean for artists, writers, and other content-generating creators? And circling back to disinformation once more, what will the dissemination of information, not to mention the consumption of it, actually look like in an era driven by that degree of AI-generated digital stuff?…(More)”.

Unlocking the value of supply chain data across industries


MIT Technology Review Insights: “The product shortages and supply-chain delays of the global covid-19 pandemic are still fresh memories. Consumers and industry are concerned that the next geopolitical or climate event may have a similar impact. Against a backdrop of evolving regulations, these conditions mean manufacturers want to be prepared for short supplies, concerned customers, and weakened margins.

For supply chain professionals, achieving a “phygital” information flow—the blending of physical and digital data—is key to unlocking resilience and efficiency. As physical objects travel through supply chains, they generate a rich flow of data about the item and its journey—from its raw materials and manufacturing conditions to its expiration date—bringing new visibility and pinpointing bottlenecks.
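
As a rough sketch of what one record in such a flow might look like, the Python example below models a single traceability event, loosely echoing the what/when/where/why event model of GS1’s EPCIS standard; the field names and values are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TraceEvent:
    """One step in an item's journey: what / when / where / why."""
    item_id: str          # what: unique item identifier, e.g. GTIN + serial
    event_time: datetime  # when: timestamp of the observation
    location: str         # where: site or facility identifier
    step: str             # why: business step, e.g. "manufacturing", "shipping"
    attributes: dict = field(default_factory=dict)  # e.g. batch, expiration date

# A hypothetical two-step journey for one serialized item
journey = [
    TraceEvent("09506000134352.1001", datetime(2023, 3, 1, tzinfo=timezone.utc),
               "plant-DE-01", "manufacturing", {"batch": "LOT42", "expiry": "2025-03-01"}),
    TraceEvent("09506000134352.1001", datetime(2023, 3, 5, tzinfo=timezone.utc),
               "dc-NL-07", "shipping"),
]

# Gaps between consecutive event_time values are one simple way to surface
# bottlenecks once events are aggregated across trading partners.
```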

This phygital information flow offers significant advantages, from enhancing the ability to create rich customer experiences to satisfying environmental, social, and corporate governance (ESG) goals. In a 2022 EY global survey of executives, 70% of respondents agreed that a sustainable supply chain will increase their company’s revenue.

For disparate parties to exchange product information effectively, they require a common framework and universally understood language. Among supply chain players, data standards create a shared foundation. Standards help uniquely identify, accurately capture, and automatically share critical information about products, locations, and assets across trading communities…(More)”.
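
One concrete instance of such a standard is GS1’s check-digit scheme for GTIN product identifiers, which lets any trading partner validate an identifier locally before trusting it. The article names no specific standard, so treat this Python sketch as an illustration rather than anything it prescribes.

```python
def gtin_check_digit(payload: str) -> int:
    """Compute the GS1 check digit for a GTIN payload (the digits before
    the check digit). Digits are weighted 3,1,3,1,... starting from the
    rightmost payload digit; the check digit tops the sum up to the next
    multiple of 10."""
    if not payload.isdigit():
        raise ValueError("GTIN payload must be numeric")
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(payload)))
    return (10 - total % 10) % 10

# A known-valid GTIN-13, 4006381333931 = payload 400638133393 + check digit 1
assert gtin_check_digit("400638133393") == 1
```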

Digital Empires: The Global Battle to Regulate Technology


Book by Anu Bradford: “The global battle among the three dominant digital powers—the United States, China, and the European Union—is intensifying. All three regimes are racing to regulate tech companies, with each advancing a competing vision for the digital economy while attempting to expand its sphere of influence in the digital world. In Digital Empires, her provocative follow-up to The Brussels Effect, Anu Bradford explores a rivalry that will shape the world in the decades to come.

Across the globe, people dependent on digital technologies have become increasingly alarmed that their rapid adoption and transformation have ushered in an exceedingly concentrated economy where a few powerful companies control vast economic wealth and political power, undermine data privacy, and widen the gap between economic winners and losers. In response, world leaders are variously embracing the idea of reining in the most dominant tech companies. Bradford examines three competing regulatory approaches—the American market-driven model, the Chinese state-driven model, and the European rights-driven regulatory model—and discusses how governments and tech companies navigate the inevitable conflicts that arise when these regulatory approaches collide in the international domain. Which digital empire will prevail in the contest for global influence remains an open question, yet their contrasting strategies are increasingly clear.

Digital societies are at an inflection point. In the midst of these unfolding regulatory battles, governments, tech companies, and digital citizens are making important choices that will shape the future ethos of the digital society. Digital Empires lays bare the choices we face as societies and individuals, explains the forces that shape those choices, and illuminates the immense stakes involved for everyone who uses digital technologies…(More)”.

AI and new standards promise to make scientific data more useful by making it reusable and accessible


Article by Bradley Wade Bishop: “…AI makes it highly desirable for any data to be machine-actionable – that is, usable by machines without human intervention. Now, scholars can consider machines not only as tools but also as potential autonomous data reusers and collaborators.

The key to machine-actionable data is metadata. Metadata are the descriptions scientists attach to their data and may include elements such as creator, date, coverage and subject. Minimal metadata are minimally useful, but correct and complete standardized metadata make data more useful for both people and machines.
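
As a minimal sketch, the record below shows how the elements just named (creator, date, coverage, subject) can be expressed in machine-readable form; the field names follow Dublin Core, one widely used metadata standard, and the dataset, DOI, and values are hypothetical.

```python
import json

# A hypothetical dataset description using Dublin Core-style elements
record = {
    "title": "Stream temperature observations, Clearwater Basin",
    "creator": "Example Research Group",
    "date": "2024-06-30",
    "coverage": "Clearwater Basin, 2019-2024",
    "subject": ["hydrology", "water temperature"],
    "identifier": "doi:10.xxxx/example",  # placeholder, not a real DOI
    "format": "text/csv",
    "license": "CC-BY-4.0",
}

# Serialized as JSON, the record can be harvested, indexed and interpreted
# by software with no human in the loop -- the sense in which data become
# "machine-actionable".
print(json.dumps(record, indent=2))
```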

It takes a cadre of research data managers and librarians to make machine-actionable data a reality. These information professionals work to facilitate communication between scientists and systems by ensuring the quality, completeness and consistency of shared data.

The FAIR data principles, created by a group of researchers called FORCE11 in 2016 and used across the world, provide guidance on how to enable data reuse by machines and humans. FAIR data is findable, accessible, interoperable and reusable – meaning it has robust and complete metadata.

In the past, I’ve studied how scientists discover and reuse data. I found that scientists tend to use mental shortcuts when they’re looking for data – for example, they may go back to familiar and trusted sources or search for certain key terms they’ve used before. Ideally, my team could build this expert decision-making process into AI while removing as many biases as possible. Automating these mental shortcuts should reduce the time-consuming chore of locating the right data…(More)”.