Blog by Basil Mahfouz: “Scientists worldwide published over 2.6 million papers in 2022 – almost 5 papers per minute and more than double what they published in the year 2000. Are policy makers making the most of the wealth of available scientific knowledge? In this blog, we describe how we are applying data science methods to the bibliometric database of Elsevier’s International Centre for the Study of Research (ICSR) to analyse how scholarly research is being used by policy makers. More specifically, we will discuss how we are applying natural language processing and network dynamics to identify where there is policy action and also strong evidence; where there is policy interest but a lack of evidence; and where potential policies and strategies are not making full use of available knowledge or tools…(More)”.
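The mapping the blog describes — crossing evidence strength against policy uptake — can be sketched as a simple quadrant classification. The topic names, counts, and thresholds below are invented for illustration; the real analysis would derive topics and citation counts from NLP models over the ICSR bibliometric database.

```python
# Toy sketch: bucket research topics by evidence base vs. policy uptake.
# All topics and counts are hypothetical, not from the ICSR data.

def classify_topics(topics, paper_threshold=100, citation_threshold=10):
    """topics: iterable of (name, n_papers, n_policy_citations).
    Returns the four quadrants the blog describes."""
    buckets = {
        "evidence-backed policy action": [],   # strong evidence, strong uptake
        "policy interest, thin evidence": [],  # uptake outpaces the literature
        "underused evidence": [],              # literature outpaces uptake
        "low activity": [],
    }
    for name, n_papers, n_policy_citations in topics:
        strong_evidence = n_papers >= paper_threshold
        policy_interest = n_policy_citations >= citation_threshold
        if strong_evidence and policy_interest:
            buckets["evidence-backed policy action"].append(name)
        elif policy_interest:
            buckets["policy interest, thin evidence"].append(name)
        elif strong_evidence:
            buckets["underused evidence"].append(name)
        else:
            buckets["low activity"].append(name)
    return buckets

example = [
    ("flood adaptation", 450, 32),  # (topic, papers, policy citations)
    ("microplastics", 80, 25),
    ("soil carbon", 300, 3),
]
print(classify_topics(example))
```

The interesting cells are the off-diagonal ones: topics with many policy citations but few papers flag evidence gaps, while well-studied topics with little policy uptake flag underused knowledge.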
Data Is Everybody’s Business
Book by Barbara H. Wixom, Cynthia M. Beath and Leslie Owens: “Most organizations view data monetization—converting data into money—too narrowly: as merely selling data sets. But data monetization is a core business activity for both commercial and noncommercial organizations, and, within organizations, it’s critical to have wide-ranging support for this pursuit. In Data Is Everybody’s Business, the authors offer a clear and engaging way for people across the entire organization to understand data monetization and make it happen. The authors identify three viable ways to convert data into money—improving work with data, wrapping products with data, and selling information offerings—and explain when to pursue each and how to succeed…(More)”.
Guess who’s getting the world’s first self-sovereign national digital ID?
Article by Durga M Sengupta: “Bhutan — a small Himalayan nation with less than 800,000 people — has decided to roll out a national digital identity system for all its citizens. “National digital ID is the platform on which digitization and online services of banks to hospitals to taxation to universities, everything can come online with 100% assurance,” Ujjwal Deep Dahal, CEO of Druk Holding and Investments, the commercial and investment arm of the government which developed the system, told me over a video call from the capital city of Thimphu.
The national ID system has been built using blockchain technology, which will provide each individual a “self-sovereign” identity, meaning it can only be controlled by the citizen and no other entity, similar to how cryptocurrencies work.
The country’s 7-year-old crown prince, Jigme Namgyel Wangchuck, was the first to enroll in the new system, and it is expected to reach the rest of the population within the year, Dahal said.
“Once I’m onboarded, the interesting part about self-sovereign identity is that only I have my verified credentials in my wallet, in my phone. Nobody has access to it thereon but me, not even the government,” he said. The onboarding process takes about 5 seconds, Dahal estimated. “In our system, you will not visit any booth to register yourself. You’ll just download an app; share your details, selfie, and national ID card; and in the back end, the AI algorithm will run and say, ‘Okay, I can give you a verified credential,’” he said. This timeline would differ for people who don’t have smartphones or require assistance.
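The "only I hold my verified credentials" property rests on the issuer signing a credential once, after which any relying party can verify it offline without querying a central database. A minimal sketch of that flow follows; real self-sovereign ID systems use public-key signatures anchored to a ledger, and here an HMAC with an issuer key stands in so the example stays dependency-free. Function names and keys are illustrative, not Bhutan's actual system.

```python
# Toy sketch of issue-then-verify for a verifiable credential.
# HMAC stands in for a real public-key signature scheme.
import hashlib
import hmac
import json

ISSUER_KEY = b"national-id-issuer-demo-key"  # stand-in for the issuer's private key

def issue_credential(claims: dict) -> dict:
    """Issuer signs the claims; the citizen stores the result in their wallet."""
    payload = json.dumps(claims, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": signature}

def verify_credential(credential: dict) -> bool:
    """A relying party re-derives the signature from the presented claims;
    no lookup against a central registry is needed."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

cred = issue_credential({"name": "Tashi", "citizen": True})
print(verify_credential(cred))        # valid as issued
cred["claims"]["citizen"] = False     # any tampering breaks the signature
print(verify_credential(cred))
```

Because verification only needs the credential and the issuer's public material, the wallet on the citizen's phone can remain the sole store of the credential itself.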
Druk Holding and Investments has been instrumental in setting up various other parallel projects, including the recently announced Bhutanverse — a metaverse that displays Bhutanese art, architecture, and motifs…(More)”. See also: Field Report: On the Emergent Use of Distributed Ledger Technologies for Identity Management
It’s Official: Cars Are the Worst Product Category We Have Ever Reviewed for Privacy
Article by the Mozilla Foundation: “Car makers have been bragging about their cars being “computers on wheels” for years to promote their advanced features. However, the conversation about what driving a computer means for its occupants’ privacy hasn’t really caught up. While we worried that our doorbells and watches that connect to the internet might be spying on us, car brands quietly entered the data business by turning their vehicles into powerful data-gobbling machines. Machines that, because of all those brag-worthy bells and whistles, have an unmatched power to watch, listen, and collect information about what you do and where you go in your car.
All 25 car brands we researched earned our *Privacy Not Included warning label — making cars the official worst category of products for privacy that we have ever reviewed…(More)”.
Incentivising open ecological data using blockchain technology
Paper by Robert John Lewis, Kjell-Erik Marstein & John-Arvid Grytnes: “Mindsets concerning data as proprietary are common, especially where data production is resource intensive. Fears of competing research in concert with loss of exclusivity to hard-earned data are pervasive. This is for good reason, given that current reward structures in academia focus overwhelmingly on journal prestige and high publication counts, not accredited publication of open datasets. Then there is the reluctance of researchers to cede control to centralised repositories, citing concerns about a lack of trust and transparency in the way complex data are used and interpreted.
To begin to resolve these cultural and sociological constraints to open data sharing, we as a community must recognise that top-down pressure from policy alone is unlikely to improve the state of ecological data availability and accessibility. Open data policy is almost ubiquitous (e.g. the Joint Data Archiving Policy (JDAP), http://datadryad.org/pages/jdap), and while cyber-infrastructures are becoming increasingly extensive, most have coevolved with sub-disciplines utilising high-velocity, born-digital data (e.g. remote sensing, automated sensor networks and citizen science). Consequently, they do not always offer technological solutions that ease data collation, standardisation, management and analytics, nor provide a good fit culturally to research communities working among the long tail of ecological science, i.e. science conducted by many individual researchers/teams over limited spatial and temporal scales. Given that the majority of scientific funding is spent on this type of dispersed research, there is a surprisingly large disconnect between the vast majority of ecological science and the cyber-infrastructures to support open data mandates, offering a possible explanation for why primary ecological data are reportedly difficult to find…(More)”.
The Importance of Purchase to Plate Data
Blog by Andrea Carlson and Thea Palmer Zimmerman: “…Because there can be economic and social barriers to maintaining a healthy diet, USDA promotes Food and Nutrition Security so that everyone has consistent and equitable access to healthy, safe, and affordable foods that promote optimal health and well-being. A set of data tools called the Purchase to Plate Suite (PPS) supports these goals by enabling the update of the Thrifty Food Plan (TFP), which estimates how much a budget-conscious family of four needs to spend on groceries to ensure a healthy diet. The TFP market basket – consisting of the specific amounts of various food categories required by the plan – forms the basis of the maximum allotment for the Supplemental Nutrition Assistance Program (SNAP, formerly known as the “Food Stamps” program), which provided financial support towards the cost of groceries for over 41 million individuals in almost 22 million households in fiscal year 2022.
The 2018 Farm Act (Agriculture Improvement Act of 2018) requires that USDA reevaluate the TFP every five years using current food composition, consumption patterns, dietary guidance, and food prices, and using approved scientific methods. USDA’s Economic Research Service (ERS) was charged with estimating the current food prices using retail food scanner data (Levin et al. 2018; Muth et al. 2016) and utilized the PPS for this task. The most recent TFP update was released in August 2021 and the revised cost of the market basket was the first non-inflation adjustment increase in benefits for SNAP in over 40 years (US Department of Agriculture 2021).
The PPS combines datasets to enhance research related to the economics of food and nutrition. There are four primary components of the suite:
- Purchase to Plate Crosswalk (PPC),
- Purchase to Plate Price Tool (PPPT),
- Purchase to Plate National Average Prices (PP-NAP) for the National Health and Nutrition Examination Survey (NHANES), and
- Purchase to Plate Ingredient Tool (PPIT)…(More)”
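At its core, the TFP market basket described above is a costing exercise: required amounts per food category multiplied by current national average prices. The sketch below illustrates that arithmetic; the category names, weekly amounts, and prices are invented, whereas the real plan uses PP-NAP prices derived from retail scanner data.

```python
# Toy market-basket costing in the spirit of the Thrifty Food Plan.
# All amounts and prices below are hypothetical.

def basket_cost(basket, prices):
    """Weekly cost = sum over categories of (pounds required x price per pound)."""
    return round(sum(pounds * prices[cat] for cat, pounds in basket.items()), 2)

weekly_basket = {            # pounds per week for a reference family
    "dark green vegetables": 3.0,
    "whole grains": 6.5,
    "legumes": 2.0,
}
national_avg_prices = {      # dollars per pound
    "dark green vegetables": 2.40,
    "whole grains": 1.10,
    "legumes": 1.60,
}
print(basket_cost(weekly_basket, national_avg_prices))  # 17.55
```

Re-running the same calculation with updated prices and consumption patterns every five years is, in simplified form, what the 2018 Farm Act requires of the TFP reevaluation.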
Private sector access to public sector personal data: exploring data value and benefit sharing
Literature review for the Scottish Government: “The aim of this review is to enable the Scottish Government to explore the issues relevant to the access of public sector personal data (as defined by the European Union General Data Protection Regulation, GDPR) with or by the private sector in publicly trusted ways, to unlock the public benefit of this data. This literature review will specifically enable the Scottish Government to establish whether there are
(I) models/approaches of costs/benefits/data value/benefit-sharing, and
(II) intellectual property rights or royalties schemes regarding the use of public sector personal data with or by the private sector both in the UK and internationally.
In conducting this literature review, we used an adapted systematic review, and undertook thematic analysis of the included literature to answer several questions central to the aim of this research. Such questions included:
- Are there any models of costs and/or benefits regarding the use of public sector personal data with or by the private sector?
- Are there any models of valuing data regarding the use of public sector personal data with or by the private sector?
- Are there any models for benefit-sharing in respect of the use of public sector personal data with or by the private sector?
- Are there any models in respect of the use of intellectual property rights or royalties regarding the use of public sector personal data with or by the private sector?…(More)”.
Integrating AI into Urban Planning Workflows: Democracy Over Authoritarianism
Essay by Tyler Hinkle: “As AI tools become integrated into urban planning, a dual narrative of promise and potential pitfalls emerges. These tools offer unprecedented efficiency, creativity, and data analysis, yet if not guided by ethical considerations, they could inadvertently lead to exclusion, manipulation, and surveillance.
While AI, exemplified by tools like NovelAI, holds the potential to aggregate and synthesize public input, there’s a risk of suppressing genuine human voices in favor of algorithmic consensus. This could create a future urban landscape devoid of cultural depth and diversity, echoing historical authoritarianism.
In a potential dystopian scenario, an AI-based planning software gains access to all smart city devices, amassing data to reshape communities without consulting their residents. This data-driven transformation, devoid of human input, risks eroding the essence of community identity, autonomy, and shared decision-making. Imagine AI altering traffic flow, adjusting public transportation routes, or even redesigning public spaces based solely on data patterns, disregarding the unique needs and desires of the people who call that community home.
However, an optimistic approach guided by ethical principles can pave the way for a brighter future. Integrating AI with democratic ideals, akin to Fishkin’s deliberative democracy, can amplify citizens’ voices rather than replacing them. AI-driven deliberation can become a powerful vehicle for community engagement, transforming Arnstein’s ladder of citizen participation into a true instrument of empowerment. In addition, echoing the calls for AI alignment to be addressed holistically, alignment issues will arise as AI becomes integrated into urban planning. We must take the time to ensure AI is properly aligned so it is a tool that helps communities rather than harms them.
By treading carefully and embedding ethical considerations at the core, we can unleash AI’s potential to construct communities that are efficient, diverse, and resilient, while ensuring that democratic values remain paramount…(More)”.
What can harnessing ‘positive deviance’ methods do for food security?
Article by Katrina J. Lane: “What the researchers identified in Niger, in this case, is known as “positive deviance”. It’s a concept that originated in 1991 during a nutrition program in Vietnam run by Save the Children. Instead of focusing on the population level, project managers studied outliers in the system — children who were healthier than their peers despite sharing similar circumstances, and then looked at what the parents of these children did differently.
Once the beneficial practices were identified — in this case, that included collecting wild foods, such as crab, shrimp, and sweet potato tops for their children — they encouraged mothers to tell other parents. Through this outlier-centric approach, the project was able to reduce malnourishment by 74%.
“The positive deviance approach assumes that in every community there are individuals or groups that develop uncommon behaviors or practices which help them cope better with the challenges they face than their peers,” said Boy.
It’s important to be respectful and acknowledge success stories already present in systems, added Duncan Green, a strategic adviser for Oxfam and a professor in practice in international development at the London School of Economics.
Positive deviance emphasizes the benefit of identifying and amplifying these “deviant behaviors”, as they hold the potential to generate scalable solutions that can benefit the entire community.
It can be broken down into three steps: First, identify high-performing individuals or groups within a challenging context. Next, investigate within the community, via in-person interviews, group discussions, and questionnaires, to find out what their behaviors and practices are. Finally, encourage these solutions to spread throughout the community.
In the final stage, the approach relies on community-generated solutions — which Green explains are more likely to propagate and be engaged with…(More)”.
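The first of the three steps — finding outliers among peers who share similar circumstances — can be sketched as a simple statistical screen. The households, scores, and one-standard-deviation threshold below are illustrative; real positive-deviance studies would use validated nutrition outcomes and more careful matching of circumstances.

```python
# Sketch of positive-deviance identification: flag households whose outcome
# is well above the average of peers in the same village. Data are invented.
from statistics import mean, stdev

def positive_deviants(records, threshold=1.0):
    """records: (household, village, outcome score). Flags households scoring
    more than `threshold` standard deviations above their village mean."""
    by_village = {}
    for household, village, score in records:
        by_village.setdefault(village, []).append((household, score))
    deviants = []
    for members in by_village.values():
        scores = [s for _, s in members]
        if len(scores) < 2:
            continue  # a household needs peers to deviate from
        mu, sigma = mean(scores), stdev(scores)
        deviants += [h for h, s in members if sigma and (s - mu) / sigma > threshold]
    return deviants

records = [
    ("hh1", "A", 52), ("hh2", "A", 50), ("hh3", "A", 49), ("hh4", "A", 78),
    ("hh5", "B", 60), ("hh6", "B", 61),
]
print(positive_deviants(records))  # hh4 stands out among its village A peers
```

The flagged households are then the starting point for the interviews and group discussions in step two, which ask what they do differently.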
Advancing Environmental Justice with AI
Article by Justina Nixon-Saintil: “Given its capacity to innovate climate solutions, the technology sector could provide the tools we need to understand, mitigate, and even reverse the damaging effects of global warming. In fact, addressing longstanding environmental injustices requires these companies to put the newest and most effective technologies into the hands of those on the front lines of the climate crisis.
Tools that harness the power of artificial intelligence, in particular, could offer unprecedented access to accurate information and prediction, enabling communities to learn from and adapt to climate challenges in real time. The IBM Sustainability Accelerator, which we launched in 2022, is at the forefront of this effort, supporting the development and scaling of projects such as the Deltares Aquality App, an AI-powered tool that helps farmers assess and improve water quality. As a result, farmers can grow crops more sustainably, prevent runoff pollution, and protect biodiversity.
Consider also the challenges that smallholder farmers face, such as rising costs, the difficulty of competing with larger producers that have better tools and technology, and, of course, the devastating effects of climate change on biodiversity and weather patterns. Accurate information, especially about soil conditions and water availability, can help them address these issues, but has historically been hard to obtain…(More)”.