Stefaan Verhulst
Paper by Alejandro Noriega-Campero, Alex Rutherford, Oren Lederman, Yves A. de Montjoye, and Alex Pentland: “Today’s age of data holds high potential to enhance the way we pursue and monitor progress in the fields of development and humanitarian action. We study the relation between data utility and privacy risk in large-scale behavioral data, focusing on mobile phone metadata as a paradigmatic domain. To measure utility, we survey experts about the value of mobile phone metadata at various spatial and temporal granularity levels. To measure privacy, we propose a formal and intuitive measure of reidentification risk—the information ratio—and compute it at each granularity level. Our results confirm the existence of a stark tradeoff between data utility and reidentifiability, where the most valuable datasets are also most prone to reidentification. When data is specified at ZIP-code and hourly levels, outside knowledge of only 7% of a person’s data suffices for reidentification and retrieval of the remaining 93%. In contrast, in the least valuable dataset, specified at municipality and daily levels, reidentification requires on average outside knowledge of 51%, or 31 data points, of a person’s data to retrieve the remaining 49%. Overall, our findings show that coarsening data directly erodes its value, and highlight the need for using data-coarsening, not as a stand-alone mechanism, but in combination with data-sharing models that provide adjustable degrees of accountability and security….(More)”.
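The information ratio itself is defined formally in the paper, but the underlying reidentification logic can be sketched with a toy unicity-style test: how many of a target's (location, time) points must an adversary know before the target is the only matching user? The sketch below is an illustrative assumption of that logic, not the authors' code; the function name, toy dataset, and parameters are all made up.

```python
# A minimal unicity-style reidentification test in the spirit of the paper's
# "information ratio" (illustrative assumption, not the authors' code).
# Each user is a set of (location, time-bucket) points at some coarsening
# level; we ask how many randomly drawn points from a target user an
# adversary must know before the target matches no one else.
import random

def points_needed_to_reidentify(users, target_id, max_points=50, trials=100):
    """Average number of a target's points needed as outside knowledge
    before the target is the only user consistent with that knowledge."""
    target = users[target_id]
    others = [pts for uid, pts in users.items() if uid != target_id]
    needed = []
    for _ in range(trials):
        sample = random.sample(sorted(target), min(max_points, len(target)))
        known = set()
        for k, point in enumerate(sample, start=1):
            known.add(point)
            # Unique once no other user's trace contains all known points.
            if not any(known <= other for other in others):
                needed.append(k)
                break
    return sum(needed) / len(needed) if needed else float("inf")

# Toy dataset: user -> set of (zip_code, hour) observations.
users = {
    "u1": {("10027", 8), ("10027", 9), ("10001", 18)},
    "u2": {("10027", 8), ("10002", 12), ("10001", 18)},
    "u3": {("11201", 7), ("11201", 20), ("10001", 18)},
}
k = points_needed_to_reidentify(users, "u1")
print(f"~{k:.1f} points of outside knowledge reidentify u1; "
      f"information ratio ≈ {k / len(users['u1']):.0%}")
```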
The Guardian: “Fifteen leading economists, including three Nobel winners, argue that the many billions of dollars spent on aid can do little to alleviate poverty while we fail to tackle its root causes….Donors increasingly want to see more impact for their money, practitioners are searching for ways to make their projects more effective, and politicians want more financial accountability behind aid budgets. One popular option has been to audit projects for results. The argument is that assessing “aid effectiveness” – a buzzword now ubiquitous in the UK’s Department for International Development – will help decide what to focus on.
Some go so far as to insist that development interventions should be subjected to the same kind of randomised control trials used in medicine, with “treatment” groups assessed against control groups. Such trials are being rolled out to evaluate the impact of a wide variety of projects – everything from water purification tablets to microcredit schemes, financial literacy classes to teachers’ performance bonuses.
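As context for the critique that follows, here is a minimal sketch of the difference-in-means logic such trials rest on: randomise units into treatment and control, then compare average outcomes. The data, effect size, and noise level below are purely illustrative assumptions.

```python
# Illustrative sketch of the randomised-trial logic described above:
# randomly assign units to treatment or control, then compare mean outcomes.
# Synthetic data; the +3 effect and noise level are made-up assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
treated = rng.random(n) < 0.5                      # random assignment
baseline = rng.normal(50, 10, n)                   # outcome absent the intervention
outcome = baseline + np.where(treated, 3.0, 0.0)   # assumed treatment effect

ate = outcome[treated].mean() - outcome[~treated].mean()
t, p = stats.ttest_ind(outcome[treated], outcome[~treated])
print(f"estimated average treatment effect: {ate:.2f} (p = {p:.3f})")
```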
Economist Esther Duflo at MIT’s Poverty Action Lab recently argued in Le Monde that France should adopt clinical trials as a guiding principle for its aid budget, which has grown significantly under the Macron administration.
But truly random sampling with blinded subjects is almost impossible in human communities without creating scenarios so abstract as to tell us little about the real world. Trials are also expensive to carry out and fraught with ethical challenges – especially when it comes to health-related interventions. (Who gets the treatment and who doesn’t?)
But the real problem with the “aid effectiveness” craze is that it narrows our focus down to micro-interventions at a local level that yield results that can be observed in the short term. At first glance this approach might seem reasonable and even beguiling. But it tends to ignore the broader macroeconomic, political and institutional drivers of impoverishment and underdevelopment. Aid projects might yield satisfying micro-results, but they generally do little to change the systems that produce the problems in the first place. What we need instead is to tackle the real root causes of poverty, inequality and climate change….(More)”.
Book: This book analyzes e-participation in smart cities. In recent decades, information and communication technologies (ICT) have played a key role in the democratic political and governance process by allowing easier interaction between governments and citizens and by increasing citizens’ ability to participate in the production chain of public services. E-participation plays an important role in the development of smart cities and smart communities, but it has not yet been extensively studied. This book fills that gap by combining empirical and theoretical research to analyze actual practices of citizen involvement in smart cities and build a solid framework for successful e-participation in smart cities.
The book is divided into three parts. Part I discusses smart technologies and their role in improving e-participation in smart cities. Part II deals with models of e-participation in smart cities and the organizational issues affecting the implementation of e-participation; these chapters analyze the efficiency of governance models in relation to the establishment of smart cities. Part III proposes incentives to motivate increased participation by governments and citizenry within the smart cities context. Written by an international panel of experts and practitioners, this book will be a convenient source of information on e-participation in smart cities and will be valuable to academics, researchers, policy-makers, public managers, citizens, international organizations and anyone who has a stake in enhancing citizen engagement in smart cities….(More)”.
Keith Kahn-Harris at The Guardian: “…Denialism is an expansion, an intensification, of denial. At root, denial and denialism are simply a subset of the many ways humans have developed to use language to deceive others and themselves. Denial can be as simple as refusing to accept that someone else is speaking truthfully. Denial can be as unfathomable as the multiple ways we avoid acknowledging our weaknesses and secret desires.
Denialism is more than just another manifestation of the humdrum intricacies of our deceptions and self-deceptions. It represents the transformation of the everyday practice of denial into a whole new way of seeing the world and – most important – a collective accomplishment. Denial is furtive and routine; denialism is combative and extraordinary. Denial hides from the truth, denialism builds a new and better truth.
In recent years, the term has been used to describe a number of fields of “scholarship”, whose scholars engage in audacious projects to hold back, against seemingly insurmountable odds, the findings of an avalanche of research. They argue that the Holocaust (and other genocides) never happened, that anthropogenic (human-caused) climate change is a myth, that Aids either does not exist or is unrelated to HIV, that evolution is a scientific impossibility, and that all manner of other scientific and historical orthodoxies must be rejected.
In some ways, denialism is a terrible term. No one calls themselves a “denialist”, and no one signs up to all forms of denialism. In fact, denialism is founded on the assertion that it is not denialism. In the wake of Freud (or at least the vulgarisation of Freud), no one wants to be accused of being “in denial”, and labelling people denialists seems to compound the insult by implying that they have taken the private sickness of denial and turned it into public dogma.
But denial and denialism are closely linked; what humans do on a large scale is rooted in what we do on a small scale. While everyday denial can be harmful, it is also just a mundane way for humans to respond to the incredibly difficult challenge of living in a social world in which people lie, make mistakes and have desires that cannot be openly acknowledged. Denialism is rooted in human tendencies that are neither freakish nor pathological.
All that said, there is no doubt that denialism is dangerous. In some cases, we can point to concrete examples of denialism causing actual harm. In South Africa, President Thabo Mbeki, in office between 1999 and 2008, was influenced by Aids denialists such as Peter Duesberg, who deny the link between HIV and Aids (or even HIV’s existence) and cast doubt on the effectiveness of anti-retroviral drugs. Mbeki’s reluctance to implement national treatment programmes using anti-retrovirals has been estimated to have cost the lives of 330,000 people. On a smaller scale, in early 2017 the Somali-American community in Minnesota was struck by a childhood measles outbreak, a direct result of proponents of the discredited theory that the MMR vaccine causes autism persuading parents not to vaccinate their children….(More)”.
Eleni Manis at RealClearHealth: “Americans are willing to share personal data — even sensitive medical data — to advance the common good. A recent Stanford University study found that 93 percent of medical trial participants in the United States are willing to share their medical data with university scientists and 82 percent are willing to share with scientists at for-profit companies. In contrast, less than a third are concerned that their data might be stolen or used for marketing purposes.
However, the majority of regulations surrounding medical data focus on individuals’ ability to restrict the use of their medical data, with scant attention paid to supporting the ability to share personal data for the common good. Policymakers can begin to redress this imbalance by establishing a national medical data donor registry that lets individuals contribute their medical data to support research after their deaths. Doing so would help medical researchers pursue cures and improve health care outcomes for all Americans.
Increased medical data sharing facilitates advances in medical science in three key ways. First, de-identified participant-level data can be used to understand the results of trials, enabling researchers to better explicate the relationship between treatments and outcomes. Second, researchers can use shared data to verify studies and identify cases of data fraud and research misconduct in the medical community. For example, one researcher recently discovered a prolific Japanese anesthesiologist had falsified data for almost two decades. Third, shared data can be combined and supplemented to support new studies and discoveries.
Despite these benefits, researchers, research funders, and regulators have struggled to establish a norm for sharing clinical research data. In some cases, regulatory obstacles are to blame. HIPAA — the federal law regulating medical data — blocks some sharing on grounds of patient privacy, while federal and state regulations governing data sharing are inconsistent. Researchers themselves have a proprietary interest in data they produce, while academic researchers seeking to maximize publications may guard data jealously.
Though funding bodies are aware of this tension, they are unable to resolve it on their own. The National Institutes of Health, for example, requires a data sharing plan for big-ticket funding but recognizes that proprietary interests may make sharing impossible….(More)”.
Data science techniques allow us to use methods like network science and machine learning to uncover patterns and insights that are hard for humans to see. Just as we can map influential users on Twitter, or patterns of relations between places to predict how diseases will spread, we can identify which countries have influenced each other in the past and how legal provisions relate to one another.
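By way of illustration, this kind of analysis can be sketched with networkx on a toy dataset: link constitutions by the provisions they share, then use centrality as a rough proxy for influence. The country names and provision lists below are hypothetical placeholders, not the study's data.

```python
# Toy sketch of mapping relations between constitutions via shared
# provisions, in the spirit of the network analysis described above.
# Hypothetical placeholder data, not the study's dataset.
import networkx as nx

provisions = {
    "CountryA": {"free_education", "child_labor_ban", "healthcare"},
    "CountryB": {"free_education", "child_labor_ban"},
    "CountryC": {"healthcare", "free_education"},
}

# Link two constitutions by their provision overlap (Jaccard similarity).
G = nx.Graph()
countries = list(provisions)
for i, a in enumerate(countries):
    for b in countries[i + 1:]:
        shared = provisions[a] & provisions[b]
        union = provisions[a] | provisions[b]
        if shared:
            G.add_edge(a, b, weight=len(shared) / len(union))

# Weighted degree as a rough proxy for constitutional influence.
centrality = {node: deg for node, deg in G.degree(weight="weight")}
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
```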
One way UNICEF fulfills its mission is through advocacy with national governments to enshrine rights for minorities, notably children, formally in law. Perhaps the most renowned example of this is the Convention on the Rights of the Child (CRC).
Constitutions, such as Mexico’s 1917 constitution — the first to limit the employment of children — are critical to formalizing rights for vulnerable populations. National constitutions describe the role of a country’s institutions, its character in the eyes of the world, as well as the rights of its citizens.
From a scientific standpoint, the work is an important first step in showing that network analysis and machine learning techniques can be used to better understand the dynamics of caring for and protecting the rights of children – critical to the work we do in a complex and interconnected world. It shows the significant and positive policy implications of using data science to uphold children’s rights.
What the Research Shows: Through this research, we uncovered:
- A network of relationships between countries and their constitutions.
- A natural progression of laws — where fundamental rights are a necessary precursor to more specific rights for minorities.
- The effect of key historical events in changing legal norms….(More)”.
Book by Suzanne Mettler: “Americans’ relationship to the federal government is paradoxical. Polls show that public opinion regarding the government has plummeted to all-time lows, with only one in five saying they trust the government or believe that it operates in their interest. Yet, at the same time, more Americans than ever benefit from some form of government social provision. Political scientist Suzanne Mettler calls this growing gulf between people’s perceptions of government and the actual role it plays in their lives the “government-citizen disconnect.” In The Government-Citizen Disconnect, she explores the rise of this phenomenon and its implications for policymaking and politics.
Drawing from original survey data that probed Americans’ experiences of 21 federal social policies – such as food stamps, Social Security, Medicaid, and the home mortgage interest deduction – Mettler shows that 96 percent of adults have received benefits from at least one of them, and that the average person has utilized five. Overall usage rates transcend social, economic, and political divisions, and most Americans report positive experiences with these policies. However, the fact that they have benefited from these policies has little positive effect on people’s attitudes toward government. Mettler finds that shared identities and group affiliations are more powerful and consistent influences. In particular, those who oppose welfare tend to extrapolate their unfavorable views of it to government in general. Deep antipathy toward the government has emerged as a conservative movement has waged war on social welfare policies for over forty years, even as economic inequality and benefit use have increased.
Mettler finds that patterns of political participation exacerbate the government-citizen disconnect, as those holding positive views of federal programs and supporting expanded benefits have lower rates of involvement than those holding more hostile views of the government. As a result, the loudest political voice belongs to those who have benefited from policies but who give government little credit for their economic well-being, seeing their success more as a matter of their own deservingness. This contributes to the election of politicians who advocate cutting federal social programs. According to Mettler, the government-citizen disconnect frays the bonds of representative government and democracy.
The Government-Citizen Disconnect illuminates a paradox that increasingly shapes American politics. Mettler’s examination of hostility toward government at a time when most Americans will at some point rely on the social benefits it provides helps us better understand the roots of today’s fractious political climate….(More)”
Cordis: “Estimating poverty is crucial for improving policymaking and advancing the sustainability of a society. Traditional poverty estimation methods such as household surveys and census data, however, incur huge costs, creating a need for more efficient approaches.
With this in mind, the EU-funded USES project examined how satellite images could be used to estimate household-level poverty in rural regions of developing countries. “This promises to be a radically more cost-effective way of monitoring and evaluating the Sustainable Development Goals,” says Dr Gary Watmough, USES collaborator and Interdisciplinary Lecturer in Land Use and Socioecological Systems at the University of Edinburgh, United Kingdom.
Land use and land cover reveal poverty clues
To achieve its aims, the project investigated how land use and land cover information from satellite data could be linked with household survey data. “We looked particularly at how households use the landscape in the local area for agriculture and other purposes such as collecting firewood and using open areas for grazing cattle,” explains Dr Watmough.
The work also involved examining satellite images to determine which types of land use were related to household wealth or poverty using statistical analysis. “By trying to predict household poverty using the land use data we could see which land use variables were most related to the household wealth in the area,” adds Dr Watmough.
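That workflow, predicting a household wealth measure from land-use variables and reading off which variables matter most, can be sketched as follows. The feature names, data, and model choice are assumptions for illustration, not the USES project's actual pipeline.

```python
# Sketch of predicting a household wealth index from land-use variables and
# inspecting which variables matter most. Synthetic data and assumed
# relationships; not the USES project's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([
    rng.normal(120, 40, n),     # building_size_m2
    rng.uniform(0, 1, n),       # bare_ground_fraction
    rng.integers(60, 200, n),   # growing_days
])
# Assumed relationship for illustration: larger buildings, less bare ground,
# and longer growing seasons go with a higher wealth index.
wealth = 0.02 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.5, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, wealth)
for name, imp in zip(["building_size_m2", "bare_ground_fraction", "growing_days"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```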
Overall, the USES project found that satellite data could predict poverty, particularly for the poorest households in an area. Dr Watmough comments: “This is quite remarkable given that we are trying to predict complicated household-level poverty from a simple land use map derived from high-resolution satellite data.”
A study conducted by USES in Kenya found that the most important remotely sensed variable was building size within the homestead. Buildings smaller than 140 m² were mostly associated with poorer households, whereas those over 140 m² tended to belong to wealthier ones. The amount of bare ground in agricultural fields and within the homestead region was also important. “We also found that poorer households were associated with a shorter number of agricultural growing days,” says Dr Watmough….(More)”.
Jonathan Cornelissen at Harvard Business Review: “Want to catch tax cheats? The government of Rwanda does — and it’s finding them by studying anomalies in revenue-collection data.
Want to understand how American culture is changing? So does a budding sociologist in Indiana. He’s using data science to find patterns in the massive amounts of text people use each day to express their worldviews — patterns that no individual reader would be able to recognize.
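To make the Rwanda example concrete, here is a minimal sketch of one common approach to flagging anomalous filings: an isolation forest over declared revenues. The numbers are synthetic and the article does not describe the actual system; this is an assumed illustration only.

```python
# Sketch of the kind of anomaly detection the Rwanda example alludes to:
# flag tax filings whose declared revenue looks unusual relative to peers.
# Synthetic numbers; the article does not describe the actual system.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
declared = rng.normal(100_000, 15_000, 200)      # typical filings
declared[:3] = [5_000, 4_000, 410_000]           # planted outliers

model = IsolationForest(contamination=0.02, random_state=0)
flags = model.fit_predict(declared.reshape(-1, 1))  # -1 marks anomalies
print("filings flagged for audit:", declared[flags == -1].round())
```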
Intelligent people find new uses for data science every day. Still, despite the explosion of interest in the data collected by just about every sector of American business — from financial companies and health care firms to management consultancies and the government — many organizations continue to relegate data-science knowledge to a small number of employees.
That’s a mistake — and in the long run, it’s unsustainable. Think of it this way: Very few companies expect only professional writers to know how to write. So why ask only professional data scientists to understand and analyze data, at least at a basic level?
Relegating all data knowledge to a handful of people within a company is problematic on many levels. Data scientists find it frustrating because it’s hard for them to communicate their findings to colleagues who lack basic data literacy. Business stakeholders are unhappy because data requests take too long to fulfill and often fail to answer the original questions. In some cases, that’s because the questioner failed to explain the question properly to the data scientist.
Why would non–data scientists need to learn data science? That’s like asking why non-accountants should be expected to stay within budget.
These days every industry is drenched in data, and the organizations that succeed are those that most quickly make sense of their data in order to adapt to what’s coming. The best way to enable fast discovery and deeper insights is to disperse data science expertise across an organization.
Companies that want to compete in the age of data need to do three things: share data tools, spread data skills, and spread data responsibility…(More)”.