E-Participation in Smart Cities: Technologies and Models of Governance for Citizen Engagement


Book by Manuel Pedro Rodríguez Bolívar and Laura Alcaide Muñoz: “This book analyzes e-participation in smart cities. In recent decades, information and communication technologies (ICT) have played a key role in the democratic political and governance process by allowing easier interaction between governments and citizens and by increasing citizens’ ability to participate in the production chain of public services. E-participation plays an important role in the development of smart cities and smart communities, but it has not yet been extensively studied. This book fills that gap by combining empirical and theoretical research to analyze actual practices of citizen involvement in smart cities and build a solid framework for successful e-participation in smart cities.

The book is divided into three parts. Part I discusses smart technologies and their role in improving e-participation in smart cities. Part II deals with models of e-participation in smart cities and the organizational issues affecting the implementation of e-participation; these chapters analyze the efficiency of governance models in relation to the establishment of smart cities. Part III proposes incentives to motivate increased participation by governments and citizenry within the smart cities context. Written by an international panel of experts and practitioners, this book will be a convenient source of information on e-participation in smart cities and will be valuable to academics, researchers, policy-makers, public managers, citizens, international organizations and anyone who has a stake in enhancing citizen engagement in smart cities….(More)”.

Denialism: what drives people to reject the truth


Keith Kahn-Harris at The Guardian: “…Denialism is an expansion, an intensification, of denial. At root, denial and denialism are simply a subset of the many ways humans have developed to use language to deceive others and themselves. Denial can be as simple as refusing to accept that someone else is speaking truthfully. Denial can be as unfathomable as the multiple ways we avoid acknowledging our weaknesses and secret desires.

Denialism is more than just another manifestation of the humdrum intricacies of our deceptions and self-deceptions. It represents the transformation of the everyday practice of denial into a whole new way of seeing the world and – most important – a collective accomplishment. Denial is furtive and routine; denialism is combative and extraordinary. Denial hides from the truth, denialism builds a new and better truth.

In recent years, the term has been used to describe a number of fields of “scholarship”, whose scholars engage in audacious projects to hold back, against seemingly insurmountable odds, the findings of an avalanche of research. They argue that the Holocaust (and other genocides) never happened, that anthropogenic (human-caused) climate change is a myth, that Aids either does not exist or is unrelated to HIV, that evolution is a scientific impossibility, and that all manner of other scientific and historical orthodoxies must be rejected.

In some ways, denialism is a terrible term. No one calls themselves a “denialist”, and no one signs up to all forms of denialism. In fact, denialism is founded on the assertion that it is not denialism. In the wake of Freud (or at least the vulgarisation of Freud), no one wants to be accused of being “in denial”, and labelling people denialists seems to compound the insult by implying that they have taken the private sickness of denial and turned it into public dogma.

But denial and denialism are closely linked; what humans do on a large scale is rooted in what we do on a small scale. While everyday denial can be harmful, it is also just a mundane way for humans to respond to the incredibly difficult challenge of living in a social world in which people lie, make mistakes and have desires that cannot be openly acknowledged. Denialism is rooted in human tendencies that are neither freakish nor pathological.

All that said, there is no doubt that denialism is dangerous. In some cases, we can point to concrete examples of denialism causing actual harm. In South Africa, President Thabo Mbeki, in office between 1999 and 2008, was influenced by Aids denialists such as Peter Duesberg, who deny the link between HIV and Aids (or even HIV’s existence) and cast doubt on the effectiveness of anti-retroviral drugs. Mbeki’s reluctance to implement national treatment programmes using anti-retrovirals has been estimated to have cost the lives of 330,000 people. On a smaller scale, in early 2017 the Somali-American community in Minnesota was struck by a childhood measles outbreak, a direct result of proponents of the discredited theory that the MMR vaccine causes autism persuading parents not to vaccinate their children….(More)”.

Americans Want to Share Their Medical Data. So Why Can’t They?


Eleni Manis at RealClearHealth: “Americans are willing to share personal data — even sensitive medical data — to advance the common good. A recent Stanford University study found that 93 percent of medical trial participants in the United States are willing to share their medical data with university scientists and 82 percent are willing to share with scientists at for-profit companies. In contrast, less than a third are concerned that their data might be stolen or used for marketing purposes.

However, the majority of regulations surrounding medical data focus on individuals’ ability to restrict the use of their medical data, with scant attention paid to supporting the ability to share personal data for the common good. Policymakers can begin to right this balance by establishing a national medical data donor registry that lets individuals contribute their medical data to support research after their deaths. Doing so would help medical researchers pursue cures and improve health care outcomes for all Americans.

Increased medical data sharing facilitates advances in medical science in three key ways. First, de-identified participant-level data can be used to understand the results of trials, enabling researchers to better explicate the relationship between treatments and outcomes. Second, researchers can use shared data to verify studies and identify cases of data fraud and research misconduct in the medical community. For example, one researcher recently discovered a prolific Japanese anesthesiologist had falsified data for almost two decades. Third, shared data can be combined and supplemented to support new studies and discoveries.

Despite these benefits, researchers, research funders, and regulators have struggled to establish a norm for sharing clinical research data. In some cases, regulatory obstacles are to blame. HIPAA — the federal law regulating medical data — blocks some sharing on grounds of patient privacy, while federal and state regulations governing data sharing are inconsistent. Researchers themselves have a proprietary interest in data they produce, while academic researchers seeking to maximize publications may guard data jealously.

Though funding bodies are aware of this tension, they are unable to resolve it on their own. The National Institutes of Health, for example, requires a data sharing plan for big-ticket funding but recognizes that proprietary interests may make sharing impossible….(More)”.

#TrendingLaws: How can Machine Learning and Network Analysis help us identify the “influencers” of Constitutions?


Unicef: “New research by scientists from UNICEF’s Office of Innovation — published today in the journal Nature Human Behaviour — applies methods from network science and machine learning to constitutional law.  UNICEF Innovation Data Scientists Alex Rutherford and Manuel Garcia-Herranz collaborated with computer scientists and political scientists at MIT, George Washington University, and UC Merced to apply data analysis to the world’s constitutions over the last 300 years. This work sheds new light on how to better understand why countries’ laws change and incorporate social rights…

Data science techniques allow us to use methods like network science and machine learning to uncover patterns and insights that are hard for humans to see. Just as we can map influential users on Twitter — and patterns of relations between places to predict how diseases will spread — we can identify which countries have influenced each other in the past and what are the relations between legal provisions.
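As a minimal sketch of the kind of influence mapping described above, the toy Python below infers possible influence links from hypothetical adoption years of a single constitutional provision. The countries, years, and the 40-year window are illustrative assumptions, not the study’s actual data or method:

```python
from collections import defaultdict

# Hypothetical adoption years of one provision across a few
# constitutions -- illustrative values, not from the study.
adoption = {"Mexico": 1917, "Weimar Germany": 1919, "Ireland": 1937,
            "India": 1950, "Kenya": 2010}

# A crude influence heuristic: country A may have influenced country B
# if B adopted the provision within 40 years after A did.
WINDOW = 40
influence = defaultdict(set)
for a, year_a in adoption.items():
    for b, year_b in adoption.items():
        if a != b and 0 < year_b - year_a <= WINDOW:
            influence[a].add(b)

# Rank countries by how many later adopters fall inside their window.
for country, influenced in sorted(influence.items(),
                                  key=lambda kv: -len(kv[1])):
    print(country, "->", sorted(influenced))
```

The published work builds far richer networks from the full text of constitutions; this sketch only illustrates the idea of turning adoption timing into directed links.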

Why The Science of Constitutions?

One way UNICEF fulfills its mission is through advocacy with national governments — to enshrine rights for minorities, notably children, formally in law. Perhaps the most renowned example of this is the UN Convention on the Rights of the Child (CRC).

Constitutions, such as Mexico’s 1917 constitution — the first to limit the employment of children — are critical to formalizing rights for vulnerable populations. National constitutions describe the role of a country’s institutions, its character in the eyes of the world, as well as the rights of its citizens.

From a scientific standpoint, the work is an important first step in showing that network analysis and machine learning techniques can be used to better understand the dynamics of caring for and protecting the rights of children — critical to the work we do in a complex and interconnected world. It shows the significant and positive policy implications of using data science to uphold children’s rights.

What the Research Shows:

Through this research, we uncovered:

  • A network of relationships between countries and their constitutions.
  • A natural progression of laws — where fundamental rights are a necessary precursor to more specific rights for minorities.
  • The effect of key historical events in changing legal norms….(More)”.

The Government-Citizen Disconnect


Book by Suzanne Mettler: “Americans’ relationship to the federal government is paradoxical. Polls show that public opinion regarding the government has plummeted to all-time lows, with only one in five saying they trust the government or believe that it operates in their interest. Yet, at the same time, more Americans than ever benefit from some form of government social provision. Political scientist Suzanne Mettler calls this growing gulf between people’s perceptions of government and the actual role it plays in their lives the “government-citizen disconnect.” In The Government-Citizen Disconnect, she explores the rise of this phenomenon and its implications for policymaking and politics.

Drawing from original survey data which probed Americans’ experiences of 21 federal social policies — such as food stamps, Social Security, Medicaid, and the home mortgage interest deduction — Mettler shows that 96 percent of adults have received benefits from at least one of them, and that the average person has utilized five. Overall usage rates transcend social, economic, and political divisions, and most Americans report positive experiences with these policies. However, the fact that they have benefited from these policies bears little positive effect on people’s attitudes towards government. Mettler finds that shared identities and group affiliations are more powerful and consistent influences. In particular, those who oppose welfare tend to extrapolate their unfavorable views of it to government in general. Deep antipathy toward the government has emerged as a conservative movement waged a war on social welfare policies for over forty years, even as economic inequality and benefit use increased.

Mettler finds that patterns of political participation exacerbate the government-citizen disconnect, as those holding positive views of federal programs and supporting expanded benefits have lower rates of involvement than those holding more hostile views of the government. As a result, the loudest political voice belongs to those who have benefited from policies but who give government little credit for their economic well-being, seeing their success more as a matter of their own deservingness. This contributes to the election of politicians who advocate cutting federal social programs. According to Mettler, the government-citizen disconnect frays the bonds of representative government and democracy.

The Government-Citizen Disconnect illuminates a paradox that increasingly shapes American politics. Mettler’s examination of hostility toward government at a time when most Americans will at some point rely on the social benefits it provides helps us better understand the roots of today’s fractious political climate….(More)”

Satellites can advance sustainable development by highlighting poverty


Cordis: “Estimating poverty is crucial for improving policymaking and advancing the sustainability of a society. Traditional poverty estimation methods such as household surveys and census data incur huge costs however, creating a need for more efficient approaches.

With this in mind, the EU-funded USES project examined how satellite images could be used to estimate household-level poverty in rural regions of developing countries. “This promises to be a radically more cost-effective way of monitoring and evaluating the Sustainable Development Goals,” says Dr Gary Watmough, USES collaborator and Interdisciplinary Lecturer in Land Use and Socioecological Systems at the University of Edinburgh, United Kingdom.

Land use and land cover reveal poverty clues

To achieve its aims, the project investigated how land use and land cover information from satellite data could be linked with household survey data. “We looked particularly at how households use the landscape in the local area for agriculture and other purposes such as collecting firewood and using open areas for grazing cattle,” explains Dr Watmough.

The work also involved examining satellite images to determine which types of land use were related to household wealth or poverty using statistical analysis. “By trying to predict household poverty using the land use data we could see which land use variables were most related to the household wealth in the area,” adds Dr Watmough.

Overall, the USES project found that satellite data could predict poverty, particularly for the poorest households in an area. Dr Watmough comments: “This is quite remarkable given that we are trying to predict complicated household-level poverty from a simple land use map derived from high-resolution satellite data.”

A study conducted by USES in Kenya found that the most important remotely sensed variable was building size within the homestead. Buildings smaller than 140 m2 were mostly associated with poorer households, whereas those over 140 m2 tended to belong to wealthier ones. The amount of bare ground in agricultural fields and within the homestead region was also important. “We also found that poorer households were associated with a shorter number of agricultural growing days,” says Dr Watmough….(More)”.
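The building-size threshold and growing-days variable suggest a simple decision rule of the kind such models capture. The sketch below is a hypothetical illustration only: the 140 m2 cut-off comes from the article, but the household values, the 180-day season cut-off, and the rule itself are invented, and the project’s actual statistical analysis was more sophisticated:

```python
# Hypothetical rule: flag a homestead as likely poorer when its main
# building is small AND its growing season is short. The 140 m2
# threshold is from the article; everything else is invented.
def likely_poorer(building_m2, growing_days, season_cutoff=180):
    return building_m2 < 140 and growing_days < season_cutoff

households = [
    {"id": 1, "building_m2": 95,  "growing_days": 150},
    {"id": 2, "building_m2": 210, "growing_days": 200},
    {"id": 3, "building_m2": 120, "growing_days": 190},
]
flagged = [h["id"] for h in households
           if likely_poorer(h["building_m2"], h["growing_days"])]
print(flagged)  # only household 1 satisfies both invented conditions
```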

The Democratization of Data Science


Jonathan Cornelissen in Harvard Business Review: “Want to catch tax cheats? The government of Rwanda does — and it’s finding them by studying anomalies in revenue-collection data.
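One common way to screen revenue data for anomalies is a z-score test over a filer’s declared amounts. The sketch below is an assumed illustration of that generic technique, not Rwanda’s actual method; the figures are invented:

```python
import statistics

# Invented monthly declarations for one filer; one month is suspiciously low.
declared = [1200, 1150, 1210, 1180, 300, 1190]

mean = statistics.mean(declared)
stdev = statistics.stdev(declared)  # sample standard deviation

# Flag months whose declaration is more than 2 standard deviations
# away from the filer's own average.
anomalies = [i for i, x in enumerate(declared)
             if abs(x - mean) / stdev > 2]
print(anomalies)
```

In practice a tax authority would combine many such signals (cross-filer comparisons, sector baselines, year-on-year trends) rather than a single univariate screen.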

Want to understand how American culture is changing? So does a budding sociologist in Indiana. He’s using data science to find patterns in the massive amounts of text people use each day to express their worldviews — patterns that no individual reader would be able to recognize.
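A minimal sketch of that pattern-finding idea: compare word frequencies across two text samples to surface rising terms. The snippets and vocabulary below are invented for illustration; real work of this kind uses far larger corpora and richer models:

```python
from collections import Counter
import re

# Invented snippets standing in for text from two periods.
texts_2000s = "community church family duty community family"
texts_2010s = "identity network platform self identity network"

def freqs(text):
    # Lowercase and count alphabetic tokens.
    return Counter(re.findall(r"[a-z]+", text.lower()))

f_old, f_new = freqs(texts_2000s), freqs(texts_2010s)

# Terms that appear more often in the later sample than the earlier one.
rising = {w for w in f_new if f_new[w] > f_old.get(w, 0)}
print(sorted(rising))
```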

Intelligent people find new uses for data science every day. Still, despite the explosion of interest in the data collected by just about every sector of American business — from financial companies and health care firms to management consultancies and the government — many organizations continue to relegate data-science knowledge to a small number of employees.

That’s a mistake — and in the long run, it’s unsustainable. Think of it this way: Very few companies expect only professional writers to know how to write. So why ask only professional data scientists to understand and analyze data, at least at a basic level?

Relegating all data knowledge to a handful of people within a company is problematic on many levels. Data scientists find it frustrating because it’s hard for them to communicate their findings to colleagues who lack basic data literacy. Business stakeholders are unhappy because data requests take too long to fulfill and often fail to answer the original questions. In some cases, that’s because the questioner failed to explain the question properly to the data scientist.

Why would non–data scientists need to learn data science? That’s like asking why non-accountants should be expected to stay within budget.

These days every industry is drenched in data, and the organizations that succeed are those that most quickly make sense of their data in order to adapt to what’s coming. The best way to enable fast discovery and deeper insights is to disperse data science expertise across an organization.

Companies that want to compete in the age of data need to do three things: share data tools, spread data skills, and spread data responsibility…(More)”.

The Political Value of Time: Citizenship, Duration, and Democratic Justice


Book by Elizabeth F. Cohen: “Waiting periods and deadlines are so ubiquitous that we often take them for granted. Yet they form a critical part of any democratic architecture. When a precise moment or amount of time is given political importance, we ought to understand why this is so. The Political Value of Time explores the idea of time within democratic theory and practice. Elizabeth F. Cohen demonstrates how political procedures use quantities of time to confer and deny citizenship rights. Using specific dates and deadlines, states carve boundaries around a citizenry. As time is assigned a form of political value it comes to be used to transact over rights. Cohen concludes with a normative analysis of the ways in which the devaluation of some people’s political time constitutes a widely overlooked form of injustice. This book shows readers how and why they need to think about time if they want to understand politics….(More)“.

Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject


Nick Couldry and Ulises Mejias in Television & New Media (TVNM): “...Data colonialism combines the predatory extractive practices of historical colonialism with the abstract quantification methods of computing. Understanding Big Data from the Global South means understanding capitalism’s current dependence on this new type of appropriation that works at every point in space where people or things are attached to today’s infrastructures of connection. The scale of this transformation means that it is premature to map the forms of capitalism that will emerge from it on a global scale. Just as historical colonialism over the long-run provided the essential preconditions for the emergence of industrial capitalism, so over time, we can expect that data colonialism will provide the preconditions for a new stage of capitalism that as yet we can barely imagine, but for which the appropriation of human life through data will be central.

Right now, the priority is not to speculate about that eventual stage of capitalism, but to resist the data colonialism that is under way. This is how we understand Big Data from the South. Through what we call ‘data relations’ (new types of human relations which enable the extraction of data for commodification), social life all over the globe becomes an ‘open’ resource for extraction that is somehow ‘just there’ for capital. These global flows of data are as expansive as historic colonialism’s appropriation of land, resources, and bodies, although the epicentre has somewhat shifted. Data colonialism involves not one pole of colonial power (‘the West’), but at least two: the USA and China. This complicates our notion of the geography of the Global South, a concept which until now helped situate resistance and disidentification along geographic divisions between former colonizers and colonized. Instead, the new data colonialism works both externally — on a global scale — and internally on its own home populations. The elites of data colonialism (think of Facebook) benefit from colonization in both dimensions, and North-South, East-West divisions no longer matter in the same way.

It is important to acknowledge both the apparent similarities and the significant differences between our argument and the many preceding critical arguments about Big Data…(More)”

A rationale for data governance as an approach to tackle recurrent drawbacks in open data portals


Conference paper by Juan Ribeiro Reis et al: “Citizens and developers are gaining broad access to public data sources, made available in open data portals. These machine-readable datasets enable the creation of applications that help the population in several ways, giving them the opportunity to participate actively in governance processes such as decision-making and policy-making.

While the number of open data portals grows over the years, researchers have identified recurrent problems with the data they provide, such as a lack of data standards, difficulty of data access, and poor understandability. Such issues make the effective use of the data difficult. Several works in the literature propose different approaches to mitigating these issues, based on novel or well-known data management techniques.

However, there is a lack of general frameworks for tackling these problems. On the other hand, data governance has been applied in large companies to manage data problems, ensuring that data meets business needs and becomes an organizational asset. In this paper, we first highlight the main drawbacks pointed out in the literature for government open data portals. We then discuss how data governance can tackle many of the issues identified…(More)”.