New Data Portal to analyze governance in Africa


Africa’s health won’t improve without reliable data and collaboration


From The Conversation: “…Africa has a data problem. This is true in many sectors. When it comes to health there’s both a lack of basic population data about disease and an absence of information about what impact, if any, interventions involving social determinants of health – housing, nutrition and the like – are having.

Simply put, researchers often don’t know who is sick or what people are being exposed to that, if addressed, could prevent disease and improve health. They cannot say if poor sanitation is the biggest culprit, or if substandard housing in a particular region is to blame. They don’t have the data that explains which populations are most vulnerable.

These data are required to inform development of innovative interventions that apply a “Health in All Policies” approach to address social determinants of health and improve health equity.

To address this, health data need to be integrated with social determinant data about areas like food, housing, and physical activity or mobility. Even where population data are available, they are not always reliable. There’s often an issue of compatibility: different sectors collect different kinds of information using varying methodologies.

Different sectors also use different indicators to collect information on the same social determinant of health. This makes data integration challenging.

Without clear, focused, reliable data it’s difficult to understand what a society’s problems are and what specific solutions – which may lie outside the health sector – might be suitable for that unique context.

Scaling up innovations

Some remarkable work is being done to tackle Africa’s health problems. This ranges from technological innovations to harnessing indigenous knowledge for change. Both approaches are vital. But it’s hard for these to be scaled up either in terms of numbers or reach.

This boils down to a lack of funding or a lack of access to funding. Too many potentially excellent projects remain stuck at the pilot phase, which has limited value for ordinary people…

Governments need to develop health equity surveillance systems to overcome the current lack of data. It’s also crucial that governments integrate and monitor health and social determinants of health indicators in one central system. This would provide a better understanding of health inequity in a given context.

For this to happen, governments must work with public and private sector stakeholders and nongovernmental organisations – not just in health, but beyond it so that social determinants of health can be better measured and captured.

The data that already exist at sub-national, national, regional and continental levels mustn’t just be brushed aside. They should be archived and digitised so that they aren’t lost.

Researchers have a role to play here. They have to harmonise and be innovative in the methodologies they use for data collection. If researchers can work together across the breadth of sectors and disciplines that influence health, important information won’t slip through the cracks.

When it comes to scaling up innovation, governments need to step up to the plate. It’s crucial that they support successful health innovations, whether these are rooted in indigenous knowledge or are new technologies. And since – as we’ve already shown – health issues aren’t the exclusive preserve of the health sector, governments should look to different sectors and innovative partnerships to generate support and funding….(More)”

The ethical impact of data science


Theme issue of Phil. Trans. R. Soc. A compiled and edited by Mariarosaria Taddeo and Luciano Floridi: “This theme issue has the founding ambition of landscaping data ethics as a new branch of ethics that studies and evaluates moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values). Data ethics builds on the foundation provided by computer and information ethics but, at the same time, it refines the approach endorsed so far in this research field, by shifting the level of abstraction of ethical enquiries, from being information-centric to being data-centric. This shift brings into focus the different moral dimensions of all kinds of data, even data that never translate directly into information but can be used to support actions or generate behaviours, for example. It highlights the need for ethical analyses to concentrate on the content and nature of computational operations—the interactions among hardware, software and data—rather than on the variety of digital technologies that enable them. And it emphasizes the complexity of the ethical challenges posed by data science. Because of such complexity, data ethics should be developed from the start as a macroethics, that is, as an overall framework that avoids narrow, ad hoc approaches and addresses the ethical impact and implications of data science and its applications within a consistent, holistic and inclusive framework. Only as a macroethics will data ethics provide solutions that can maximize the value of data science for our societies, for all of us and for our environments….(More)”

Table of Contents:

  • The dynamics of big data and human rights: the case of scientific research; Effy Vayena, John Tasioulas
  • Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue; Sebastian Porsdam Mann, Julian Savulescu, Barbara J. Sahakian
  • Faultless responsibility: on the nature and allocation of moral responsibility for distributed moral actions; Luciano Floridi
  • Compelling truth: legal protection of the infosphere against big data spills; Burkhard Schafer
  • Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems; Sabina Leonelli
  • Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy; Deirdre K. Mulligan, Colin Koopman, Nick Doty
  • Beyond privacy and exposure: ethical issues within citizen-facing analytics; Peter Grindrod
  • The ethics of smart cities and urban science; Rob Kitchin
  • The ethics of big data as a public good: which public? Whose good? Linnet Taylor
  • Data philanthropy and the design of the infraethics for information societies; Mariarosaria Taddeo
  • The opportunities and ethics of big data: practical priorities for a national Council of Data Ethics; Olivia Varley-Winter, Hetan Shah
  • Data science ethics in government; Cat Drew
  • The ethics of data and of data science: an economist’s perspective; Jonathan Cave
  • What’s the good of a science platform? John Gallacher

 

Between Governance of the Past and Technology of the Future


Think Piece by Heather Grabbe for the ESPAS 2016 conference: “In many parts of everyday life, voters are used to a consumer experience where they get instant feedback and personal participation; but party membership, ballot boxes and stump speeches do not offer the same speed, control or personal engagement. The institutions of representative democracy at national and EU level — political parties, elected members, law-making — do not offer the same quality of experience for their ultimate consumers.

This matters because it is causing voters to switch off. Broad participation by most of the population in the practice of democracy is vital for societies to remain open because it ensures pluralism and prevents takeover of power by narrow interests. But in some countries and some elections, turnout is regularly below a third of registered voters, especially in European Parliament elections.

The internet is driving the major trends that create this disconnection and disruption. Here are four vital areas in which politics should adapt, including at EU level:

  • Expectation. Voters have a growing sense that political parties and law-making are out of touch, but not that politics is irrelevant. …
  • Affiliation. … people are interested in new forms of affiliation, especially through social media and alternative networks. …
  • Location. Digital technology allows people to find myriad new ways to express their political views publicly, outside of formal political spaces. …
  • Information. The internet has made vast amounts of data and a huge range of information sources across an enormous spectrum of issues available to every human with an internet connection. How is this information overload affecting engagement with politics? ….(More)”

Government for a Digital Economy


Chapter by Zoe Baird in America’s National Security Architecture: Rebuilding the Foundation: “The private sector is transforming at record speed for the digital economy. As recently as 2008, when America elected President Obama, most large companies had separate IT departments, which were seen as just that—departments—separate from the heart of the business. Now, as wireless networks connect the planet, and entire companies exist in the cloud, digital technology is no longer viewed as another arrow in the corporate quiver, but rather the very foundation upon which all functions are built. This, then, is the mark of the digital era: in order to remain successful, modern enterprises must both leverage digital technology and develop a culture that values its significance within the organization.

For the federal government to help all Americans thrive in this new economy, and for the government to be an engine of growth, it too must enter the digital era. On a basic level, we need to improve the government’s digital infrastructure and use technology to deliver government services better. But a government for the digital economy needs to take bold steps to embed these actions as part of a large and comprehensive transformation in how it goes about the business of governing. We should not only call on the “IT department” to provide tools, we must completely change the way we think about how a digital age government learns about the world, makes policy, and operates against its objectives.

Government today does not reflect the fundamental attributes of the digital age. It moves slowly at a time when information travels around the globe at literally the speed of light. It takes many years to develop and implement comprehensive policy in a world characterized increasingly by experimentation and iterative midcourse adjustments. It remains departmentally balkanized and hierarchical in an era of networks and collaborative problem solving. It assumes that it possesses the expertise necessary to make decisions while most of the knowledge resides at the edges. It is bogged down in legacy structures and policy regimes that do not take advantage of digital tools, and worse, create unnecessary barriers that hold progress back. Moreover, it is viewed by its citizens as opaque and complex in an era when openness and access are attributes of legitimacy….(More)”

Make Democracy Great Again: Let’s Try Some ‘Design Thinking’


Ken Carbone in the Huffington Post: “Allow me to begin with the truth. I’ve never studied political science, run for public office nor held a position in government. For the last forty years I’ve led a design agency working with enduring brands across the globe. As with any experienced person in my profession, I have used research, deductive reasoning, logic and “design thinking” to solve complex problems and create opportunities. Great brands that are showing their age turn to our agency to get back on course. In this light, I believe American democracy is a prime target for some retooling….

The present campaign cycle has left many voters wondering how such divisiveness and national embarrassment could be happening in the land of the free and home of the brave. This could be viewed as symptomatic of deeper structural problems in our tradition-bound, 240-year-old democracy. Great brands operate on an “innovate or die” model to ensure success. The continual improvement of how a business operates and adapts to market conditions is a sound and critical practice.

Although the current election frenzy will soon be over, I want to examine three challenges to our election process and propose possible solutions for consideration. I’ll use the same diagnostic thinking I use with major corporations:

Term Limits…

Voting and Voter registration…

Political Campaigns…

In June of this year I attended the annual leadership conference of AIGA, the professional association for design, in Raleigh, NC. A provocative question posed to a select group of designers was “What would you do if you were Secretary of Design?” The responses addressed issues concerning positive social change, education and Veterans Affairs. The audience was full of several hundred trained professionals whose everyday problem-solving methods encourage divergent thinking to explore many solutions (possible or impossible) and then use convergent thinking to select and realize the best resolution. This is the very definition of “design thinking.” That leads to progress….(More)”.

Digital Government: Leveraging Innovation to Improve Public Sector Performance and Outcomes for Citizens


Book edited by Svenja Falk, Andrea Römmele and Michael Silverman: “This book focuses on the implementation of digital strategies in the public sectors in the US, Mexico, Brazil, India and Germany. The case studies presented examine different digital projects by looking at their impact as well as their alignment with their national governments’ digital strategies. The contributors assess the current state of digital government, analyze the contribution of digital technologies in achieving outcomes for citizens, discuss ways to measure digitalization and address the question of how governments oversee the legal and regulatory obligations of information technology. The book argues that most countries formulate good strategies for digital government, but do not effectively prescribe and implement corresponding policies and programs. Showing specific programs that deliver results can help policy makers, knowledge specialists and public-sector researchers to develop best practices for future national strategies….(More)”

Crowd-sourcing pollution control in India


Springwise: “Following orders by the national government to improve the air quality of the New Delhi region by reducing air pollution, the Environment Pollution (Prevention and Control) Authority created the Hawa Badlo app. Designed for citizens to report cases of air pollution, each complaint is sent to the appropriate official for resolution.

Free to use, the app is available for both iOS and Android. Complaints are geo-tagged, and there are two different versions available – one for citizens and one for government officials. Officials must provide photographic evidence to close a case. The app itself produces weekly reports listing the number and status of complaints, along with any actions taken to resolve the problem. Currently focusing on pollution from construction, unpaved roads and the burning of garbage, the team behind the app plans to expand its use to cover other types of pollution as well.

From providing free wi-fi when the air is clean enough to mapping air-quality in real-time, air pollution solutions are increasingly involving citizens….(More)”

Open data aims to boost food security prospects


Mark Kinver at BBC News: “Rothamsted Research, a leading agricultural research institution, is attempting to make data from long-term experiments available to all.

In partnership with a data consultancy, it is developing a method to make complex results accessible and usable.

The institution is a member of the Godan Initiative, which aims to make data available to the scientific community.

In September, Godan called on the public to sign its global petition to open agricultural research data.

“The continuing challenge we face is that the raw data alone is not sufficient enough on its own for people to make sense of it,” said Chris Rawlings, head of computational and systems biology at Rothamsted Research.

“This is because the long-term experiments are very complex, and they are looking at agriculture and agricultural ecosystems so you need to know a lot about what the intention of the studies are, how they are being used, and the changes that have taken place over time.”

However, he added: “Even with this level of complexity, we do see a significant number of users contacting us or developing links with us.”

One size fits all

The ability to provide open data to all is one of the research organisation’s national capabilities, and forms a defining principle of its web portal to the experiments carried out at its North Wyke Farm Platform in North Devon.

Rothamsted worked in partnership with Tessella, a data consultancy, on the data collected from the experiments, which focused on livestock pastures.

The information being collected, as often as every 15 minutes, includes water run-off levels, soil moisture, meteorological data, and soil nutrients, and this is expected to run for decades.

“The data is quite varied and quite diverse, and [Rothamsted] wants to make this data available to the wider research community,” explained Tessella’s Andrew Bowen.

“What Rothamsted needed was a way to store it and a way to present it in a portal in which people could see what they had to offer.”

He told BBC News that there were a number of challenges that needed to be tackled.

One was the management of the data, and the team from Tessella adopted an “agile scrum” approach.

“Basically, what you do is draw up a list of the requirements, of what you need, and we break the project down into short iterations, starting with the highest priority,” he said.

“This means that you are able to take a more exploratory approach to the process of developing software. This is very well suited to the research environment.”…(More)”

Self-organised scientific crowds to remedy research bureaucracy


From EuroScientist: “Imagine a world without peer review committees, project proposals or activity reports. Imagine a world where research funds seamlessly flow where they are best employed, like nutrients in a food-web or materials in a river network. Many scientists would immediately sign up to live in such a world.

The Netherlands is set to become the place where this academic paradise will be tested in the next few years. In July 2016, the Dutch parliament approved a motion related to implementing alternative funding procedures to alleviate the research bureaucracy, which is increasingly burdening scientists. Here EuroScientist investigates whether the self-organising power of the scientific community could help resolve one of researchers’ worst burdens.

Self-organisation

The Dutch national funding agency is planning to adopt a radically new system to allocate part of its funding, promoted by ecologist Marten Scheffer, who is professor of aquatic ecology and water quality management at Wageningen University and Research Centre. Under the proposed approach, funds would initially be evenly divided among all scientists in the country. Each scientist would then have to allocate half of what they have received to the person who, in their opinion, is the most deserving scientist in their network, and the process would be iterated.
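
For illustration only, here is a minimal Python sketch of that iterated redistribution. The number of scientists, the bias in whom people name as “most deserving” and the number of rounds are invented assumptions for the toy model, not details of the Dutch proposal.

```python
import random
from collections import defaultdict

def simulate_redistribution(n_scientists=100, total_budget=1_000_000.0,
                            rounds=10, seed=42):
    """Toy model of the 'give half to your most deserving peer' scheme.

    Assumptions (not part of the Dutch proposal): each scientist names one
    fixed peer, chosen at random with a bias towards a small group of widely
    admired colleagues.
    """
    rng = random.Random(seed)
    scientists = list(range(n_scientists))

    # The first 10% of scientists are "widely admired" and named more often.
    weights = [10.0 if i < n_scientists // 10 else 1.0 for i in scientists]
    most_deserving = {}
    for i in scientists:
        others = [j for j in scientists if j != i]
        most_deserving[i] = rng.choices(others, weights=[weights[j] for j in others])[0]

    # Funds start out evenly divided among everyone.
    funds = {i: total_budget / n_scientists for i in scientists}

    for _ in range(rounds):
        transfers = defaultdict(float)
        for i in scientists:
            give = funds[i] / 2                    # each scientist passes on half...
            funds[i] -= give
            transfers[most_deserving[i]] += give   # ...to their chosen peer
        for j, amount in transfers.items():
            funds[j] += amount

    return funds

if __name__ == "__main__":
    allocations = sorted(simulate_redistribution().values(), reverse=True)
    print("Five largest allocations after 10 rounds:", [round(a) for a in allocations[:5]])
```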

The promoters of the system believe that the “wisdom of the crowd” of the scientific community would assign more funds to the most deserving scientists among them, with a minimal amount of paperwork. The Dutch initiative is part of a broader effort to use a scientific approach to improve science.

In other words, it is part of a trend aiming to employ scientific evidence to tweak the social mechanisms of academia. Specifically, findings from what is known as complexity research are increasingly brought forward as a way of reducing bureaucracy, removing red tape, and maximising the time scientists spend thinking….

Abandoning the current bureaucratic, top-down system to evaluate and fund research, based on labour-intensive peer-review, may not be too much of a loss. “Peer-review is an imperfect, fragile mechanism. Our simulations show that assigning funds at random would not distort too much the results of the traditional mechanism,” says Flaminio Squazzoni, an economist at the University of Brescia, Italy, and the coordinator of the PEERE-New Frontiers of Peer Review COST action.

In reality, peer review is never quite neutral. “If scientists behave perfectly, then peer review works,” Squazzoni explains, “but if strategic motivations are taken into account, like saving time or competition, then the results are worse than random.” Squazzoni believes that automation, economic incentives, or the creation of professional reviewers may improve the situation….(More)”
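
The kind of comparison Squazzoni refers to can be sketched as a toy simulation. The merit values, noise level and “strategic” penalty below are entirely invented and are not taken from the PEERE studies; how the three mechanisms rank depends wholly on those assumptions, so the sketch only shows the shape of such an analysis, not its conclusions.

```python
import random

def funded(scores, fraction=0.2):
    """Indices of the top `fraction` of proposals ranked by score."""
    k = max(1, int(len(scores) * fraction))
    return set(sorted(range(len(scores)), key=scores.__getitem__, reverse=True)[:k])

def toy_peer_review(n=500, noise=0.3, strategic_penalty=0.6, trials=200, seed=1):
    """Toy comparison (invented parameters): share of the 'truly best' proposals
    funded under honest noisy review, 'strategic' review in which reviewers
    down-score their strongest rivals, and a pure lottery."""
    rng = random.Random(seed)
    totals = {"honest": 0.0, "strategic": 0.0, "lottery": 0.0}
    for _ in range(trials):
        merit = [rng.random() for _ in range(n)]              # unobservable true quality
        honest = [m + rng.gauss(0, noise) for m in merit]     # noisy but unbiased scores
        cutoff = sorted(merit, reverse=True)[int(n * 0.2)]
        # Strategic reviewers shave points off the proposals that threaten them most.
        strategic = [h - strategic_penalty * rng.random() if m >= cutoff else h
                     for m, h in zip(merit, honest)]
        lottery = [rng.random() for _ in range(n)]            # funding by lot
        best = funded(merit)
        for name, scores in (("honest", honest), ("strategic", strategic),
                             ("lottery", lottery)):
            totals[name] += len(best & funded(scores)) / len(best)
    return {name: total / trials for name, total in totals.items()}

if __name__ == "__main__":
    for name, share in toy_peer_review().items():
        print(f"{name:>9}: {share:.2f} of the 'truly best' proposals funded")
```

Varying the noise and penalty parameters is what lets models of this sort explore when a lottery starts to match, or beat, conventional review.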