Global innovations in measurement and evaluation


Report by Andrew Weston, Anne Kazimirski, Anoushka Kenley, Rosie McLeod and Ruth Gripper: “Measurement and evaluation is core to good impact practice. It helps us understand what works, how it works and how we can achieve more. Good measurement and evaluation involves reflective, creative, and proportionate approaches. It makes the most of existing theoretical frameworks as well as new digital solutions, and focuses on learning and improving. We researched the latest changes in theory and practice based on both new and older, renascent ideas. We spoke to leading evaluation experts from around the world, to ask what’s exciting them, what people are talking about and what is most likely to make a long-lasting contribution to evaluation. And we found that new thinking, techniques, and technology are influencing and improving practice.

Technology is enabling us to gather different types of data on bigger scales, helping us gain insights or spot patterns we could not see before. Advances in systems to capture, manage and share sensitive data are helping organisations that want to work collaboratively, while moves towards open data are providing better access to data that can be linked together to generate even greater insight. Traditional models of evaluating a project once it has finished are being overtaken by methods that feed more dynamically into service design. We are learning from the private sector, where real-time feedback shapes business decisions on an ongoing basis, asking ‘is it working?’ instead of ‘did it work?’.

And approaches that focus on assessing not just if something works but how and why, for whom, and under what conditions are also generating more insight into the effectiveness of programmes. Technology may be driving many of the innovations we highlight here, but some of the most exciting developments are happening because of changes in the ideologies and cultures that inform our approach to solving big problems. This is resulting in an increased focus on listening to and involving users, and on achieving change at a systemic level—with technology simply facilitating these changes.

Some of the pressures that compel measurement and evaluation activity remain misguided. For example, there can be too big a focus on obtaining a cost-benefit ratio—regardless of the quality of the data it is based on—and not enough encouragement from funders for charities to learn from their evaluation activity. Even the positive developments have their pitfalls: new technologies pose new data protection risks, ethical hazards, and the possibility of exclusion if participation requires high levels of technical ability. It is important that, as the field develops and capabilities increase, we remain focused on achieving best practice.

This report highlights the developments that we think have the greatest potential to improve evaluation and programme design, and the careful collection and use of data. We want to celebrate what is possible, and encourage wider application of these ideas.

Choosing the innovations

In deciding which trends to include in this report, we considered how different approaches contributed to better evaluation by:

  • overcoming previous barriers to good evaluation practice, eg, through new technologies or skills;
  • providing more meaningful or robust data;
  • using data to support decision-making, learning and improving practice;
  • increasing equality between users, service deliverers and funders; and
  • offering new contexts for collaboration that improve the utility of data.

… Eight key trends emerged from our research that we thought most exciting, relevant and likely to make a long-lasting contribution. Some of these are driven by cutting-edge technology; others reflect the growing application of ideas that push practice beyond ‘traditional’ models of evaluation. User-centric and shared approaches are leading to better informed measurement and evaluation design. Theory-based evaluation and impact management embolden us to ask better research questions and obtain more useful answers. Data linkage, the availability of big data, and the possibilities arising from remote sensing are increasing the number of questions we can answer. And data visualisation opens doors to better understanding and communication of this data. Here we present each of these eight innovations and showcase examples of how organisations are using them to better understand and improve their work….(More)”

How AI Is Crunching Big Data To Improve Healthcare Outcomes


PSFK: “The state of your health shouldn’t be a mystery, nor should patients or doctors have to wait long to find answers to pressing medical concerns. In PSFK’s Future of Health Report, we dig deep into the latest in AI, big data algorithms and IoT tools that are enabling a new, more comprehensive overview of patient data collection and analysis. Machine support, patient information from medical records and conversations with doctors are combined with the latest medical literature to help form a diagnosis without detracting from doctor-patient relations.

The impact of improved AI helps patients form a baseline for well-being and is making changes all across the healthcare industry. AI not only streamlines intake processes and reduces processing volume at clinics, it also controls input and diagnostic errors within a patient record, allowing doctors to focus on patient care and communication, rather than data entry. AI also improves pattern recognition and early diagnosis by learning from multiple patient data sets.

By utilizing deep learning algorithms and software, healthcare providers can connect various libraries of medical information and scan databases of medical records, spotting patterns that lead to more accurate detection and greater breadth of efficiency in medical diagnosis and research. IBM Watson, which has previously been used to help identify genetic markers and develop drugs, is applying its neural learning networks to help doctors correctly diagnose heart abnormalities from medical imaging tests. By scanning thousands of images and learning from correct diagnoses, Watson is able to increase diagnostic accuracy, supporting doctors’ cardiac assessments.
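The idea of learning a diagnosis from labelled examples can be illustrated with a toy nearest-centroid classifier. This is a deliberately simplified stand-in for the deep neural networks a system like Watson actually uses, and the feature values and labels below are invented for illustration:

```python
# Toy nearest-centroid classifier: a highly simplified stand-in for the
# deep learning models used on medical imaging. Feature vectors and
# labels are invented for illustration only.

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: list of (features, label). Returns label -> centroid."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(vs) for label, vs in by_label.items()}

def classify(model, features):
    """Assign the label whose centroid is nearest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(model, key=lambda label: dist(model[label], features))

# Hypothetical 2-D "image features" extracted from correctly diagnosed scans
training = [
    ([1.0, 0.9], "normal"), ([1.1, 1.0], "normal"),
    ([2.0, 0.4], "abnormal"), ([2.2, 0.5], "abnormal"),
]
model = train(training)
print(classify(model, [1.05, 0.95]))  # a case close to the "normal" cluster
```

The key property the excerpt describes carries over even to this toy: accuracy improves as the model sees more correctly labelled examples, because each centroid becomes a better summary of its class.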

Outside of the doctor’s office, AI is also being used to monitor patient vitals to help create a baseline for well-being. By monitoring health on a day-to-day basis, AI systems can alert patients and medical teams to abnormalities or changes from the baseline in real time, increasing positive outcomes. Take xbird, a mobile platform that uses artificial intelligence to help diabetics understand when hypoglycemic attacks will occur. The AI combines personal and environmental data points from over 20 sensors within mobile and wearable devices to create an automated personal diary and cross references it against blood sugar levels. Patients then share this data with their doctors in order to uncover their unique hypoglycemic triggers and better manage their condition.
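xbird’s actual algorithms are not public; as a minimal sketch of the general idea, the cross-referencing step can be thought of as aligning activity-sensor events with glucose readings and counting which activities tend to precede low readings. The data, threshold and one-hour window below are all invented:

```python
# Minimal sketch of cross-referencing sensor events against blood sugar
# readings to surface candidate hypoglycemia triggers. Data, threshold
# and window are invented; this is not xbird's actual method.

HYPO_THRESHOLD = 70  # mg/dL, a common clinical cut-off

def candidate_triggers(events, glucose_readings, window=60):
    """events: list of (minute, activity); glucose_readings: (minute, mg/dL).
    Count how often each activity occurs within `window` minutes before a
    low reading, and rank activities by that count."""
    lows = [t for t, level in glucose_readings if level < HYPO_THRESHOLD]
    counts = {}
    for t_event, activity in events:
        if any(0 <= t_low - t_event <= window for t_low in lows):
            counts[activity] = counts.get(activity, 0) + 1
    return sorted(counts.items(), key=lambda kv: -kv[1])

# Invented automated diary: minutes since midnight
events = [(420, "run"), (600, "meal"), (840, "run"), (1020, "meal")]
glucose = [(450, 62), (630, 110), (880, 65), (1050, 95)]
print(candidate_triggers(events, glucose))  # "run" precedes both lows
```

A ranking like this is something a patient could plausibly share with a doctor as a starting point for discussing personal triggers, which is the workflow the excerpt describes.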

In China, meanwhile, web provider Baidu has debuted Melody, a chat-based medical assistant that helps individuals communicate their symptoms, learn of possible diagnoses and connect to medical experts….(More)”.

From binoculars to big data: Citizen scientists use emerging technology in the wild


Interview by Rebecca Kondos: “For years, citizen scientists have trekked through local fields, rivers, and forests to observe, measure, and report on species and habitats with notebooks, binoculars, butterfly nets, and cameras in hand. It’s a slow process, and the gathered data isn’t easily shared. It’s a system that has worked to some degree, but one that’s in need of a technology and methodology overhaul.

Thanks to the team behind Wildme.org and their Wildbook software, both citizen and professional scientists are becoming active participants in using AI, computer vision, and big data. Wildbook is working to transform the data collection process, and citizen scientists who use the software have more transparency into conservation research and the impact it’s making. As a result, engagement levels have increased; scientists can more easily share their work; and, most important, endangered species like the whale shark benefit.

In this interview, Colin Kingen, a software engineer for Wildbook (with assistance from his colleagues Jason Holmberg and Jon Van Oast), discusses Wildbook’s work, explains classic problems in field observation science, and shares how Wildbook is working to solve some of the big problems that have plagued wildlife research. He also addresses something I’ve wondered about: why isn’t there an “uberdatabase” to share the work of scientists across all global efforts? The work Kingen and his team are doing exemplifies what can be accomplished when computer scientists with big hearts apply their talents to saving wildlife….(More)”.

Open data on universities – New fuel for transformation


François van Schalkwyk at University World News: “Accessible, usable and relevant open data on South African universities makes it possible for a wide range of stakeholders to monitor, advise and challenge the transformation of South Africa’s universities from an informed perspective.

Some describe data as the new oil while others suggest it is a new form of capital or compare it to electricity. Either way, there appears to be a groundswell of interest in the potential of data to fuel development.

Whether the proliferation of data is skewing development in favour of globally networked elites or disrupting existing asymmetries of information and power, is the subject of ongoing debate. Certainly, there are those who will claim that open data, from a development perspective, could catalyse disruption and redistribution.

Open data is data that is free to use without restriction. Governments and their agencies, universities and their researchers, non-governmental organisations and their donors, and even corporations, are all potential sources of open data.

Open government data, as a public rather than a private resource, embedded in principles of universal access, participation and transparency, is touted as being able to restore the deteriorating levels of trust between citizens and their governments.

Open data promises to do so by making the decisions and processes of the state more transparent and inclusive, empowering citizens to participate and to hold public institutions to account for the distribution of public services and resources.

Benefits of open data

Open data has other benefits over its more cloistered cousins (data in private networks, big data, etc). By democratising access, open data makes possible the use of data on, for example, health services, crime, the environment, procurement and education by a range of different users, each bringing their own perspective to bear on the data. This can expose bias in the data or may improve the quality of the data by surfacing data errors. Both are important when data is used to shape government policies.
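One concrete way re-use surfaces data errors is simple consistency checking: different users run their own validation rules over a published dataset and report the violations they find. The records and rules below are invented for illustration:

```python
# Minimal sketch of how re-use of open data can surface errors: run
# consistency rules over published records and report violations.
# The records and rules are invented for illustration.

def validate(records, rules):
    """Return (record_index, rule_name) for every rule a record violates."""
    problems = []
    for i, record in enumerate(records):
        for name, check in rules.items():
            if not check(record):
                problems.append((i, name))
    return problems

# Hypothetical open dataset of school performance records
records = [
    {"school": "A", "pupils": 420, "pass_rate": 0.87},
    {"school": "B", "pupils": -5, "pass_rate": 0.91},   # impossible count
    {"school": "C", "pupils": 310, "pass_rate": 1.40},  # rate above 100%
]
rules = {
    "pupils_positive": lambda r: r["pupils"] > 0,
    "pass_rate_in_range": lambda r: 0.0 <= r["pass_rate"] <= 1.0,
}
print(validate(records, rules))
```

Each user brings their own rules, so the more perspectives applied to the same open dataset, the more errors and biases come to light — the feedback loop the paragraph describes.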

By removing barriers to reusing data such as copyright or licence-fees, tech-savvy entrepreneurs can develop applications to assist the public to make more informed decisions by making available easy-to-understand information on medicine prices, crime hot-spots, air quality, beneficial ownership, school performance, etc. And access to open research data can improve quality and efficiency in science.

Scientists can check and confirm the data on which important discoveries are based if the data is open, and, in some cases, researchers can reuse open data from other studies, saving them the cost and effort of collecting the data themselves.

‘Open washing’

But access alone is not enough for open data to realise its potential. Open data must also be used. And data is used if it holds some value for the user. Governments have been known to publish server rooms full of data that no one is interested in to support claims of transparency and supporting the knowledge economy. That practice is called ‘open washing’. …(More)”

Bangalore Taps Tech Crowdsourcing to Fix ‘Unruly’ Gridlock


Saritha Rai at Bloomberg Technology: “In Bangalore, tech giants and startups typically spend their days fiercely battling each other for customers. Now they are turning their attention to a common enemy: the Indian city’s infernal traffic congestion.

Cross-town commutes that can take hours have inspired the Gridlock Hackathon, a contest initiated by Flipkart Online Services Pvt. for technology workers to find solutions to the snarled roads that cost the economy billions of dollars. While the prize totals a mere $5,500, it’s attracting teams from global giants Microsoft Corp., Google and Amazon.com Inc. to local startups including Ola.

The online contest is crowdsourcing solutions for Bangalore, a city of more than 10 million, as it grapples with inadequate roads, unprecedented growth and overpopulation. The technology industry began booming decades ago and with its base of talent, it continues to attract companies. Just last month, Intel Corp. said it would invest $178 million and add more workers to expand its R&D operations.

The ideas put forward at the hackathon range from using artificial intelligence and big data on traffic flows to true moonshots, such as flying cars.
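On the data-driven end of that range, one hackathon-style idea can be sketched in a few lines: mine hourly vehicle counts for recurring congestion windows. The counts and the threshold below are invented, and any real entry would work with far larger, messier data:

```python
# Toy sketch of a hackathon-style idea: mine hourly vehicle counts for
# recurring congestion windows. Counts and threshold are invented.

def congested_hours(hourly_counts, threshold):
    """hourly_counts: {hour: [counts over several days]}.
    Return hours whose average count exceeds threshold, worst first."""
    averages = {h: sum(c) / len(c) for h, c in hourly_counts.items()}
    return sorted((h for h, a in averages.items() if a > threshold),
                  key=lambda h: -averages[h])

# Hypothetical counts at one junction, three days per hour slot
counts = {
    8:  [950, 990, 1010],    # morning peak
    13: [400, 420, 390],
    18: [1100, 1080, 1150],  # evening peak
    23: [120, 90, 110],
}
print(congested_hours(counts, threshold=800))  # evening peak tops the list
```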

The gridlock remains a problem for a city dependent on its technology industry and seeking to attract new investment…(More)”.

Open data: Accountability and transparency


Paper at Big Data and Society: “The movements by national governments, funding agencies, universities, and research communities toward “open data” face many difficult challenges. In high-level visions of open data, researchers’ data and metadata practices are expected to be robust and structured. The integration of the internet into scientific institutions amplifies these expectations. When examined critically, however, the data and metadata practices of scholarly researchers often appear incomplete or deficient. The concepts of “accountability” and “transparency” provide insight into understanding these perceived gaps. Researchers’ primary accountabilities are related to meeting the expectations of research competency, not to external standards of data deposition or metadata creation. Likewise, making data open in a transparent way can involve a significant investment of time and resources with no obvious benefits. This paper uses differing notions of accountability and transparency to conceptualize “open data” as the result of ongoing achievements, not one-time acts….(More)”.

Justice in Algorithmic Robes


Editorial by Joseph Savirimuthu of a Special Issue of the International Review of Law, Computers & Technology: “The role and impact of algorithms have attracted considerable interest in the media. Their impact is already being reflected in adjustments made in a number of sectors – entertainment, travel, transport, cities and financial services. From an innovation point of view, algorithms enable new knowledge to be created and identify solutions to problems. The emergence of smart sensing technologies, 3D printing, automated systems and robotics is seamlessly being interwoven into discourses such as ‘the collaborative economy’, ‘governance by platforms’ and ‘empowerment’. Innovations such as body worn cameras, fitness trackers, 3D printing, smart meters, robotics and Big Data hold out the promise of a new algorithmic future. However, the shift in focus from natural and scarce resources towards information also makes individuals its objects, and the mediated construction of access and knowledge infrastructures now provides the conditions for harnessing value from data. The increasing role of algorithms in environments mediated by technology also coincides with growing inter-disciplinary scholarship voicing concerns about the vulnerability of the values we associate with fundamental freedoms and how these are being algorithmically reconfigured or dismantled in a systematic manner. The themed issue, Justice in Algorithmic Robes, is intended to initiate a dialogue on both the challenges and opportunities as digitalization ushers in a period of transformation that has no immediate parallels in terms of scale, speed and reach. The articles provide different perspectives on the transformation taking place in the digital environment. The contributors offer an inter-disciplinary view of how the digital economy is being invigorated and evaluate the regulatory responses – in particular, how these transformations interact with law.
The different spheres covered in Justice in Algorithmic Robes – the relations between the State and individuals, autonomous technology, designing human–computer interactions, infrastructures of trust, accountability in the age of Big Data, and health and wearables – not only reveal the problem of defining spheres of economic, political and social activity, but also highlight how these contexts evolve into structures for dominance, power and control. Re-imagining the role of law does not mean that technology is the problem but the central idea from the contributions is that how we critically interpret and construct Justice in Algorithmic Robes is probably the first step we must take, always mindful of the fact that law may actually reinforce power structures….(Full Issue)”.

Data and the City


Book edited by Rob Kitchin, Tracey P. Lauriault, and Gavin McArdle: “There is a long history of governments, businesses, science and citizens producing and utilizing data in order to monitor, regulate, profit from and make sense of the urban world. Recently, we have entered the age of big data, and now many aspects of everyday urban life are being captured as data and city management is mediated through data-driven technologies.

Data and the City is the first edited collection to provide an interdisciplinary analysis of how this new era of urban big data is reshaping how we come to know and govern cities, and the implications of such a transformation. This book looks at the creation of real-time cities and data-driven urbanism and considers the relationships at play. By taking a philosophical, political, practical and technical approach to urban data, the authors analyse the ways in which data is produced and framed within socio-technical systems. They then examine the constellation of existing and emerging urban data technologies. The volume concludes by considering the social and political ramifications of data-driven urbanism, questioning whom it serves and for what ends.

This book, the companion volume to 2016’s Code and the City, offers the first critical reflection on the relationship between data, data practices and the city, and how we come to know and understand cities through data. It will be crucial reading for those who wish to understand and conceptualize urban big data, data-driven urbanism and the development of smart cities….(More)”

Children and the Data Cycle: Rights And Ethics in a Big Data World


Gabrielle Berman and Kerry Albright at UNICEF: “In an era of increasing dependence on data science and big data, the voices of one set of major stakeholders – the world’s children and those who advocate on their behalf – have been largely absent. A recent paper estimates one in three global internet users is a child, yet there has been little rigorous debate or understanding of how to adapt traditional, offline ethical standards for research involving data collection from children, to a big data, online environment (Livingstone et al., 2015). This paper argues that due to the potential for severe, long-lasting and differential impacts on children, child rights need to be firmly integrated onto the agendas of global debates about ethics and data science. The authors outline their rationale for a greater focus on child rights and ethics in data science and suggest steps to move forward, focusing on the various actors within the data chain including data generators, collectors, analysts and end-users. It concludes by calling for a much stronger appreciation of the links between child rights, ethics and data science disciplines and for enhanced discourse between stakeholders in the data chain, and those responsible for upholding the rights of children, globally….(More)”.

Using Collaboration to Harness Big Data for Social Good


Jake Porway at SSIR: “These days, it’s hard to get away from the hype around “big data.” We read articles about how Silicon Valley is using data to drive everything from website traffic to autonomous cars. We hear speakers at social sector conferences talk about how nonprofits can maximize their impact by leveraging new sources of digital information like social media data, open data, and satellite imagery.

Braving this world can be challenging, we know. Creating a data-driven organization can require big changes in culture and process. Some nonprofits, like Crisis Text Line and Watsi, started off boldly by building their own data science teams. But for the many other organizations wondering how to best use data to advance their mission, we’ve found that one ingredient works better than all the software and tech that you can throw at a problem: collaboration.

As a nonprofit dedicated to applying data science for social good, DataKind has run more than 200 projects in collaboration with other nonprofits worldwide by connecting them to teams of volunteer data scientists. What do the most successful ones have in common? Strong collaborations on three levels: with data science experts, within the organization itself, and across the nonprofit sector as a whole.

1. Collaborate with data science experts to define your project. As we often say, finding problems can be harder than finding solutions. ….

2. Collaborate across your organization to “build with, not for.” Our projects follow the principles of human-centered design and the philosophy pioneered in the civic tech world of “design with, not for.” ….

3. Collaborate across your sector to move the needle. Many organizations think about building data science solutions for unique challenges they face, such as predicting the best location for their next field office. However, most of us are fighting common causes shared by many other groups….

By focusing on building strong collaborations on these three levels—with data experts, across your organization, and across your sector—you’ll go from merely talking about big data to making big impact….(More)”.