Global innovations in measurement and evaluation


Report by Andrew Weston, Anne Kazimirski, Anoushka Kenley, Rosie McLeod and Ruth Gripper: “Measurement and evaluation is core to good impact practice. It helps us understand what works, how it works and how we can achieve more. Good measurement and evaluation involves reflective, creative, and proportionate approaches. It makes the most of existing theoretical frameworks as well as new digital solutions, and focuses on learning and improving. We researched the latest changes in theory and practice based on both new and older, renascent ideas. We spoke to leading evaluation experts from around the world, to ask what’s exciting them, what people are talking about and what is most likely to make a long-lasting contribution to evaluation. And we found that new thinking, techniques, and technology are influencing and improving practice.

Technology is enabling us to gather different types of data on bigger scales, helping us gain insights or spot patterns we could not see before. Advances in systems to capture, manage and share sensitive data are helping organisations that want to work collaboratively, while moves towards open data are providing better access to data that can be linked together to generate even greater insight. Traditional models of evaluating a project once it has finished are being overtaken by methods that feed more dynamically into service design. We are learning from the private sector, where real-time feedback shapes business decisions on an ongoing basis, asking: ‘is it working?’ instead of ‘did it work?’.

And approaches that focus on assessing not just if something works but how and why, for whom, and under what conditions are also generating more insight into the effectiveness of programmes. Technology may be driving many of the innovations we highlight here, but some of the most exciting developments are happening because of changes in the ideologies and cultures that inform our approach to solving big problems. This is resulting in an increased focus on listening to and involving users, and on achieving change at a systemic level—with technology simply facilitating these changes.

Some of the pressures that compel measurement and evaluation activity remain misguided. For example, there can be too big a focus on obtaining a cost-benefit ratio—regardless of the quality of the data it is based on—and not enough encouragement from funders for charities to learn from their evaluation activity. Even the positive developments have their pitfalls: new technologies pose new data protection risks, ethical hazards, and the possibility of exclusion if participation requires high levels of technical ability. It is important that, as the field develops and capabilities increase, we remain focused on achieving best practice.

This report highlights the developments that we think have the greatest potential to improve evaluation and programme design, and the careful collection and use of data. We want to celebrate what is possible, and encourage wider application of these ideas.

Choosing the innovations

In deciding which trends to include in this report, we considered how different approaches contributed to better evaluation by:

  • overcoming previous barriers to good evaluation practice, eg, through new technologies or skills;
  • providing more meaningful or robust data;
  • using data to support decision-making, learning and improving practice;
  • increasing equality between users, service deliverers and funders; and
  • offering new contexts for collaboration that improve the utility of data.

… Eight key trends emerged from our research that we think are most exciting, relevant, and likely to make a long-lasting contribution. Some of these are driven by cutting-edge technology; others reflect the growing application of ideas that push practice beyond ‘traditional’ models of evaluation. User-centric and shared approaches are leading to better-informed measurement and evaluation design. Theory-based evaluation and impact management embolden us to ask better research questions and obtain more useful answers. Data linkage, the availability of big data, and the possibilities arising from remote sensing are increasing the number of questions we can answer. And data visualisation opens the door to better understanding and communication of this data. Here we present each of these eight innovations and showcase examples of how organisations are using them to better understand and improve their work….(More)”

Digital transformation’s people problem


Jen Kelchner at Opensource.com: …Arguably, the greatest chasm we see in our organizational work today is the actual transformation before, during, or after the implementation of a digital technology—because technology invariably crosses through and impacts people, processes, and culture. What are we transforming from? What are we transforming into? These are “people issues” as much as they are “technology issues,” but we too rarely acknowledge this.

Operating our organizations on open principles promises to spark new ways of thinking that can help us address this gap. Over the course of this three-part series, we’ll take a look at how the foundational principles of open play a major role in addressing the “people part” of digital transformation—and closing that gap before and during implementations.

The impact of digital transformation

The meaning of the term “digital transformation” has changed considerably in the last decade. For example, if you look at where organizations were in 2007, you’d watch them grapple with the first iPhone. The focus then was more on search engines, data mining, and methods of virtual collaboration.

A decade later in 2017, however, we’re investing in artificial intelligence, machine learning, and the Internet of Things. Our technologies have matured—but our organizational and cultural structures have not kept pace with them.

Value Co-creation In The Organizations of the Future, a recent research report from Aalto University, states that digital transformation has created opportunities to revolutionize and change existing business models, socioeconomic structures, legal and policy measures, organizational patterns, and cultural barriers. But we can only realize this potential if we address both the technological and the organizational aspects of digital transformation.

Four critical areas of digital transformation

Let’s examine four crucial elements involved in any digital transformation effort:

  • change management
  • the needs of the ecosystem
  • processes
  • silos

Any organization must address these four elements, ideally in advance of, or at least in conjunction with, the implementation of a new technology, if it is going to realize success and sustainability….(More)”.

Innovation@DFID: Crowdsourcing New Ideas at the UK’s Department for International Development


Paper by Anke Schwittay and Paul Braund: “Over the last decade, traditional development institutions have joined market-based actors in embracing inclusive innovation to ensure the sector’s relevance and impacts. In 2014, the UK’s Department for International Development’s (DFID) Innovation Hub launched Amplify as its own flagship initiative. The programme, which is managed by IDEO, a Silicon Valley-based design consultancy, aims to crowdsource new ideas to various development challenges from a broad and diverse group of actors, including poor people themselves. By examining the direction, diversity and distribution of Amplify’s work, we argue that while development innovation can generate more inclusive practices, its transformative potential is constrained by broader developmental logics and policy regimes….(More)”

We have unrealistic expectations of a tech-driven future utopia


Bob O’Donnell in RECODE: “No one likes to think about limits, especially in the tech industry, where the idea of putting constraints on almost anything is perceived as anathema.

In fact, the entire tech industry is arguably built on the concept of bursting through limitations and enabling things that weren’t possible before. New technology developments have clearly created incredible new capabilities and opportunities, and have generally helped improve the world around us.

But there does come a point — and I think we’ve arrived there — where it’s worth stepping back to both think about and talk about the potential value of, yes, technology limits … on several different levels.

On a technical level, we’ve reached a point where advances in computing applications like AI, or medical applications like gene splicing, are raising even more ethical questions than practical ones on issues such as how they work and for what applications they might be used. Not surprisingly, there aren’t any clear or easy answers to these questions, and it’s going to take a lot more time and thought to create frameworks or guidelines for both the appropriate and inappropriate uses of these potentially life-changing technologies.

Does this mean these kinds of technological advances should be stopped? Of course not. But having more discourse on the types of technologies that get created and released certainly needs to happen.

Even on a practical level, the need for limiting people’s expectations about what a technology can or cannot do is becoming increasingly important. With science-fiction-like advances becoming daily occurrences, it’s easy to fall into the trap of believing that there are no limits to what a given technology can do. As a result, people are increasingly willing to believe and accept almost any kind of statements or predictions about the future of many increasingly well-known technologies, from autonomous driving to VR to AI and machine learning. I hate to say it, but it’s the fake news of tech.

Just as we’ve seen the fallout from fake news on all sides of the political perspective, so, too, are we starting to see that unbridled and unlimited expectations for certain new technologies are starting to have negative implications of their own. Essentially, we’re starting to build unrealistic expectations for a tech-driven nirvana that doesn’t clearly jibe with the realities of the modern world, particularly in the time frames that are often discussed….(More)”.

Crowdsourcing Expertise to Increase Congressional Capacity


Austin Seaborn at Beeck Center: “Members of Congress have close connections with their districts, and information arising from local organizations, such as professional groups, academia, and industry, as well as from constituents with relevant expertise (like retirees, veterans or students), is highly valuable to them. Today, congressional staff capacity is at a historic low, while at the same time, constituents in districts are often well equipped to address the underlying policy questions that Congress seeks to solve….

In meetings we have had with House and Senate staffers, they repeatedly express both the difficulty of managing their substantial area-specific workloads and their interest in finding ways to substantively engage constituents to find good nuggets of information to help them in their roles as policymakers. At the same time, constituents are demanding more transparency and dialogue from their elected representatives. In many cases, our project brings these two together. It allows Members to tap the expertise in their districts while at the same time creating an avenue for constituents to contribute their knowledge and area expertise to the legislative process. It’s a win for constituents and a win for Members of Congress and their staffs.

It is important to note that the United States lags behind other democracies in experimenting with more inclusive methods during the policymaking process. In the United Kingdom, for example, the UK Parliament has experimented with a variety of new digital tools to engage with constituents. These methods range from Twitter hashtags, which are now quite common given the rise in social media use by governments and elected officials, to web forums on a variety of platforms. Since June 2015, they have also been holding digital debates, where questions from the general public are crowdsourced and later integrated into a parliamentary debate by the Member of Parliament leading the debate. Estonia, South Africa, Taiwan, and France also offer notable examples.

One promising new development we hope to explore more thoroughly is the U.S. Library of Congress’s recently announced legislative data App Challenge. This competition is distinct from the many hackathons that have been held on behalf of Congress in the past, in that this challenge seeks new methods not only to innovate, but also to integrate and legislate. In his announcement, the Library’s Chief Information Officer, Bernard A. Barton, Jr., stated, “An informed citizenry is better able to participate in our democracy, and this is a very real opportunity to contribute to a better understanding of the work being done in Washington.  It may even provide insights for the people doing the work around the clock, both on the Hill, and in state and district offices.  Your innovation and integration may ultimately benefit the way our elected officials legislate for our future.” We believe these sorts of new methods will play a crucial role in the future of engaging citizens in their democracies….(More)”.

Modernizing government’s approach to transportation and land use data: Challenges and opportunities


Adie Tomer and Ranjitha Shivaram at Brookings: “In the fields of transportation and land use planning, the public sector has long taken the leading role in the collection, analysis, and dissemination of data. Often, public data sets drawn from traveler diaries, surveys, and supply-side transportation maps were the only way to understand how people move around in the built environment – how they get to work, how they drop kids off at school, where they choose to work out or relax, and so on.

But change is afoot: today, there are not only new data providers, but also new types of data. Cellphones, GPS trackers, and other navigation devices offer real-time demand-side data. For instance, mobile phone data can point to where distracted driving is a problem and help implement measures to deter such behavior. Insurance data and geo-located police data can guide traffic safety improvements, especially in accident-prone zones. Geotagged photo data can illustrate the use of popular public spaces by locals and tourists alike, enabling greater return on investment from public spaces. Data from exercise apps like Fitbit and Runkeeper can help identify recreational hot spots that attract people and those that don’t.

However, integrating all this data into how we actually plan and build communities—including the transportation systems that move all of us and our goods—will not be easy. There are several core challenges. Limited staff capacity and restricted budgets in public agencies can slow adoption. Governmental procurement policies are stuck in an analog era. Privacy concerns introduce risk and uncertainty. Private data could be simply unavailable to public consumers. And even if governments could acquire all of the new data and analytics that interest them, their planning and investment models must be updated to fully utilize these new resources.

Using a mix of primary research and expert interviews, this report catalogs emerging data sets related to transportation and land use, and assesses the ease with which they can be integrated into how public agencies manage the built environment. It finds that there is reason for the hype; we have the ability to know more about how humans move around today than at any time in history. But despite all the obvious opportunities, a failure to address core challenges will limit public agencies’ ability to put all that data to use for the collective good….(More)”

From binoculars to big data: Citizen scientists use emerging technology in the wild


Interview by Rebecca Kondos: “For years, citizen scientists have trekked through local fields, rivers, and forests to observe, measure, and report on species and habitats with notebooks, binoculars, butterfly nets, and cameras in hand. It’s a slow process, and the gathered data isn’t easily shared. It’s a system that has worked to some degree, but one that’s in need of a technology and methodology overhaul.

Thanks to the team behind Wildme.org and their Wildbook software, both citizen and professional scientists are becoming active participants in using AI, computer vision, and big data. Wildbook is working to transform the data collection process, and citizen scientists who use the software have more transparency into conservation research and the impact it’s making. As a result, engagement levels have increased; scientists can more easily share their work; and, most important, endangered species like the whale shark benefit.

In this interview, Colin Kingen, a software engineer for Wildbook (with assistance from his colleagues Jason Holmberg and Jon Van Oast), discusses Wildbook’s work, explains classic problems in field observation science, and shares how Wildbook is working to solve some of the big problems that have plagued wildlife research. He also addresses something I’ve wondered about: why isn’t there an “uberdatabase” to share the work of scientists across all global efforts? The work Kingen and his team are doing exemplifies what can be accomplished when computer scientists with big hearts apply their talents to saving wildlife….(More)”.

Government initiative offers Ghanaians chance for greater participation


Springwise: “Openness and transparency are key ingredients in building an accountable and effective democratic government. An “open” government is transparent, accessible to anyone, anytime, anywhere, and responsive to new ideas and demands. The key to this is providing all citizens with access to accurate data. However, in many countries, a low rate of citizen participation and involvement has led to poor accountability from government officials. In Ghana, a new project, TransGov, is developing innovative tools to foster participation in local governance by marginalised groups, and to improve government accountability to those who need it most.

TransGov’s research found that many Ghanaians were not aware of the status of local development projects, and this led to a general public apathy, where people felt they had no influence on getting the government to work for them. TransGov created a platform to enhance information disclosure and dissemination, and to create ways for citizens to engage with the local leaders in their communities. The TransGov platform allows all citizens to track the progress of government projects in their area and to publish information about those projects. TransGov spans four integrated platforms: a website, a mobile app, interactive voice response (IVR) technology, and SMS, allowing the participation of people from a wide range of socio-economic backgrounds.

The organization has recently partnered with the government-sponsored Ghana Open Data Initiative to share resources, tools, and research, and to hold workshops and seminars. This aims to strengthen various government agencies’ capacity to collect and manage data for public use. The hope is that making this information more accessible will help create more business opportunities and drive innovation, as well as increase democratic participation. We have seen this in educational radio broadcasts in Cairo subways and an app that allows citizen feedback on city development….(More)”.

How Africa’s Data Revolution Can Deliver Sustainable Development Outcomes


Donald Mogeni at Huffington Post: “…As a demonstration of this political will, several governments in Africa are blazing the trail in numerous ways. For instance, the Government of Senegal now considers investment in data as important as investment in physical infrastructure such as roads. In Ghana and Sierra Leone, more policy-makers and legislators are now using data to inform their work and to make planning continuously evidence-based.

Despite the progressive developments, several cautionary statements are worth noting. Firstly, data is not a silver bullet for addressing present development challenges and problems. To be transformative, the use of data and evidence must include political agency and citizen mobilization. Thus, while data may highlight important development cleavages, it may not guarantee change if not used appropriately within the various political contexts. ‘Everyone Counts’, a new global initiative by CARE, KWANTU and World Vision (that was also showcased in the meeting) seeks to contribute to this agenda.

Secondly, there is a need for data ‘experts’ to move beyond the chronic obsession with big numbers to ensure greater inclusion of marginalised and vulnerable segments of the population. Achieving this will require a ‘business unusual’ approach that devises better data collection methodologies and technologies capable of collecting more, and better, data than ever before. This ‘new’ data should then be used together with administrative and open data to ensure that ‘no one is left behind’.

Thirdly, the utility of citizen-generated data is still contentious—especially within state institutions. Increasing the value of this data must therefore involve standardizing data collection tools and methodologies across the board (to the extent possible), providing for ethical approvals, subjecting this data to quality audits and triangulation, and adhering to quality assurance standards.

Fourthly, the emergence of various data communities within African countries has made the roles of National Statistical Offices in the data ecosystem even more crucial. However, significant capacity and technical disparities exist between the various National Statistical Offices (NSOs) in Africa. To realise the potential of data and statistics in achieving sustainable development outcomes, the financial and human capacities of these institutions must be enhanced….(More)”.

Formalised data citation practices would encourage more authors to make their data available for reuse


Hyoungjoo Park and Dietmar Wolfram at the LSE Impact Blog: “Today’s researchers work in a heavily data-intensive and collaborative environment in order to further scientific discovery across and within fields. It is becoming routine for researchers (i.e. authors and data publishers) to submit their research data, such as datasets, biological samples in biomedical fields, and computer code, as supplementary information in order to comply with data sharing requirements of major funding agencies, high-profile journals, and data journals. This is part of open science, where data and any publication products are expected to be made available to anyone interested.

Given that researchers benefit from publicly shared data through data reuse in their own research, researchers who provide access to data should be acknowledged for their contributions, much in the same way that authors are recognised for their research publications through citation. Researchers who use shared data or other shared research products (e.g. open access software, tissue cultures) should also acknowledge the providers of these resources through formal citation. At present, data citation is not widely practised in most disciplines and as an object of study remains largely overlooked….

We found that data citations appear in the references section of an article less frequently than in the main text, making it difficult to identify the reward and credit for data authors (i.e. data sharers). Consistent data citation formats could not be found. Current data citation practices do not (yet) benefit data sharers. Also, data citation was sometimes located in the supplementary information, outside of the references. Data that had been reused was often not acknowledged in the reference lists, but was rather hidden in the representation of data (e.g. tables, figures, images, graphs, and other elements), which may be a consequence of the fact that data citation practices are not yet common in scholarly communications.
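To make the idea of a consistent format concrete, here is a minimal sketch (in Python; the function name, field names, and the example record are all hypothetical illustrations, not tied to any particular repository) of assembling a DataCite-style dataset citation from the elements such formats typically require: creators, year, title, repository, and a persistent identifier:

```python
# Minimal illustrative sketch of a DataCite-style dataset citation.
# All names and the example record below are hypothetical.

def format_data_citation(creators, year, title, repository, doi, version=None):
    """Return a human-readable, DataCite-style citation for a shared dataset."""
    authors = "; ".join(creators)                      # e.g. "Park, H.; Wolfram, D."
    version_part = f" Version {version}." if version else ""
    return (f"{authors} ({year}). {title} [Data set]."
            f"{version_part} {repository}. https://doi.org/{doi}")

citation = format_data_citation(
    creators=["Park, H.", "Wolfram, D."],
    year=2017,
    title="Data citation practices corpus",
    repository="Example Data Repository",
    doi="10.1234/example.5678",
)
print(citation)
```

If such a string appeared in the references section, rather than buried in a table caption or supplementary file, the data sharers it names would accrue the same kind of countable credit that article authors already receive.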

Ongoing challenges remain in identifying and documenting data citation. First, the practice of informal data citation presents a challenge for accurately documenting data citation. …

Second, data recitation by one or more co-authors of earlier studies (i.e. self-citation) is common, which reduces the broader impact of data sharing by limiting much of the reuse to the original authors…

Third, currently indexed data citations may not include rapidly advancing areas, such as in the hard sciences or computer engineering, because approximately 90% of indexed works were associated with journal articles…

Fourth, the number of authors associated with shared datasets raises questions of the ownership of and responsibility for a collective work, although some journals require one author to be responsible for the data used in the study…(More). (See also An examination of research data sharing and re-use: implications for data citation practice, published in Scientometrics)