Crowdsourcing Expertise


Simons Foundation: “Ever wish there was a quick, easy way to connect your research to the public?

By hosting a Wikipedia ‘edit-a-thon’ at a science conference, you can instantly share your research knowledge with millions while improving the science content on the most heavily trafficked and broadly accessible resource in the world. In 2016, in partnership with the Wiki Education Foundation, we helped launch the Wikipedia Year of Science, an ambitious initiative designed to better connect the work of scientists and students to the public. Here, we share some of what we learned.

The Simons Foundation — through its Science Sandbox initiative, dedicated to public engagement — co-hosted a series of Wikipedia edit-a-thons throughout 2016 at almost every major science conference, in collaboration with the world’s leading scientific societies and associations.

At our edit-a-thons, we leveraged the collective brainpower of scientists, giving them basic training on Wikipedia guidelines and facilitating marathon editing sessions — powered by free pizza, coffee and sometimes beer — during which they made copious contributions within their respective areas of expertise.

These efforts, combined with the Wiki Education Foundation’s powerful classroom model, have had a clear impact. To date, we’ve reached over 150 universities and more than 6,000 students and scientists. As for output, 6,306 articles have been created or edited, garnering more than 304 million views; over 2,000 scientific images have been donated; and countless new scientist-editors have been minted, many of whom will likely continue to update Wikipedia content. The most common response we got from scientists and conference organizers about the edit-a-thons was: “Can we do that again next year?”

That’s where this guide comes in.

Through collaboration, input from Wikipedians and scientists, and more than a little trial and error, we arrived at a model that can help you organize your own edit-a-thons. This informal guide captures our main takeaways and lessons learned….Our hope is that edit-a-thons will become another integral part of science conferences, just like tweetups, communication workshops and other recent outreach initiatives. This would ensure that the content of the public’s most common gateway to science research will continually improve in quality and scope.

Download: “Crowdsourcing Expertise: A working guide for organizing Wikipedia edit-a-thons at science conferences”

Education startup helps refugees earn university degree


Springwise: “Berlin-based Kiron works with refugee students to put together an online course of study, rigorous enough to provide entry into a partner university’s second year of study. Using Massive Open Online Courses (MOOCs), Kiron helps students master their new country’s language while studying basic prerequisites for a chosen university degree. Already working with more than 1,500 students in Germany, Kiron recently expanded into France.

With less than one percent of all refugees able to access higher education, MOOCs help get new students to the necessary level of knowledge for in-person university study. Kiron also provides offline support, including study buddy programs and career guidance. Once a participant completes the two-year online program, he or she has the opportunity to enroll for free (as a second-year student) in a program at one of Kiron’s partner universities.

A number of projects are finding ways to use the talents of refugees to help them integrate into their country through knowledge-sharing and employment opportunities. Locals and refugees work together in this new Dutch ideas hub, and this French catering company hires refugee chefs….(More)”.

Thesis, antithesis and synthesis: A constructive direction for politics and policy after Brexit and Trump


Geoff Mulgan at Nesta: “In the heady days of 1989, with communism collapsing and the Cold War seemingly over, the political theorist Francis Fukuyama declared that we were witnessing the “end of history” which had culminated in the triumph of liberal democracy and the free market.

Fukuyama was drawing on the ideas of German philosopher Georg Hegel, but of course, history didn’t come to an end, and, as recent events have shown, the Cold War was just sleeping, not dead.

Now, following the political convulsions of 2016, we’re at a very different turning point, which many are trying to make sense of. I want to suggest that we can again usefully turn to Hegel, but this time to his idea that history evolves in dialectical ways, with successive phases of thesis, antithesis and synthesis.

This framework fits well with where we stand today.  The ‘thesis’ that has dominated mainstream politics for the last generation – and continues to be articulated shrilly by many proponents – is the claim that the combination of globalisation, technological progress and liberalisation empowers the great majority.

The antithesis, which, in part, fuelled the votes for Brexit and Trump, as well as the rise of populist parties and populist authoritarian leaders in Europe and beyond, is the argument that this technocratic combination merely empowers a minority and disempowers the majority of citizens.

A more progressive synthesis – which I will outline – then has to address the flaws of the thesis and the grievances of the antithesis, in fields ranging from education and health to democracy and migration, dealing head on with questions of power and its distribution: questions about who has power, and who feels powerful….(More)”

Open innovation in the public sector


Sabrina Diaz Rato in OpenDemocracy: “For some years now, we have been witnessing the emergence of relational, cross-over, participative power. This is the territory that gives technopolitics its meaning and prominence, the basis on which a new vision of democracy – more open, more direct, more interactive – is being developed and embraced. It is a framework that overcomes the closed architecture on which the praxis of governance (closed, hierarchical, one-way) has been cemented in almost all areas. The series The ecosystem of open democracy explores the different aspects of this ongoing transformation….

How can innovation contribute to building an open democracy? The answer is summed up in these twelve connectors of innovation.

  1. placing innovation and collective intelligence at the center of public management strategies,
  2. aligning all government areas with clearly-defined goals on associative platforms,
  3. shifting the frontiers of knowledge and action from the institutions to public deliberation on local challenges,
  4. establishing leadership roles, in a language that everyone can easily understand, to organize and plan the wealth of information coming out of citizens’ ideas and to engage those involved in the sustainability of the projects,
  5. mapping the ecosystem and establishing dynamic relations with internal and, particularly, external agents: the citizens,
  6. systematizing the accumulation of information and the creative processes, while communicating progress and giving feedback to the whole community,
  7. preparing society as a whole to experience a new form of governance of the common good,
  8. cooperating with universities, research centers and entrepreneurs in establishing reward mechanisms,
  9. aligning people, technologies, institutions and the narrative with the new urban habits, especially those related to environmental sustainability and public services,
  10. creating education and training programs in tune with the new skills of the 21st century,
  11. building incubation spaces for startups responding to local challenges,
  12. inviting venture capital to generate a satisfactory mix of open innovation, inclusive development policies and local productivity.

Two items in this list are probably the determining factors of any effective innovation process. The first has to do with the correct decision on the mechanisms through which we have pushed the boundaries outwards, so as to bring citizen ideas into the design and co-creation of solutions. This is not an easy task, because it requires a shared organizational mentality on previously non-existent patterns of cooperation, which must now be sustained through dialog and operational dynamics aimed at solving problems defined by external actors – not just any problem.

Another key aspect of the process, related to the breaking down of the institutional barriers that surround and condition action frameworks, is the revaluation of a central figure that we have not yet mentioned here: the policy makers. They are not exactly political leaders or public officials. They are not innovators either. They are the ones within Public Administration who possess highly valuable management skills and knowledge, but who are constantly colliding against the glittering institutional constellations that no longer work….(More)”

Big and open data are prompting a reform of scientific governance


Sabina Leonelli in Times Higher Education: “Big data are widely seen as a game-changer in scientific research, promising new and efficient ways to produce knowledge. And yet, large and diverse data collections are nothing new – they have long existed in fields such as meteorology, astronomy and natural history.

What, then, is all the fuss about? In my recent book, I argue that the true revolution is in the status accorded to data as research outputs in their own right. Along with this has come an emphasis on open data as crucial to excellent and reliable science.

Previously – ever since scientific journals emerged in the 17th century – data were private tools, owned by the scientists who produced them and scrutinised by a small circle of experts. Their usefulness lay in their function as evidence for a given hypothesis. This perception has shifted dramatically in the past decade. Increasingly, data are research components that can and should be made publicly available and usable.

Rather than the birth of a data-driven epistemology, we are thus witnessing the rise of a data-centric approach in which efforts to mobilise, integrate and visualise data become contributions to discovery, not just a by-product of hypothesis testing.

The rise of data-centrism highlights the challenges involved in gathering, classifying and interpreting data, and the concepts, technologies and social structures that surround these processes. This has implications for how research is conducted, organised, governed and assessed.

Data-centric science requires shifts in the rewards and incentives provided to those who produce, curate and analyse data. This challenges established hierarchies: laboratory technicians, librarians and database managers turn out to have crucial skills, subverting the common view of their jobs as marginal to knowledge production. Ideas of research excellence are also being challenged. Data management is increasingly recognised as crucial to the sustainability and impact of research, and national funders are moving away from citation counts and impact factors in evaluations.

New uses of data are forcing publishers to re-assess their business models and dissemination procedures, and research institutions are struggling to adapt their management and administration.

Data-centric science is emerging in concert with calls for increased openness in research….(More)”

Why Big Data Is a Big Deal for Cities


John M. Kamensky in Governing: “We hear a lot about “big data” and its potential value to government. But is it really fulfilling the high expectations that advocates have assigned to it? Is it really producing better public-sector decisions? It may be years before we have definitive answers to those questions, but new research suggests that it’s worth paying a lot of attention to.

University of Kansas Prof. Alfred Ho recently surveyed 65 mid-size and large cities to learn what is going on, on the front line, with the use of big data in making decisions. He found that big data has made it possible to “change the time span of a decision-making cycle by allowing real-time analysis of data to instantly inform decision-making.” This decision-making occurs in areas as diverse as program management, strategic planning, budgeting, performance reporting and citizen engagement.

Cities are natural repositories of big data that can be integrated and analyzed for policy- and program-management purposes. These repositories include data from public safety, education, health and social services, environment and energy, culture and recreation, and community and business development. They include both structured data, such as financial and tax transactions, and unstructured data, such as recorded sounds from gunshots and videos of pedestrian movement patterns. And they include data supplied by the public, such as the Boston residents who use a phone app to measure road quality and report problems.

These data repositories, Ho writes, are “fundamental building blocks,” but the challenge is to shift the ownership of data from separate departments to an integrated platform where the data can be shared.

There’s plenty of evidence that cities are moving in that direction and that they already are systematically using big data to make operational decisions. Among the 65 cities that Ho examined, he found that 49 have “some form of data analytics initiatives or projects” and that 30 have established “a multi-departmental team structure to do strategic planning for these data initiatives.”….The effective use of big data can lead to dialogs that cut across school-district, city, county, business and nonprofit-sector boundaries. But more importantly, it provides city leaders with the capacity to respond to citizens’ concerns more quickly and effectively….(More)”

Mapping open data governance models: Who makes decisions about government data and how?


Ana Brandusescu, Danny Lämmerhirt and Stefaan Verhulst call for a systematic and comparative investigation of the different governance models for open data policy and publication….

“An important value proposition behind open data involves increased transparency and accountability of governance. Yet little is known about how open data itself is governed. Who decides and how? How accountable are data holders to both the demand side and policy makers? How do data producers and actors assure the quality of government data? Who, if any, are data stewards within government tasked to make its data open?

Getting a better understanding of open data governance is not only important from an accountability point of view. If there is better insight into the diversity of decision-making models and structures across countries, the implementation of common open data principles, such as those advocated by the International Open Data Charter, can be accelerated across countries.

In what follows, we seek to develop the initial contours of a research agenda on open data governance models. We start from the premise that different countries have different models to govern and administer their activities – in short, different ‘governance models’. Some countries are more devolved in their decision making, while others seek to organize “public administration” activities more centrally. These governance models clearly impact how open data is governed – producing a broad patchwork of open data governance arrangements across the world and making it difficult to identify who the open data decision makers and data gatekeepers or stewards are within a given country.

For example, if one wants to accelerate the opening up of education data across borders, in some countries this may fall under the authority of sub-national government (such as states, provinces, territories or even cities), while in other countries education is governed by central government or implemented through public-private partnership arrangements. Similarly, transportation or water data may be privatised, while in other cases it may be the responsibility of municipal or regional government. Responsibilities are therefore often distributed across administrative levels and agencies, affecting how (open) government data is produced and published….(More)”

Information for accountability: Transparency and citizen engagement for improved service delivery in education systems


Lindsay Read and Tamar Manuelyan Atinc at Brookings: “There is a wide consensus among policymakers and practitioners that while access to education has improved significantly for many children in low- and middle-income countries, learning has not kept pace. A large amount of research that has attempted to pinpoint the reasons behind this quality deficit in education has revealed that providing extra resources such as textbooks, learning materials, and infrastructure is largely ineffective in improving learning outcomes at the system level without accompanying changes to the underlying structures of education service delivery and associated systems of accountability.

Information is a key building block of a wide range of strategies that attempt to tackle weaknesses in service delivery and accountability at the school level, even where political systems disappoint at the national level. The dissemination of more and better quality information is expected to empower parents and communities to make better decisions in terms of their children’s schooling and to put pressure on school administrators and public officials to make changes that improve learning and learning environments. This theory of change underpins both social accountability and open data initiatives, which are designed to use information to enhance accountability and thereby influence education delivery.

This report seeks to extract insight into the nuanced relationship between information and accountability, drawing upon a vast literature on bottom-up efforts to improve service delivery, increase citizen engagement, and promote transparency, as well as case studies in Australia, Moldova, Pakistan, and the Philippines. In an effort to clarify processes and mechanisms behind information-based reforms in the education sector, this report also categorizes and evaluates recent impact evaluations according to the intensity of interventions and their target change agents—parents, teachers, school principals, and local officials. The idea here is not just to help clarify what works but why reforms work (or do not)….(More)”

How States Engage in Evidence-Based Policymaking


The Pew Charitable Trusts: “Evidence-based policymaking is the systematic use of findings from program evaluations and outcome analyses (“evidence”) to guide government policy and funding decisions. By focusing limited resources on public services and programs that have been shown to produce positive results, governments can expand their investments in more cost-effective options, consider reducing funding for ineffective programs, and improve the outcomes of services funded by taxpayer dollars.

While the term “evidence-based policymaking” is growing in popularity in state capitols, there is limited information about the extent to which states employ the approach. This report seeks to address this gap by: 1) identifying six distinct actions that states can use to incorporate research findings into their decisions, 2) assessing the prevalence and level of these actions within four human service policy areas across 50 states and the District of Columbia, and 3) categorizing each state based on the final results….

Although many states are embracing evidence-based policymaking, leaders often face challenges in embedding this approach into the decision-making process of state and local governments. This report identifies how staff and stakeholder education, strong data infrastructure, and analytical and technical capacity can help leaders build and sustain support for this work and achieve better outcomes for their communities.


Crowdsourcing Medical Data Through Gaming


Felix Morgan in The Austin Chronicle: “Video games have changed the way we play, but they also have the potential to change the way we research and solve problems, in fields such as health care and education. One game that’s made waves in medical research is Sea Hero Quest. This smartphone game has created a groundbreaking approach to data collection, leading to an earlier diagnosis of dementia. So far, 2.5 million people have played the game, providing scientists with years’ worth of data across borders and demographics.

By offering this game as a free mobile app, researchers are overcoming the ever-present problems of small sample sizes and time-consuming data gathering in empirical research. Sea Hero Quest was created by Glitchers, partnering with University College London, the University of East Anglia, and Alzheimer’s Research UK. As players navigate mazes, shoot flares into baskets, and photograph sea creatures, they answer simple demographic questions and generate rich data sets.

“The idea of crowdsourced data-gathering games for research is a new and exciting method of obtaining data that would be prohibitively expensive otherwise,” says Paul Toprac, who, along with his colleague Matt O’Hair, runs the Simulation and Game Applications (SAGA) Lab at the University of Texas at Austin. Their team helps researchers across campus and in the private sector design, implement, and find funding for video game-based research.

O’Hair sees a lot of potential for Sea Hero Quest and other research-based games. “One of the greatest parts about the SAGA Lab is that we get to help researchers make strides in these kinds of fields,” he says.

The idea of using crowdsourcing for data collection is relatively new, but using gaming for research is something that has been well established. Last year at SXSW, Nolan Bushnell, the founder of Atari, made a statement that video games were the key to understanding and treating dementia and related issues, which certainly seems possible based on the preliminary results from Sea Hero Quest. “We have had about 35 years of research using games as a medium,” Toprac says. “However, only recently have we used games as a tool for explicit data gathering.”…(More)”