Deliberation and Development
Book by Patrick Heller and Vijayendra Rao for the World Bank: “Deliberation is the process by which a group of people, each with equal voice, can – via a process of discussion and debate – reach an agreement. This book attempts to do two things. First, it rethinks the role of deliberation in development and shows that it has potential well beyond a narrow focus on participatory projects. Deliberation, if properly instituted, has the potential to have a transformative effect on many if not all aspects of development, especially in addressing problems of collective action, coordination, and entrenched inequality. This has broad implications at both the global and local levels. Second, the book demonstrates that taking deliberation seriously calls for a different approach to both research and policy design and requires a much greater emphasis on the processes by which decisions are made, rather than an exclusive focus on outcomes. Deliberation and Development contributes to a broader literature on the role of communicative processes in development….(More)”
Crowdsourcing: a survey of applications
This paper provides an introduction to crowdsourcing, guidelines for using it, and an overview of its applications in various fields. It concludes by drawing lessons from those applications….(More)”.
Setting High and Compatible Standards
Laura Bacon at Omidyar Network: “…Standards enable interoperability, replicability, and efficiency. Airplane travel would be chaotic at best and deadly at worst if flights and air traffic control did not use common codes for call signs, flight numbers, location, date, and time. Trains that cross national borders need tracks built to a standard gauge as evidenced by Spain’s experience in making its trains interoperable with the rest of the continent’s.
Standards matter in data collection and publication as well. This is especially true for those datasets that matter most to people’s lives, such as health, education, agriculture, and water. Disparate standards for basic category definitions like geography and organizations mean that data sources cannot be easily or cost-effectively analyzed for cross-comparison and decision making.
Compatible data standards that enable data to be ‘joined up’ would make it easier to log and use immunization records, control the spread of infectious disease, help educators prioritize spending based on the greatest needs, and identify the beneficial owners of companies to help ensure transparent and legal business transactions.
Data: More Valuable When Joined Up
Lots of effort, time, and money are poured into the generation and publication of open data. And while open data is valuable in itself, the biggest return on investment potentially comes from the inter-linkages among datasets. However, this return is difficult to realize because the standards and building blocks that would enable data to be joined up (e.g., geodata, organizational identifiers, project identifiers) are still largely missing.
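To make the “joining up” idea concrete, here is a minimal, hypothetical sketch in Python; all dataset fields, identifier values, and figures are invented for illustration, and the point is only that two datasets can be merged cheaply and reliably when they share a common building block such as a standard organizational identifier.

```python
# Hypothetical illustration: joining two open datasets on a shared
# organizational identifier (all field names, values, and figures are invented).

# Aid-spending records published by one organization
spending = [
    {"org_id": "GB-GOV-1", "project": "Immunization outreach", "usd": 250_000},
    {"org_id": "NL-KVK-41198677", "project": "Water points", "usd": 90_000},
]

# Results data published by a different organization, using the SAME
# identifier scheme for organizations
results = [
    {"org_id": "GB-GOV-1", "children_vaccinated": 12_000},
    {"org_id": "NL-KVK-41198677", "water_points_built": 45},
]

# Because both datasets use one identifier standard, the join is a simple
# dictionary lookup; with disparate identifiers it would require costly
# manual or fuzzy matching on free-text organization names.
results_by_org = {r["org_id"]: r for r in results}
joined = [{**s, **results_by_org.get(s["org_id"], {})} for s in spending]

for row in joined:
    print(row)
```

Without the shared identifier scheme, the same join would fall back on matching free-text names, which is exactly the costly, error-prone analysis the excerpt says disparate standards impose.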
Omidyar Network currently supports open data standards for contracting, extractives, budgets, and other areas. If “joining up” work is not considered and executed at an early stage, these standards 1) could evolve in silos and 2) may never reach their full potential.
Interoperability will not happen automatically; specific investments and efforts must be made to develop the public good infrastructure for the joining up of key datasets….The two organizations leading this project have an impressive track record working in this area. Development Initiatives is a global organization working to empower people to make more effective use of information. In 2013, it commissioned Open Knowledge Foundation to publish a cross-initiative scoping study, Joined-Up Data: Building Blocks for Common Standards, which recommended focus areas, shared learning, and the adoption of joined-up data and common standards for all publishers. Partnering with Development Initiatives is Publish What You Fund,…(More)”
From Governmental Open Data Toward Governmental Open Innovation (GOI)
Chapter by Daniele Archibugi et al. in The Handbook of Global Science, Technology, and Innovation: “Today, governments release governmental data that were previously hidden from the public. This democratization of governmental open data (OD) aims to increase transparency but also fuels innovation. Indeed, the release of governmental OD is a global trend, which has evolved into governmental open innovation (GOI). In GOI, governmental actors purposively manage knowledge flows that span organizational boundaries and reveal innovation-related knowledge to the public, with the aim of spurring innovation for greater economic and social welfare at the regional, national, or global scale. GOI subsumes different revealing strategies, namely governmental OD, problem revealing, and solution revealing. This chapter introduces the concept of GOI, which has evolved from global OD efforts. It presents a historical analysis of the emergence of GOI in four different continents, namely Europe (UK and Denmark), North America (United States and Mexico), Australia, and Asia (China), to highlight the emergence of GOI at a global scale….(More)”
Data, Human Rights & Human Security
Paper by Mark Latonero and Zachary Gold: “In today’s global digital ecosystem, mobile phone cameras can document and distribute images of physical violence. Drones and satellites can assess disasters from afar. Big data collected from social media can provide real-time awareness about political protests. Yet practitioners, researchers, and policymakers face unique challenges and opportunities when assessing technological benefit, risk, and harm. How can these technologies be used responsibly to assist those in need, prevent abuse, and protect people from harm?”
Mark Latonero and Zachary Gold address these issues in this primer for technologists, academics, businesses, governments, NGOs, and intergovernmental organizations — anyone interested in the future of human rights and human security in a data-saturated world….(Download PDF)”
Who knew contracts could be so interesting?
Steve Goodrich at Transparency International UK: “…Despite the UK Government’s lack of progress, it wouldn’t be completely unreasonable to ask “who actually publishes these things, anyway?” Well, back in 2011, when the UK Government committed to publish all new contracts and tenders over £10,000 in value, the Slovakian Government decided to publish more or less everything. Faced with mass protests over corruption in the public sector, the government committed to publishing almost all public sector contracts online (there are some exemptions). You can now browse through the details of a significant amount of government business via the country’s online portal (so long as you can read Slovak, of course).
Who actually reads these things?
According to research by Transparency International Slovakia, at least 11% of the Slovakian adult population have looked at a government contract since they were first published back in 2011. That’s around 480,000 people. Although some spent more time than others browsing the documents in depth, this is undeniably an astounding number of people taking at least a passing interest in government procurement.
Why does this matter?
Before Slovakia opened up its contracts there was widespread mistrust in public institutions and officials. According to Transparency International’s global Corruption Perceptions Index, which measures perceptions of public sector corruption, Slovakia was ranked 66th out of 183 countries in 2011. By 2014 it had jumped 12 places – a record achievement – to 54th, which must in some part be due to the Government’s commitment to opening up public contracts to greater scrutiny.
Since the contracts were published, there also seems to have been a spike in media reports on government tenders. This suggests there is greater scrutiny of public spending, which should hopefully translate into less wasted expenditure.
Elsewhere, proponents of open contracting have espoused other benefits, such as greater commitment by both parties to following the agreement and protecting against malign private interests. Similar projects in Georgia have also turned clunky bureaucracies into efficient, data-savvy administrations. In short, there are quite a few reasons why more openness in public sector procurement is a good thing.
Despite these benefits, opponents cite a number of downsides, including the administrative costs of publishing contracts online and issues surrounding commercially sensitive information. However, TI Slovakia’s research suggests the former is minimal – and presumably preferable to rooting around through paper mountains every time a Freedom of Information (FOI) request is received about a contract – whilst the latter already has to be disclosed under the FOI Act except in particular circumstances…(More)”
Open Innovation, Open Science, Open to the World
Speech by Carlos Moedas, EU Commissioner for Research, Science and Innovation: “On 25 April this year, an earthquake of magnitude 7.8 hit Nepal. To get real-time geographical information, the response teams used an online mapping tool called Open Street Map. Open Street Map has created an entire online map of the world using local knowledge, GPS tracks and donated sources, all provided on a voluntary basis. It is openly licensed for any use.
Open Street Map was created in 2004 by a 24-year-old computer science student at University College London; it now has 2 million users and has been used for many digital humanitarian and commercial purposes, from the earthquakes in Haiti and Nepal to the Ebola outbreak in West Africa.
This story is one of many that demonstrate that we are moving into a world of open innovation and user innovation. A world where the digital and physical are coming together. A world where new knowledge is created through global collaborations involving thousands of people from across the world and from all walks of life.
Ladies and gentlemen, over the next two days I would like us to chart a new path for European research and innovation policy. A new strategy that is fit for purpose for a world that is open, digital and global. And I would like to set out at the start of this important conference my own ambitions for the coming years….
Open innovation is about involving far more actors in the innovation process, from researchers, to entrepreneurs, to users, to governments and civil society. We need open innovation to capitalise on the results of European research and innovation. This means creating the right ecosystems, increasing investment, and bringing more companies and regions into the knowledge economy. I would like to go further and faster towards open innovation….
I am convinced that excellent science is the foundation of future prosperity, and that openness is the key to excellence. We are often told that it takes many decades for scientific breakthroughs to find commercial application.
Let me tell you a story which shows the opposite. Graphene was first isolated in the laboratory by Profs. Geim and Novoselov at the University of Manchester in 2004 (Nobel Prize 2010). The development of graphene has since benefitted from major EU support, including ERC grants for Profs. Geim and Novoselov. So I am proud to show you one of the new graphene products that will soon be available on the market.
This light bulb uses the unique thermal dissipation properties of graphene to achieve greater energy efficiency and a longer lifetime than LED bulbs. It was developed by a spin-out company from the University of Manchester, called Graphene Lighting, and is expected to go on sale by the end of the year.
But we must not be complacent. If we look at indicators of the most excellent science, we find that Europe is not top of the rankings in certain areas. Our ultimate goal should always be to promote excellence not only through ERC and Marie Skłodowska-Curie but throughout the entire H2020.
To achieve this objective, we have to move forward on two fronts:
First, we are preparing a call for a European Science Cloud project to explore the possibility of creating a cloud for our scientists. We need more open access to research results and the underlying data. Open access publication is already a requirement under Horizon 2020, but we now need to look seriously at open data…
When innovators like LEGO start fusing real bricks with digital magic, when citizens conduct their own R&D through online community projects, when doctors start printing live tissues for patients … Policymakers must follow suit…(More)”
Handbook: How to Catalyze Humanitarian Innovation in Computing Research Institutes
Patrick Meier: “The handbook below provides practical collaboration guidelines for both humanitarian organizations & computing research institutes (CRIs) on how to catalyze humanitarian innovation through successful partnerships. These actionable guidelines are directly applicable now and draw on extensive interviews with leading humanitarian groups and CRIs, including the International Committee of the Red Cross (ICRC), United Nations Office for the Coordination of Humanitarian Affairs (OCHA), United Nations Children’s Fund (UNICEF), United Nations High Commissioner for Refugees (UNHCR), UN Global Pulse, Carnegie Mellon University (CMU), International Business Machines (IBM), Microsoft Research, the Data Science for Social Good Program at the University of Chicago, and others.
This handbook, which is the first of its kind, also draws directly on years of experience and lessons learned from the Qatar Computing Research Institute’s (QCRI) active collaboration and unique partnerships with multiple international humanitarian organizations. The aim of this blog post is to actively solicit feedback on this first, complete working draft, which is available here as an open and editable Google Doc. …(More)”
How Crowdsourcing Can Help Us Fight ISIS
Dr. Maha Hosain Aziz at the Huffington Post: “There’s no question that ISIS is gaining ground. …So how else can we fight ISIS? By crowdsourcing data – i.e. asking a relevant group of people for their input via text or the Internet on specific ISIS-related issues. In fact, ISIS has been using crowdsourcing to enhance its operations since last year in two significant ways. Why shouldn’t we?
First, ISIS is using its crowd of supporters in Syria, Iraq and elsewhere to help strategize new policies. Last December, the extremist group leveraged its global crowd via social media to brainstorm ideas on how to kill 26-year-old Jordanian coalition fighter pilot Moaz al-Kasasba. ISIS supporters used the hashtags “Suggest a Way to Kill the Jordanian Pilot Pig” and “We All Want to Slaughter Moaz” to make their disturbing suggestions, which included decapitation, running al-Kasasba over with a bulldozer and burning him alive (which was the winner). Yes, this sounds absurd and was partly a publicity stunt to boost ISIS’ image. But the underlying approach of crowdsourcing new strategies makes complete sense for ISIS as it continues to evolve – which is what the US government should consider as well.
In fact, in February, the US government tried to crowdsource more counterterrorism strategies. Via its official blog, DipNote, the State Department asked the crowd – in this case, US citizens – for their suggestions for solutions to fight violent extremism. This inclusive approach to policymaking was obviously important for strengthening democracy, with more than 180 entries posted over two months from citizens across the US. But did this crowdsourcing exercise actually improve US strategy against ISIS? Not really. What might help is if the US government asked a crowd of experts across varied disciplines and industries about counterterrorism strategies specifically against ISIS, also giving these experts the opportunity to critique each other’s suggestions to reach one optimal strategy. This additional, collaborative, competitive and interdisciplinary expert insight can only help President Obama and his national security team to enhance their anti-ISIS strategy.
Second, ISIS has been using its crowd of supporters to collect intelligence information to better execute its strategies. Since last August, the extremist group has crowdsourced data via a Twitter campaign specifically on Saudi Arabia’s intelligence officials, including names and other personal details. This apparently helped ISIS in its two suicide bombing attacks during prayers at a Shiite mosque last month; it also presumably helped ISIS infiltrate a Saudi Arabian border town via Iraq in January. Adopting a similarly collaborative approach to intelligence collection can only help President Obama and his national security team to enhance their anti-ISIS strategy.
In fact, last year, the FBI used crowdsourcing to spot individuals who might be travelling abroad to join terrorist groups. But what if we asked the crowd of US citizens and residents to give us information specifically on where they’ve seen individuals get lured by ISIS in the country, as well as on specific recruitment strategies they may have noted? This might also lead to more real-time data points on ISIS defectors returning to the US – who are they, why did they defect and what can they tell us about their experience in Syria or Iraq? Overall, crowdsourcing such data (if verifiable) would quickly create a clearer picture of trends in recruitment and defectors across the country, which can only help the US enhance its anti-ISIS strategies.
This collaborative approach to data collection could also be used in Syria and Iraq with texts and online contributions from locals helping us to map ISIS’ movements….(More)”
A Research Roadmap for Human Computation
Emerging Technology from the arXiv: “The wisdom of the crowd has become so powerful and so accessible via the Internet that it has become a resource in its own right. Various services now tap into this rich supply of human cognition, such as Wikipedia, Duolingo, and Amazon’s Mechanical Turk.
So important is this resource that scientists have given it a name; they call it human computation. And a rapidly emerging and increasingly important question is how best to exploit it.
Today, we get an answer of sorts thanks to a group of computer scientists, crowdsourcing pioneers, and visionaries who have created a roadmap for research into human computation. The team, led by Pietro Michelucci at the Human Computation Institute, point out that human computation systems have been hugely successful at tackling complex problems from identifying spiral galaxies to organizing disaster relief.
But their potential is even greater still, provided that human cognition can be efficiently harnessed on a global scale. Last year, they met to discuss these issues and have now published the results of their debate.
They begin by pointing out the extraordinary successes of human computation… and then describe the kinds of projects they want to create. They call one idea Project Houston, after the crowdsourced effort on the ground that helped bring back the Apollo 13 astronauts after an on-board explosion on the way to the moon.
Their idea is that similar help can be brought to bear from around the world when individuals on earth find themselves in trouble. By this they mean individuals who might be considering suicide or suffering from depression, for example.
The plan is to use state-of-the-art speech analysis and natural language understanding to detect stress and offer help. This would come in the form of composite personalities made up from individuals with varying levels of expertise in the crowd, supported by artificial intelligence techniques. “Project Houston could provide a consistently kind and patient personality even if the “crowd” changes completely over time,” they say.
Another idea is to build on the way that crowdsourcing helps people learn. One example of this is Duolingo, an app that offers free language lessons while simultaneously acting as a document translation service. “Why stop with language learning and translation?” they ask.
A similar approach could help people learn new skills as they work online, a process that should allow them to take on more complex roles. One example is in the field of radiology, where an important job is to recognize tumors on x-ray images. This is a task that machine vision algorithms do not yet perform reliably…..
Yet another idea would be to crowdsource information that helps the poorest families in America find social welfare programs. These programs are often difficult to navigate and represent a disproportionate hardship for the people who are most likely to benefit from them: those who are homeless, who have disabilities, who are on low income, and so on.
The idea is that the crowd should take on some of this burden, freeing up this group for other tasks, like finding work, managing health problems, and so on.
These are worthy goals but they raise some significant questions. Chief among these is the nature of the ethical, legal, and social implications of human computation. How can this work be designed to allow meaningful and dignified human participation? How can the outcomes be designed so that the most vulnerable people can benefit from it? And what is the optimal division of labor between machines and humans to produce a specific result?
Ref: arxiv.org/abs/1505.07096: A U.S. Research Roadmap for Human Computation”