
Stefaan Verhulst

CrowdJury is an online platform that crowdsources judicial proceedings: filing of complaints, evaluation of evidence, trial and jury verdict.

CrowdJury platform
Fast, affordable, transparent justice

CrowdJury algorithms are optimized to reach a true verdict for each case, quickly and at minimal cost.
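Neither this excerpt nor the platform copy spells out how such a verdict is computed; one plausible mechanism is reputation-weighted majority voting among crowd jurors. The Python sketch below is a hypothetical illustration of that idea only: the function, weights and quorum rule are assumptions, not the CrowdJury design.

```python
from collections import defaultdict

def crowd_verdict(votes, reputations, threshold=0.5):
    """Aggregate juror votes into a verdict (illustrative only).

    votes: juror id -> "guilty" / "not guilty"
    reputations: juror id -> weight in (0, 1]
    threshold: share of total weight a verdict needs to carry
    """
    totals = defaultdict(float)
    for juror, vote in votes.items():
        totals[vote] += reputations.get(juror, 0.0)

    weight = sum(totals.values())
    if weight == 0:
        return "no quorum"
    verdict, support = max(totals.items(), key=lambda kv: kv[1])
    return verdict if support / weight > threshold else "hung jury"

# Three crowd jurors, one dissenting low-reputation vote.
votes = {"a": "guilty", "b": "guilty", "c": "not guilty"}
reputations = {"a": 0.9, "b": 0.8, "c": 0.3}
print(crowd_verdict(votes, reputations))  # -> guilty (support 1.7 of 2.0)
```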

Do well by doing good

Jurors and researchers from the crowd are rewarded in Bitcoin. You help do good, you earn money….

Want to know more? Read the White Paper…

Crowdjury

At CNET: “Fighting the scourge of fake news online is one of the unexpected new crusades emerging from the fallout of Donald Trump’s upset presidential election win last week. Not surprisingly, the internet has no shortage of ideas for how to get its own house in order.

Eli Pariser, author of the seminal book “The Filter Bubble”, which presaged some of the consequences of online platforms that tend to sequester users into non-overlapping ideological silos, is leading an inspired brainstorming effort via this open Google Doc.

Pariser put out the public call to collaborate via Twitter on Thursday, and within 24 hours 21 pages’ worth of bullet-pointed suggestions had already piled up in the doc….

Suggestions ranged from the common call for news aggregators and social media platforms to hire more human editors, to launching more media literacy programs or creating “credibility scores” for shared content and/or users who share or report fake news.
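The “credibility score” suggestion is the easiest of these to make concrete. As a purely hypothetical sketch (no platform’s actual scoring is described in the piece), a per-user score could be a smoothed ratio of shares later verified as accurate:

```python
def credibility_score(accurate_shares: int, flagged_shares: int,
                      prior_weight: float = 2.0) -> float:
    """Laplace-smoothed credibility in [0, 1] (hypothetical scheme).

    A user with no history starts at 0.5; the score drifts toward the
    observed accuracy ratio as fact-checked shares accumulate.
    """
    total = accurate_shares + flagged_shares
    return (accurate_shares + prior_weight * 0.5) / (total + prior_weight)

print(credibility_score(0, 0))    # 0.5   (no history yet)
print(credibility_score(9, 1))    # ~0.83 (mostly accurate sharer)
print(credibility_score(1, 19))   # ~0.09 (habitual fake-news sharer)
```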

Many of the suggestions are aimed at Facebook, which has taken a heavy heaping of criticism since the election and a recent report that found the top fake election news stories saw more engagement on Facebook than the top real election stories….

In addition to the crowdsourced brainstorming approach, plenty of others are chiming in with possible solutions. Author, blogger and journalism professor Jeff Jarvis teamed up with entrepreneur and investor John Borthwick of Betaworks to lay out 15 concrete ideas for addressing fake news on Medium…. The Trust Project at Santa Clara University is working to develop solutions to attack fake news that include systems for author verification and citations….(More)

The internet is crowdsourcing ways to drain the fake news swamp

Trends in Big Data Research, a Sage Whitepaper: “Information of all kinds is now being produced, collected, and analyzed at unprecedented speed, breadth, depth, and scale. The capacity to collect and analyze massive data sets has already transformed fields such as biology, astronomy, and physics, but the social sciences have been comparatively slower to adapt, and the path forward is less certain. For many, the big data revolution promises to ask, and answer, fundamental questions about individuals and collectives, but large data sets alone will not solve major social or scientific problems. New paradigms being developed by the emerging field of “computational social science” will be needed not only for research methodology, but also for study design and interpretation, cross-disciplinary collaboration, data curation and dissemination, visualization, replication, and research ethics (Lazer et al., 2009). SAGE Publishing conducted a survey with social scientists around the world to learn more about researchers engaged in big data research and the challenges they face, as well as the barriers to entry for those looking to engage in this kind of research in the future. We were also interested in the challenges of teaching computational social science methods to students. The survey was fully completed by 9412 respondents, indicating strong interest in this topic among our social science contacts. Of respondents, 33 percent had been involved in big data research of some kind and, of those who have not yet engaged in big data research, 49 percent (3057 respondents) said that they are either “definitely planning on doing so in the future” or “might do so in the future.”…(More)”

Who Is Doing Computational Social Science?

Theme issue of Phil. Trans. R. Soc. A compiled and edited by Mariarosaria Taddeo and Luciano Floridi: “This theme issue has the founding ambition of landscaping data ethics as a new branch of ethics that studies and evaluates moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values). Data ethics builds on the foundation provided by computer and information ethics but, at the same time, it refines the approach endorsed so far in this research field, by shifting the level of abstraction of ethical enquiries, from being information-centric to being data-centric. This shift brings into focus the different moral dimensions of all kinds of data, even data that never translate directly into information but can be used to support actions or generate behaviours, for example. It highlights the need for ethical analyses to concentrate on the content and nature of computational operations—the interactions among hardware, software and data—rather than on the variety of digital technologies that enable them. And it emphasizes the complexity of the ethical challenges posed by data science. Because of such complexity, data ethics should be developed from the start as a macroethics, that is, as an overall framework that avoids narrow, ad hoc approaches and addresses the ethical impact and implications of data science and its applications within a consistent, holistic and inclusive framework. Only as a macroethics will data ethics provide solutions that can maximize the value of data science for our societies, for all of us and for our environments….(More)”

Table of Contents:

  • The dynamics of big data and human rights: the case of scientific research; Effy Vayena, John Tasioulas
  • Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue; Sebastian Porsdam Mann, Julian Savulescu, Barbara J. Sahakian
  • Faultless responsibility: on the nature and allocation of moral responsibility for distributed moral actions; Luciano Floridi
  • Compelling truth: legal protection of the infosphere against big data spills; Burkhard Schafer
  • Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems; Sabina Leonelli
  • Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy; Deirdre K. Mulligan, Colin Koopman, Nick Doty
  • Beyond privacy and exposure: ethical issues within citizen-facing analytics; Peter Grindrod
  • The ethics of smart cities and urban science; Rob Kitchin
  • The ethics of big data as a public good: which public? Whose good? Linnet Taylor
  • Data philanthropy and the design of the infraethics for information societies; Mariarosaria Taddeo
  • The opportunities and ethics of big data: practical priorities for a national Council of Data Ethics; Olivia Varley-Winter, Hetan Shah
  • Data science ethics in government; Cat Drew
  • The ethics of data and of data science: an economist’s perspective; Jonathan Cave
  • What’s the good of a science platform? John Gallacher


The ethical impact of data science

Essay by Martin Tisné: “The anti-corruption and transparency field ten years ago was in pre-iPhone mode. Few if any of us spoke of the impact or relevance of technology to what would become known as the open government movement. When the wave of smartphone and other technology hit from the late 2000s onwards, it hit hard, and scaled fast. The ability of technology to create ‘impact at scale’ became the obvious truism of our sector, so much so that pointing out the failures of techno-utopianism became a favorite pastime for pundits and academics. The technological developments of the next ten years will be more human-centered — less ‘build it and they will come’ — and more aware of the unintended consequences of technology (e.g. the fairness of artificial intelligence decision-making) whilst still being deeply steeped in the technology itself.

By 2010, two major open data initiatives had launched and were already seen as successful in the US and UK, one of President Obama’s first memorandums was on openness and transparency, and an international research project had tracked 63 different instances of uses of technology for transparency around the world (from Reclamos in Chile, to I Paid a Bribe in India, via Maji Matone in Tanzania). Open data projects numbered over 200 worldwide within barely a year of data.gov.uk launching and, to everyone’s surprise, topped the list of Open Government Partnership commitments a few years hence.

The technology genie won’t go back into the bottle: the field will continue to grow alongside technological developments. But it would take a bold or foolish pundit to guess which of blockchain or other developments will have radically changed the field by 2025.

What is clearer is that the sector is more questioning towards technology, more human-centered both in the design of those technologies and in seeking to understand and pre-empt their impact….

We’ve moved from cyber-utopianism less than ten years ago to born-digital organisations taking a much more critical look at the deployment of technology. The evangelical phase of the open data movement is coming to an end. The movement no longer needs to preach the virtues of unfettered openness to get a foot in the door. It seeks to frame the debate as to whether, when and how data might legitimately be shared or closed, and what impacts those releases may have on privacy, surveillance and discrimination. An open government movement that is more human-centered and aware of the unintended consequences of technology has a bright and impactful future ahead….(More)”

From Tech-Driven to Human-Centred: Opengov has a Bright Future Ahead
Fact Sheet by The White House on “Establishing a Council on Community Solutions to Align Federal Efforts with Local Priorities and Citizens’ Needs”: “Today, building on the Administration’s efforts to modernize the way the Federal Government works with cities, counties, and communities — rural, tribal, urban, and suburban — the President signed an Executive Order establishing a Community Solutions Council. The Council will provide a lasting structure for Federal agencies to strengthen partnerships with communities and improve coordination across the Federal Government in order to more efficiently deliver assistance and maximize impact.

Across the country, citizens and local leaders need a Federal Government that is more effective, responsive, and collaborative in addressing their needs and challenges. Far too often, the Federal Government has taken a “one-size-fits-all” approach to working with communities and left local leaders on their own to find Federal resources and navigate disparate programs. Responding to the call for change from local officials and leaders nationwide, and grounded in the belief that the best solutions come from the bottom up, not from the top down, Federal agencies have increasingly taken on a different approach to working with communities to deliver better outcomes in more than 1,800 cities, towns, regions, and tribal communities nationwide.

As a part of this new way of working, Federal agencies are partnering with local officials to support local plans and visions. They are crossing agency and program silos to support cities, towns, counties and tribes in implementing locally-developed plans for improvement – from re-lighting city streets to breathing new life into half-empty rural main streets.  And by using data to measure success and harnessing technology, Federal agencies are focusing on community-driven solutions and what works, while monitoring progress to make investments that have a strong base of evidence behind them.

Building on this success, the President today signed an Executive Order (EO) that will continue to make government work better for the American people. The EO establishes the Community Solutions Council to streamline and improve the way the Federal Government works with cities, counties, and communities — rural, tribal, urban and suburban — to improve outcomes. The Council includes leadership from agencies, departments and offices across the Federal Government and the White House, who together will develop and implement policy that puts local priorities first, highlights successful solutions based on best practices, and streamlines Federal support for communities. Further, the Council, where appropriate, will engage with representatives and leaders of organizations, businesses and communities to expand and improve partnerships that address the most pressing challenges communities face….

  • Harnessing Data and Technology to Improve Outcomes for Communities: The Federal government is working to foster collaborations between communities and the tech sector, non-profits and citizens to help communities develop new ways to use both Federal and local data to address challenges with greater precision and innovation. As a result, new digital tools are helping citizens find affordable housing near jobs and transportation, matching unemployed Americans with jobs that meet their skills, enabling local leaders to use data to better target investments, and more…(More)”
Council on Community Solutions

Madolyn Smith at Data Driven Journalism: “impactAFRICA, the continent’s largest fund for data driven storytelling, has announced the winners of its water and sanitation contest. Journalists from Ghana, Nigeria, Kenya, Tanzania, South Africa and Zambia made waves with their stories, but three in particular stood out against the tide.

1. South Africa: All At Sea

Sipho Kings‘ story on illegal fishing along South Africa’s coast for the Mail & Guardian shows how data from nanosatellites could solve the tricky problem of tracking illegal activities….
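The story itself publishes no code, but the underlying technique, flagging vessels whose satellite-received AIS transponder falls silent for suspiciously long stretches, is easy to sketch. The vessel IDs, timestamps and four-hour gap threshold below are invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical AIS pings (vessel id, UTC timestamp) received via nanosatellite.
pings = [
    ("ZA-1042", datetime(2016, 5, 1, 0, 10)),
    ("ZA-1042", datetime(2016, 5, 1, 0, 40)),
    ("ZA-1042", datetime(2016, 5, 1, 9, 15)),  # transponder silent for hours
    ("ZA-2210", datetime(2016, 5, 1, 0, 5)),
    ("ZA-2210", datetime(2016, 5, 1, 0, 35)),
]

def dark_periods(pings, max_gap=timedelta(hours=4)):
    """Yield (vessel, silence start, duration) for suspicious AIS gaps."""
    last_seen = {}
    for vessel, ts in sorted(pings):
        prev = last_seen.get(vessel)
        if prev is not None and ts - prev > max_gap:
            yield vessel, prev, ts - prev
        last_seen[vessel] = ts

for vessel, start, gap in dark_periods(pings):
    print(f"{vessel} went dark at {start} for {gap}")
# -> ZA-1042 went dark at 2016-05-01 00:40:00 for 8:35:00
```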

As well as providing a data-driven solution to South Africa’s problem, this story has been credited with prompting increased naval patrols, which have uncovered a string of illegal fishing trawlers.

Read the story here.

2. Water Data for Nigeria

This tool, developed by Abiri Oluwatosin Niyi for CMapIT, tracks the supply and consumption of water in Nigeria. To combat a scarcity of data on public water resources, the project crowdsources data from citizens and water point operators. Data is updated in real time and can be explored via an interactive map.

Image: Water Data for Nigeria.

In addition, the underlying data is also available for free download and reuse.
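Reusing such a dataset typically starts with aggregating the point-level status reports. The sketch below assumes a hypothetical CSV layout (the column names are guesses; the real Water Data for Nigeria export may differ):

```python
import csv
from collections import Counter, defaultdict

def functional_rate_by_state(path):
    """Summarise crowdsourced water-point reports per state."""
    counts = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["state"]][row["status"]] += 1
    for state, c in sorted(counts.items()):
        total = sum(c.values())
        share = c["functional"] / total
        print(f"{state}: {share:.0%} of {total} reported points functional")

# Assumed columns: state, lga, water_point_id, status, reported_at
functional_rate_by_state("water_points.csv")
```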

Explore the project here.

3. Ibadan: A City of Deep Wells and Dry Taps

Writing for the International Centre for Investigative Reporting, Kolawole Talabi demonstrates a relationship between declining oil revenues and government water expenditure in Ibadan, Nigeria’s third-largest city, with detrimental impacts on its inhabitants’ health.

The investigation draws on data from international organisations, like UNICEF, and government budgetary allocations, as well as qualitative interview data.
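The analytical core of such a story, relating an annual oil-revenue series to an annual water-budget series, is a simple correlation. A minimal sketch with invented figures (the story’s actual numbers are in the piece itself):

```python
from statistics import correlation  # Python 3.10+

# Illustrative annual figures only; the story's real data is in the piece.
oil_revenue = [68.4, 61.2, 52.7, 40.1, 33.5]   # hypothetical, billions of naira
water_budget = [12.1, 10.8, 9.9, 7.2, 6.0]     # hypothetical state allocation

r = correlation(oil_revenue, water_budget)
print(f"Pearson r = {r:.2f}")  # near +1: water spending tracks oil income
```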

Following the story’s publication, there has been extensive online debate and numerous calls for governmental action.

Read the story here….(More)”

3 Ways data has made a splash in Africa

Think Piece by Heather Grabbe for ESPAS 2016 conference: “In many parts of everyday life, voters are used to a consumer experience where they get instant feedback and personal participation; but party membership, ballot boxes and stump speeches do not offer the same speed, control or personal engagement. The institutions of representative democracy at national and EU level — political parties, elected members, law-making — do not offer the same quality of experience for their ultimate consumers.

This matters because it is causing voters to switch off. Broad participation by most of the population in the practice of democracy is vital for societies to remain open because it ensures pluralism and prevents takeover of power by narrow interests. But in some countries and some elections, turnout is regularly below a third of registered voters, especially in European Parliament elections.

The internet is driving the major trends that create this disconnection and disruption. Here are four vital areas in which politics should adapt, including at EU level:

  • Expectation. Voters have a growing sense that political parties and law-making are out of touch, but not that politics is irrelevant. …
  • Affiliation. … people are interested in new forms of affiliation, especially through social media and alternative networks. …
  • Location. Digital technology allows people to find myriad new ways to express their political views publicly, outside of formal political spaces. …
  • Information. The internet has made vast amounts of data and a huge range of information sources across an enormous spectrum of issues available to every human with an internet connection. How is this information overload affecting engagement with politics? ….(More)”
Between Governance of the Past and Technology of the Future

Chapter by Zoe Baird in America’s National Security Architecture: Rebuilding the Foundation: “The private sector is transforming at record speed for the digital economy. As recently as 2008, when America elected President Obama, most large companies had separate IT departments, which were seen as just that—departments—separate from the heart of the business. Now, as wireless networks connect the planet, and entire companies exist in the cloud, digital technology is no longer viewed as another arrow in the corporate quiver, but rather the very foundation upon which all functions are built. This, then, is the mark of the digital era: in order to remain successful, modern enterprises must both leverage digital technology and develop a culture that values its significance within the organization.

For the federal government to help all Americans thrive in this new economy, and for the government to be an engine of growth, it too must enter the digital era. On a basic level, we need to improve the government’s digital infrastructure and use technology to deliver government services better. But a government for the digital economy needs to take bold steps to embed these actions as part of a large and comprehensive transformation in how it goes about the business of governing. We should not only call on the “IT department” to provide tools, we must completely change the way we think about how a digital age government learns about the world, makes policy, and operates against its objectives.

Government today does not reflect the fundamental attributes of the digital age. It moves slowly at a time when information travels around the globe at literally the speed of light. It takes many years to develop and implement comprehensive policy in a world characterized increasingly by experimentation and iterative midcourse adjustments. It remains departmentally balkanized and hierarchical in an era of networks and collaborative problem solving. It assumes that it possesses the expertise necessary to make decisions while most of the knowledge resides at the edges. It is bogged down in legacy structures and policy regimes that do not take advantage of digital tools, and worse, create unnecessary barriers that hold progress back. Moreover, it is viewed by its citizens as opaque and complex in an era when openness and access are attributes of legitimacy….(More)”

Government for a Digital Economy

James Bridle in the New Humanist: “In a 2008 article in Wired magazine entitled “The End of Theory”, Chris Anderson argued that the vast amounts of data now available to researchers made the traditional scientific process obsolete. No longer would they need to build models of the world and test them against sampled data. Instead, the complexities of huge and totalising datasets would be processed by immense computing clusters to produce truth itself: “With enough data, the numbers speak for themselves.” As an example, Anderson cited Google’s translation algorithms which, with no knowledge of the underlying structures of languages, were capable of inferring the relationship between them using extensive corpora of translated texts. He extended this approach to genomics, neurology and physics, where scientists are increasingly turning to massive computation to make sense of the volumes of information they have gathered about complex systems. In the age of big data, he argued, “Correlation is enough. We can stop looking for models.”
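Anderson’s Google Translate example is easy to make concrete: given enough sentence-aligned text, co-occurrence counts alone begin to suggest translations, with no grammar encoded anywhere. The toy sketch below only illustrates the principle; production systems (e.g. the IBM alignment models) are far more sophisticated.

```python
from collections import Counter
from itertools import product

# Toy sentence-aligned corpus (English, French).
corpus = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the car", "la voiture"),
    ("the blue car", "la voiture bleue"),
]

cooc, fr_freq = Counter(), Counter()
for en, fr in corpus:
    fr_freq.update(fr.split())
    cooc.update(product(en.split(), fr.split()))

def translate(word):
    # Score each candidate by co-occurrence relative to its overall
    # frequency, so ubiquitous words like "la" don't win by default.
    scores = {f: n / fr_freq[f] for (e, f), n in cooc.items() if e == word}
    return max(scores, key=scores.get)

print(translate("house"))  # maison
print(translate("blue"))   # bleue
```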

This belief in the power of data, of technology untrammelled by petty human worldviews, is the practical cousin of more metaphysical assertions. A belief in the unquestionability of data leads directly to a belief in the truth of data-derived assertions. And if data contains truth, then it will, without moral intervention, produce better outcomes. Speaking at Google’s private London Zeitgeist conference in 2013, Eric Schmidt, Google Chairman, asserted that “if they had had cellphones in Rwanda in 1994, the genocide would not have happened.” Schmidt’s claim was that technological visibility – the rendering of events and actions legible to everyone – would change the character of those actions. Not only is this statement historically inaccurate (there was plenty of evidence available of what was occurring during the genocide from UN officials, US satellite photographs and other sources), it’s also demonstrably untrue. Analysis of unrest in Kenya in 2007, when over 1,000 people were killed in ethnic conflicts, showed that mobile phones not only spread but accelerated the violence. But you don’t need to look to such extreme examples to see how a belief in technological determinism underlies much of our thinking and reasoning about the world.

“Big data” is not merely a business buzzword, but a way of seeing the world. Driven by technology, markets and politics, it has come to determine much of our thinking, but it is flawed and dangerous. It runs counter to our actual findings when we employ such technologies honestly and with the full understanding of their workings and capabilities. This over-reliance on data, which I call “quantified thinking”, has come to undermine our ability to reason meaningfully about the world, and its effects can be seen across multiple domains.

The assertion is hardly new. Writing in the Dialectic of Enlightenment in 1947, Theodor Adorno and Max Horkheimer decried “the present triumph of the factual mentality” – the predecessor to quantified thinking – and succinctly analysed the big data fallacy, set out by Anderson above. “It does not work by images or concepts, by the fortunate insights, but refers to method, the exploitation of others’ work, and capital … What men want to learn from nature is how to use it in order wholly to dominate it and other men. That is the only aim.” What is different in our own time is that we have built a world-spanning network of communication and computation to test this assertion. While it occasionally engenders entirely new forms of behaviour and interaction, the network most often shows to us with startling clarity the relationships and tendencies which have been latent or occluded until now. In the face of the increased standardisation of knowledge, it becomes harder and harder to argue against quantified thinking, because the advances of technology have been conjoined with the scientific method and social progress. But as I hope to show, technology ultimately reveals its limitations….

“Eroom’s law” – Moore’s law backwards – was recently formulated to describe a problem in pharmacology. Drug discovery has been getting more expensive. Since the 1950s the number of drugs approved for use in human patients per billion US dollars spent on research and development has halved every nine years. This problem has long perplexed researchers. According to the principles of technological growth, the trend should be in the opposite direction. In a 2012 paper in Nature entitled “Diagnosing the decline in pharmaceutical R&D efficiency” the authors propose and investigate several possible causes for this. They begin with social and physical influences, such as increased regulation, increased expectations and the exhaustion of easy targets (the “low hanging fruit” problem). Each of these is – with qualifications – disposed of, leaving open the question of the discovery process itself….(More)”
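For reference, the trend the paper describes is plain exponential decay with a nine-year halving period. The sketch below encodes only that stated rate; the starting index is an arbitrary assumption for illustration.

```python
def erooms_law(year, base_year=1950, base_index=100.0, halving_years=9):
    """Drugs approved per billion (inflation-adjusted) R&D dollars.

    base_index is an arbitrary level for the base year; only the
    relative decline (halving every nine years) comes from the source.
    """
    return base_index * 0.5 ** ((year - base_year) / halving_years)

for year in (1950, 1968, 2004, 2012):
    print(year, round(erooms_law(year), 2))
# 1950 100.0 | 1968 25.0 | 2004 1.56 | 2012 0.84
```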

What’s wrong with big data?
