
Stefaan Verhulst

DrexelNow: “…More than 40 percent of Philly nonprofit organizations operate on margins of zero or less, and even fewer can be considered financially strong. With more than half of Philly’s nonprofits operating on a slim-to-none budget with limited support staff, one Drexel University researcher sought to help streamline their fundraising process by giving them easy access to data from the Internal Revenue Service and the U.S. Census. His goal: create a tool that makes information about nonprofit organizations, and the communities they’re striving to help, more accessible to like-minded charities and the philanthropic organizations that seek to fund them.

When the IRS recently released millions of records on the finances and operations of nonprofit organizations in a format that can be downloaded and analyzed, it was expected that this would usher in a new era of transparency and innovation for the nonprofit sector. Instead, many technical issues made the data virtually unusable by nonprofit organizations.

Single-page location intelligence tool: http://bit.ly/PhillyNPOs

Neville Vakharia, an assistant professor and research director in Drexel’s graduate Arts Administration program in the Westphal College of Media Arts & Design, tackled this issue by creating ImpactView Philadelphia, an online tool and resource that uses the publicly available data on nonprofit organizations to present an easy-to-access snapshot of Philadelphia’s nonprofit ecosystem.

Vakharia combined the publicly available data from the IRS with the most recent American Community Survey data released by the U.S. Census Bureau. These data were combined with a map of Philadelphia to create a visual database easily searchable by organization, address or zip code. Once an organization is selected, the analysis tools allow the user to see data on the map, alongside measures of households and individuals surrounding the organization — important information for nonprofits to have when they are applying for grants or looking for partners.

“Through the location intelligence visualizer, users can immediately find areas of need and potential collaborators. The data are automatically visualized and mapped on-screen, identifying, for example, pockets of high poverty with large populations of children as well as the nonprofit service providers in these areas,” said Vakharia. “Making this data accessible for nonprofits will cut down on time spent seeking information and improve the ability to make data-informed decisions, while also helping with case making and grant applications.”…(More)”.

A Tool to Help Nonprofits Find Each Other, Pursue Funding and Collaborate

Barbara Romzek and Aram Sinnreich at The Conversation: “…For years, watchdogs have been warning about sharing information with data-collecting companies, firms engaged in the relatively new line of business some academics have called “surveillance capitalism.” Most casual internet users are only now realizing how easy – and common – it is for unaccountable and unknown organizations to assemble detailed digital profiles of them. They do this by combining the discrete bits of information consumers have given up to e-tailers, health sites, quiz apps and countless other digital services.

As scholars of public accountability and digital media systems, we know that the business of social media is based on extracting user data and offering it for sale. There’s no simple way for them to protect data as many users might expect. Like the social pollution of fake news, bullying and spam that Facebook’s platform spreads, the company’s privacy crisis also stems from a power imbalance: Facebook knows nearly everything about its users, who know little to nothing about it.

It’s not enough for people to delete their Facebook accounts. Nor is it likely that anyone will successfully replace it with a nonprofit alternative centering on privacy, transparency and accountability. Furthermore, this problem is not specific just to Facebook. Other companies, including Google and Amazon, also gather and exploit extensive personal data, and are locked in a digital arms race that we believe threatens to destroy privacy altogether….

Governments need to be better guardians of public welfare – including privacy. Many companies using various aspects of technology in new ways have so far avoided regulation by stoking fears that rules might stifle innovation. Facebook and others have often claimed that they’re better at regulating themselves in an ever-changing environment than a slow-moving legislative process could be….

To encourage companies to serve democratic principles and focus on improving people’s lives, we believe the chief business model of the internet needs to shift to building trust and verifying information. While it won’t be an immediate change, social media companies pride themselves on their adaptability and should be able to take on this challenge.

The alternative, of course, could be far more severe. In the 1980s, when federal regulators decided that AT&T was using its power in the telephone market to hurt competition and consumers, they forced the massive conglomerate to break up. A similar but less dramatic change happened in the early 2000s when cellphone companies were forced to let people keep their phone numbers even if they switched carriers.

Data, and particularly individuals’ personal data, are the precious metals of the internet age. Protecting individual data while expanding access to the internet and its many social benefits is a fundamental challenge for free societies. Creating, using and protecting data properly will be crucial to preserving and improving human rights and civil liberties in this still young century. To meet this challenge will require both vigilance and vision, from businesses and their customers, as well as governments and their citizens….(More).

To serve a free society, social media must evolve beyond data mining

James Somers in The Atlantic: “The scientific paper—the actual form of it—was one of the enabling inventions of modernity. Before it was developed in the 1600s, results were communicated privately in letters, ephemerally in lectures, or all at once in books. There was no public forum for incremental advances. By making room for reports of single experiments or minor technical advances, journals made the chaos of science accretive. Scientists from that point forward became like the social insects: They made their progress steadily, as a buzzing mass.

The earliest papers were in some ways more readable than papers are today. They were less specialized, more direct, shorter, and far less formal. Calculus had only just been invented. Entire data sets could fit in a table on a single page. What little “computation” contributed to the results was done by hand and could be verified in the same way.

The more sophisticated science becomes, the harder it is to communicate results. Papers today are longer than ever and full of jargon and symbols. They depend on chains of computer programs that generate data, and clean up data, and plot data, and run statistical models on data. These programs tend to be both so sloppily written and so central to the results that they’ve contributed to a replication crisis, or put another way, a failure of the paper to perform its most basic task: to report what you’ve actually discovered, clearly enough that someone else can discover it for themselves.

Perhaps the paper itself is to blame. Scientific methods evolve now at the speed of software; the skill most in demand among physicists, biologists, chemists, geologists, even anthropologists and research psychologists, is facility with programming languages and “data science” packages. And yet the basic means of communicating scientific results hasn’t changed for 400 years. Papers may be posted online, but they’re still text and pictures on a page.

What would you get if you designed the scientific paper from scratch today?…(More).

The Scientific Paper Is Obsolete

Handbook by the Government of New Zealand: “…helps you take a structured approach to using evidence at every stage of the policy and programme development cycle. Whether you work for central or local government, or the community and voluntary sector, you’ll find advice to help you:

  • understand different types and sources of evidence
  • know what you can learn from evidence
  • appraise evidence and rate its quality
  • decide how to select and use evidence to the best effect
  • take into account different cultural values and knowledge systems
  • be transparent about how you’ve considered evidence in your policy development work…(More)”

(See also the Summary. This handbook is a companion to Making Sense of Evaluation: A Handbook for Everyone.)

Making sense of evidence: A guide to using evidence in policy

Report by AI Now Institute: “Automated decision systems are currently being used by public agencies, reshaping how criminal justice systems work via risk assessment algorithms and predictive policing, optimizing energy use in critical infrastructure through AI-driven resource allocation, and changing our employment and educational systems through automated evaluation tools and matching algorithms. Researchers, advocates, and policymakers are debating when and where automated decision systems are appropriate, including whether they are appropriate at all in particularly sensitive domains.

Questions are being raised about how to fully assess the short and long term impacts of these systems, whose interests they serve, and if they are sufficiently sophisticated to contend with complex social and historical contexts. These questions are essential, and developing strong answers has been hampered in part by a lack of information and access to the systems under deliberation. Many such systems operate as “black boxes” – opaque software tools working outside the scope of meaningful scrutiny and accountability. This is concerning, since an informed policy debate is impossible without the ability to understand which existing systems are being used, how they are employed, and whether these systems cause unintended consequences. The Algorithmic Impact Assessment (AIA) framework proposed in this report is designed to support affected communities and stakeholders as they seek to assess the claims made about these systems, and to determine where – or if – their use is acceptable….

KEY ELEMENTS OF A PUBLIC AGENCY ALGORITHMIC IMPACT ASSESSMENT

1. Agencies should conduct a self-assessment of existing and proposed automated decision systems, evaluating potential impacts on fairness, justice, bias, or other concerns across affected communities;

2. Agencies should develop meaningful external researcher review processes to discover, measure, or track impacts over time;

3. Agencies should provide notice to the public disclosing their definition of “automated decision system,” existing and proposed systems, and any related self-assessments and researcher review processes before the system has been acquired;

4. Agencies should solicit public comments to clarify concerns and answer outstanding questions; and

5. Governments should provide enhanced due process mechanisms for affected individuals or communities to challenge inadequate assessments or unfair, biased, or otherwise harmful system uses that agencies have failed to mitigate or correct….(More)”.

Algorithmic Impact Assessment (AIA) framework

White Paper by the World Economic Forum: “For individuals, legal entities and devices alike, a verifiable and trusted identity is necessary to interact and transact with others.

The concept of identity isn’t new – for much of human history, we have used evolving credentials, from beads and wax seals to passports, ID cards and birth certificates, to prove who we are. The issues associated with identity proofing – fraud, stolen credentials and social exclusion – have challenged individuals throughout history. But, as the spheres in which we live and transact have grown, first geographically and now into the digital economy, the ways in which humans, devices and other entities interact are quickly evolving – and how we manage identity will have to change accordingly.

As we move into the Fourth Industrial Revolution and more transactions are conducted digitally, a digital representation of one’s identity has become increasingly important; this applies to humans, devices, legal entities and beyond. For humans, this proof of identity is a fundamental prerequisite to access critical services and participate in modern economic, social and political systems. For devices, their digital identity is critical in conducting transactions, especially as devices will be able to transact relatively independently of humans in the near future. For legal entities, the current state of identity management consists of inefficient manual processes that could benefit from new technologies and architecture to support digital growth.

As the number of digital services, transactions and entities grows, it will be increasingly important to ensure the transactions take place in a secure and trusted network where each entity can be identified and authenticated. Identity is the first step of every transaction between two or more parties.

Over the ages, the majority of transactions between two identities have mostly been viewed in relation to the validation of a credential (“Is this genuine information?”), verification (“Does the information match the identity?”) and authentication of an identity (“Does this human/thing match the identity? Are you really who you claim to be?”). These questions have not changed over time; only the methods have changed. This paper explores the challenges with current identity systems and the trends that will have significant impact on identity in the future….(More)”.

Digital Identity: On the Threshold of a Digital Identity Revolution

Working Paper by Gary King and Nathaniel Persily: “The mission of the academic social sciences is to understand and ameliorate society’s greatest challenges. The data held by private companies hold vast potential to further this mission. Yet, because of its interaction with highly politicized issues, customer privacy, proprietary content, and differing goals of firms and academics, these data are often inaccessible to university researchers.

We propose here a new model for industry-academic partnerships that addresses these problems via a novel organizational structure: Respected scholars form a commission which, as a trusted third party, receives access to all relevant firm information and systems, and then recruits independent academics to do research in specific areas following standard peer review protocols organized and funded by nonprofit foundations.

We also report on a partnership we helped forge under this model to make data available about the extremely visible and highly politicized issues surrounding the impact of social media on elections and democracy. In our partnership, Facebook will provide privacy-preserving data and access; seven major politically and substantively diverse nonprofit foundations will fund the research; and the Social Science Research Council will oversee the peer review process for funding and data access….(More)”.

A New Model for Industry-Academic Partnerships

Book edited by Barbara Kożuch, Sławomir J. Magala and Joanna Paliszkiewicz: “This book brings together the theory and practice of managing public trust. It examines the current state of public trust, including a comprehensive global overview of both the research and practical applications of managing public trust by presenting research from seven countries (Brazil, Finland, Poland, Hungary, Portugal, Taiwan, Turkey) from three continents. The book is divided into five parts, covering the meaning of trust, types, dimension and the role of trust in management; the organizational challenges in relation to public trust; the impact of social media on the development of public trust; the dynamics of public trust in business; and public trust in different cultural contexts….(More)”.

Managing Public Trust

Paper by Liz Richardson & Catherine Durose & Beth Perry in Politics and Governance: “There are many critiques of existing forms of urban governance as not fit for purpose. However, what alternatives might look like is equally contested. Coproduction is proposed as a response to address complex wicked issues. Achieving coproduction is a highly complex and daunting task. Bottom up approaches to the initiation of coproduced governance are seen as fruitful, including exemplification of utopian alternatives through local practices. New ways of seeing the role of conflict in participation are needed, including ways to institutionalise agonistic participatory practices. Coproduction in governance drives demands for forms of knowledge production that are themselves coproductive. New urban governing spaces need to be coproduced through participative transformation requiring experimentation and innovation in re-designing urban knowledge architectures. Future research in this field is proposed which is nuanced, grounded in explicit weightings of different democratic values, and which mediates between recognition of contingency and the ability to undertake comparative analysis….(More)”.

Coproducing Urban Governance

Michael Bernick at Forbes: “In January 2017, we the constituents of Wikimedia, started an ambitious discussion about our collective future. We reflected on our past sixteen years together and imagined the impact we could have in the world in the next decades. Our aim was to identify a common strategic direction that would unite and inspire people across our movement on our way to 2030, and help us make decisions.”…

The final documents included a strategic direction and a research report, “Wikimedia 2030: Wikimedia’s Role in Shaping the Future of the Information Commons”: an expansive look at Wikimedia, knowledge, technologies, and communications in the next decade. It includes thoughtful sections on Demographics (global population trends, and Wikimedia’s opportunities for growth), Emerging Platforms (how Wikimedia platforms will be accessed), Misinformation (how content creators and technologists can work toward a product that is trustworthy), Literacy (changing forms of learning that can benefit from the Wikimedia movement), and the core Wikimedia issues of Open Knowledge and knowledge as a service.

Among its goals, the document calls for greater outreach to areas outside of Europe and North America (which now account for 63% of Wikimedia’s total traffic), and widening the knowledge and experiential bases of contributors. It urges greater access through mobile devices and other emerging hardware; and expanding partnerships with libraries, museums, galleries and archives.

The document captures not only the idealism of the enterprise, but also why Wikimedia can be described as a movement, not only an enterprise. It calls into question conventional wisdoms of how our political and business structures should operate.

Consider the Wikimedia editing process that seeks to reach common ground on contentious issues. Lisa Gruwell, the Chief Advancement Officer of the Wikimedia Foundation, notes that in the development of an article, often editors with diverging claims and views will weigh in.  Rather than escalating divisions, the process of editing has been found to reduce these divisions. Gruwell explains,

Through the collaborative editing process, the editors have critical discussions about what reliable sources say about a topic. They have to engage and defend their own perspectives about how an article should be represented, and ultimately find some form of common ground with other editors.

A number of researchers at Harvard Business School led by Shane Greenstein, Yuan Gu and Feng Zhu set out to study this phenomenon. Their findings, published in 2017 as a Harvard Business School working paper, found that editors with different political viewpoints tended to dialogue with each other, and over time reduce rather than increase partisanship….(More)”.

The Power Of The Wikimedia Movement Beyond Wikimedia
