Embracing Innovation in Government: Global Trends


Report by the OECD: “Innovation in government is about finding new ways to impact the lives of citizens, and new approaches to activating them as partners to shape the future together. It involves overcoming old structures and modes of thinking and embracing new technologies and ideas. The potential of innovation in government is immense; however, the challenges governments face are significant. Despite this, governments are transforming the way they work to ensure this potential is met….

Since 2014, the OECD Observatory of Public Sector Innovation (OPSI), an OECD Directorate for Public Governance and Territorial Development (GOV) initiative, has been working to identify the key issues for innovation in government and what can be done to achieve greater impact. To learn from governments on the leading edge of this field, OPSI has partnered with the Government of the United Arab Emirates (UAE) and its Mohammed Bin Rashid Centre for Government Innovation (MBRCGI), as part of the Middle East and North Africa (MENA)-OECD Governance Programme, to conduct a global review of new ways in which governments are transforming their operations and improving the lives of their people, culminating in this report.

Through research and an open Call for Innovations, the review surfaces key trends, challenges, and success factors in innovation today, as well as examples and case studies to illustrate them and recommendations to help support innovation. This report is published in conjunction with the 2017 World Government Summit, which brings together over 100 countries to discuss innovative ways to solve the challenges facing humanity….(More)”

Big data may be reinforcing racial bias in the criminal justice system


Laurel Eckhouse at the Washington Post: “Big data has expanded to the criminal justice system. In Los Angeles, police use computerized “predictive policing” to anticipate crimes and allocate officers. In Fort Lauderdale, Fla., machine-learning algorithms are used to set bond amounts. In states across the country, data-driven estimates of the risk of recidivism are being used to set jail sentences.

Advocates say these data-driven tools remove human bias from the system, making it more fair as well as more effective. But even as they have become widespread, we have little information about exactly how they work. Few of the organizations producing them have released the data and algorithms they use to determine risk.

We need to know more, because it’s clear that such systems face a fundamental problem: The data they rely on are collected by a criminal justice system in which race makes a big difference in the probability of arrest — even for people who behave identically. Inputs derived from biased policing will inevitably make black and Latino defendants look riskier than white defendants to a computer. As a result, data-driven decision-making risks exacerbating, rather than eliminating, racial bias in criminal justice.

Consider a judge tasked with making a decision about bail for two defendants, one black and one white. Our two defendants have behaved in exactly the same way prior to their arrest: They used drugs in the same amounts, committed the same traffic offenses, owned similar homes and took their two children to the same school every morning. But the criminal justice algorithms do not rely on all of a defendant’s prior actions to reach a bail assessment — just those actions for which he or she has been previously arrested and convicted. Because of racial biases in arrest and conviction rates, the black defendant is more likely to have a prior conviction than the white one, despite identical conduct. A risk assessment relying on racially compromised criminal-history data will unfairly rate the black defendant as riskier than the white defendant.

To make matters worse, risk-assessment tools typically evaluate their success in predicting a defendant’s dangerousness on rearrests — not on defendants’ overall behavior after release. If our two defendants return to the same neighborhood and continue their identical lives, the black defendant is more likely to be arrested. Thus, the tool will falsely appear to predict dangerousness effectively, because the entire process is circular: Racial disparities in arrests bias both the predictions and the justification for those predictions.
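
The circularity described here can be made concrete with a short simulation. The sketch below is not from the article; every number in it is an invented assumption. It gives two groups identical underlying behavior and an identical true reoffending rate but different probabilities of being arrested, builds a toy risk score from arrest history alone, and then checks that score against rearrest rather than actual behavior.

```python
# Toy simulation of the feedback loop described above. All probabilities are
# invented assumptions for illustration, not empirical estimates.
import random

random.seed(0)

ARREST_PROB = {"group_a": 0.30, "group_b": 0.15}  # assumed arrest risk per offense
N_PEOPLE = 10_000
PRIOR_OFFENSES = 3      # identical past behavior for everyone
REOFFEND_RATE = 0.40    # identical true reoffending rate for everyone


def simulate(group):
    """Return the group's mean toy risk score and its observed rearrest rate."""
    total_score = 0.0
    rearrests = 0
    for _ in range(N_PEOPLE):
        # The recorded prior history depends on arrest probability, not behavior.
        priors = sum(random.random() < ARREST_PROB[group] for _ in range(PRIOR_OFFENSES))
        total_score += priors / PRIOR_OFFENSES        # toy "risk score"
        reoffends = random.random() < REOFFEND_RATE   # same for both groups
        if reoffends and random.random() < ARREST_PROB[group]:
            rearrests += 1                            # only arrests are observed
    return total_score / N_PEOPLE, rearrests / N_PEOPLE


for group in ARREST_PROB:
    mean_score, rearrest_rate = simulate(group)
    print(f"{group}: mean risk score {mean_score:.2f}, rearrest rate {rearrest_rate:.2f}")
```

Despite identical behavior, the higher-arrest group ends up with roughly double the risk score and roughly double the rearrest rate, so the biased score appears to be validated by the very data that produced it.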

We know that a black person and a white person are not equally likely to be stopped by police: Evidence on New York’s stop-and-frisk policy, investigatory stops, vehicle searches and drug arrests shows that black and Latino civilians are more likely to be stopped, searched and arrested than whites. In 2012, a white attorney spent days trying to get himself arrested in Brooklyn for carrying graffiti stencils and spray paint, a Class B misdemeanor. Even when police saw him tagging the City Hall gateposts, they sped past him, ignoring a crime for which 3,598 people were arrested by the New York Police Department the following year.

Before adopting risk-assessment tools in the judicial decision-making process, jurisdictions should demand that any tool being implemented undergo a thorough and independent peer-review process. We need more transparency and better data to learn whether these risk assessments have disparate impacts on defendants of different races. Foundations and organizations developing risk-assessment tools should be willing to release the data used to build these tools to researchers to evaluate their techniques for internal racial bias and problems of statistical interpretation. Even better, with multiple sources of data, researchers could identify biases in data generated by the criminal justice system before the data is used to make decisions about liberty. Unfortunately, producers of risk-assessment tools — even nonprofit organizations — have not voluntarily released anonymized data and computational details to other researchers, as is now standard in quantitative social science research….(More)”.
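
If producers did release anonymized outcome data, the kind of audit called for here could begin with a simple comparison of error rates across groups. The sketch below is illustrative only: the CSV layout and the column names group, predicted_high_risk and reoffended are assumptions, not the format of any real tool, and a reoffending field measured through rearrest would itself carry the bias discussed above.

```python
# Hedged sketch of a basic disparate-impact audit on hypothetical released data.
# The column names (group, predicted_high_risk, reoffended) are assumptions.
import csv
from collections import defaultdict


def error_rates_by_group(path):
    """Compute false positive and false negative rates per group from a CSV file."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            g = row["group"]
            predicted = row["predicted_high_risk"] == "1"
            actual = row["reoffended"] == "1"   # in practice usually proxied by rearrest
            if actual:
                counts[g]["pos"] += 1
                if not predicted:
                    counts[g]["fn"] += 1
            else:
                counts[g]["neg"] += 1
                if predicted:
                    counts[g]["fp"] += 1
    return {
        g: {
            "false_positive_rate": c["fp"] / c["neg"] if c["neg"] else None,
            "false_negative_rate": c["fn"] / c["pos"] if c["pos"] else None,
        }
        for g, c in counts.items()
    }


# Usage with a hypothetical file:
# print(error_rates_by_group("risk_tool_outcomes.csv"))
```

Large gaps in false positive rates between groups would be evidence of exactly the disparate impact the author wants researchers to be able to test for.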

Managing for Social Impact: Innovations in Responsible Enterprise


Book edited by Mary J. Cronin and Tiziana C. Dearing: “This book presents innovative strategies for sustainable, socially responsible enterprise management from leading thinkers in the fields of corporate citizenship, nonprofit management, social entrepreneurship, impact investing, community-based economic development and urban design. The book’s integration of research and practitioner perspectives with focused best practice examples offers an in-depth, balanced analysis, providing new insights into the social issues that are most relevant to organizational stakeholders. This integrated focus on sustainable social innovation differentiates the book from academic research monographs on stakeholder theory and practitioner guides to managing traditional Corporate Social Responsibility (CSR) programs.

Managing for Social Impact features 15 contributed chapters written by thought leaders, industry analysts, and managers of global and local organizations who are engaged with innovative models of sustainable social impact. The editors also provide a substantive introductory chapter describing a new strategic framework for enhancing the Return on Social Innovation (ROSI) through four pillars of social change: Open Circles, Focused Purpose Sharing, Mutuality of Success, and a Persistent Change Perspective….(More)”.

How to Do Social Science Without Data


Neil Gross in the New York Times: “With the death last month of the sociologist Zygmunt Bauman at age 91, the intellectual world lost a thinker of rare insight and range. Because his style of work was radically different from that of most social scientists in the United States today, his passing is an occasion to consider what might be gained if more members of our profession were to follow his example….

Weber saw bureaucracies as powerful, but dispiritingly impersonal. Mr. Bauman amended this: Bureaucracy can be inhuman. Bureaucratic structures had deadened the moral sense of ordinary German soldiers, he contended, which made the Holocaust possible. They could tell themselves they were just doing their job and following orders.

Later, Mr. Bauman turned his scholarly attention to the postwar and late-20th-century worlds, where the nature and role of all-encompassing institutions were again his focal point. Craving stability after the war, he argued, people had set up such institutions to direct their lives — more benign versions of Weber’s bureaucracy. You could go to work for a company at a young age and know that it would be a sheltering umbrella for you until you retired. Governments kept the peace and helped those who couldn’t help themselves. Marriages were formed through community ties and were expected to last.

But by the end of the century, under pressure from various sources, those institutions were withering. Economically, global trade had expanded, while in Europe and North America manufacturing went into decline; job security vanished. Politically, too, changes were afoot: The Cold War drew to an end, Europe integrated and politicians trimmed back the welfare state. Culturally, consumerism seemed to pervade everything. Mr. Bauman noted major shifts in love and intimacy as well, including a growing belief in the contingency of marriage and — eventually — the popularity of online dating.

In Mr. Bauman’s view, it all connected. He argued we were witnessing a transition from the “solid modernity” of the mid-20th century to the “liquid modernity” of today. Life had become freer, more fluid and a lot more risky. In principle, contemporary workers could change jobs whenever they got bored. They could relocate abroad or reinvent themselves through shopping. They could find new sexual partners with the push of a button. But there was little continuity.

Mr. Bauman considered the implications. Some thrived in this new atmosphere; the institutions and norms previously in place could be stultifying, oppressive. But could a transient work force come together to fight for a more equitable distribution of resources? Could shopping-obsessed consumers return to the task of being responsible, engaged citizens? Could intimate partners motivated by short-term desire ever learn the value of commitment?…(More)”

Citizen Empowerment and Innovation in the Data-Rich City


Book edited by C. Certomà, M. Dyer, L. Pocatilu and F. Rizzi: “… analyzes the ongoing transformation in the “smart city” paradigm and explores the possibilities that technological innovations offer for the effective involvement of ordinary citizens in collective knowledge production and decision-making processes within the context of urban planning and management. To do so, it pursues an interdisciplinary approach, with contributions from a range of experts including city managers, public policy makers, Information and Communication Technology (ICT) specialists, and researchers. The first two parts of the book focus on the generation and use of data by citizens, with or without institutional support, and the professional management of data in city governance, highlighting the social connectivity and livability aspects essential to vibrant and healthy urban environments. In turn, the third part presents inspiring case studies that illustrate how data-driven solutions can empower people and improve urban environments, including enhanced sustainability. The book will appeal to all those who are interested in the required transformation in the planning, management, and operations of data-rich cities and the ways in which such cities can employ the latest technologies to use data efficiently, promoting data access, data sharing, and interoperability….(More)”.

Code-Dependent: Pros and Cons of the Algorithm Age


Lee Rainie and Janna Anderson at the Pew Research Center: “Algorithms are instructions for solving a problem or completing a task. Recipes are algorithms, as are math equations. Computer code is algorithmic. The internet runs on algorithms and all online searching is accomplished through them. Email knows where to go thanks to algorithms. Smartphone apps are nothing but algorithms. Computer and video games are algorithmic storytelling. Online dating and book-recommendation and travel websites would not function without algorithms. GPS mapping systems get people from point A to point B via algorithms. Artificial intelligence (AI) is naught but algorithms. The material people see on social media is brought to them by algorithms. In fact, everything people see and do on the web is a product of algorithms. Every time someone sorts a column in a spreadsheet, algorithms are at play, and most financial transactions today are accomplished by algorithms. Algorithms help gadgets respond to voice commands, recognize faces, sort photos and build and drive cars. Hacking, cyberattacks and cryptographic code-breaking exploit algorithms. Self-learning and self-programming algorithms are now emerging, so it is possible that in the future algorithms will write many if not most algorithms.
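
As a concrete illustration of what the authors mean by everyday algorithms, here is a minimal sketch, with invented titles, ratings and weights, of the scoring-and-sorting step behind something like a book-recommendation list; sorting a spreadsheet column or ranking search results follows the same pattern of applying a rule and ordering the results.

```python
# A toy everyday algorithm: score each item by a simple rule, then sort.
# Titles, ratings and weights are invented for illustration.
books = [
    {"title": "Book A", "avg_rating": 4.6, "bought_by_similar_readers": 120},
    {"title": "Book B", "avg_rating": 4.9, "bought_by_similar_readers": 15},
    {"title": "Book C", "avg_rating": 3.8, "bought_by_similar_readers": 300},
]


def score(book):
    """Blend average rating with popularity among similar readers."""
    return 0.7 * book["avg_rating"] + 0.3 * (book["bought_by_similar_readers"] / 100)


for book in sorted(books, key=score, reverse=True):
    print(book["title"], round(score(book), 2))
```

The choice of rule, here an arbitrary 0.7/0.3 weighting, is where the human judgments and unintended consequences discussed next enter.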

Algorithms are often elegant and incredibly useful tools used to accomplish tasks. They are mostly invisible aids, augmenting human lives in increasingly incredible ways. However, sometimes the application of algorithms created with good intentions leads to unintended consequences. Recent news items tie to these concerns:

A City Is Not a Computer


Shannon Mattern at Places Journal: “…Modernity is good at renewing metaphors, from the city as machine, to the city as organism or ecology, to the city as cyborgian merger of the technological and the organic. Our current paradigm, the city as computer, appeals because it frames the messiness of urban life as programmable and subject to rational order. Anthropologist Hannah Knox explains, “As technical solutions to social problems, information and communications technologies encapsulate the promise of order over disarray … as a path to an emancipatory politics of modernity.” And there are echoes of the pre-modern, too. The computational city draws power from an urban imaginary that goes back millennia, to the city as an apparatus for record-keeping and information management.

We’ve long conceived of our cities as knowledge repositories and data processors, and they’ve always functioned as such. Lewis Mumford observed that when the wandering rulers of the European Middle Ages settled in capital cities, they installed a “regiment of clerks and permanent officials” and established all manner of paperwork and policies (deeds, tax records, passports, fines, regulations), which necessitated a new urban apparatus, the office building, to house its bureaus and bureaucracy. The classic example is the Uffizi (Offices) in Florence, designed by Giorgio Vasari in the mid-16th century, which provided an architectural template copied in cities around the world. “The repetitions and regimentations of the bureaucratic system” — the work of data processing, formatting, and storage — left a “deep mark,” as Mumford put it, on the early modern city.

Yet the city’s informational role began even earlier than that. Writing and urbanization developed concurrently in the ancient world, and those early scripts — on clay tablets, mud-brick walls, and landforms of various types — were used to record transactions, mark territory, celebrate ritual, and embed contextual information in landscape.  Mumford described the city as a fundamentally communicative space, rich in information:

Through its concentration of physical and cultural power, the city heightened the tempo of human intercourse and translated its products into forms that could be stored and reproduced. Through its monuments, written records, and orderly habits of association, the city enlarged the scope of all human activities, extending them backwards and forwards in time. By means of its storage facilities (buildings, vaults, archives, monuments, tablets, books), the city became capable of transmitting a complex culture from generation to generation, for it marshaled together not only the physical means but the human agents needed to pass on and enlarge this heritage. That remains the greatest of the city’s gifts. As compared with the complex human order of the city, our present ingenious electronic mechanisms for storing and transmitting information are crude and limited.

Mumford’s city is an assemblage of media forms (vaults, archives, monuments, physical and electronic records, oral histories, lived cultural heritage); agents (architectures, institutions, media technologies, people); and functions (storage, processing, transmission, reproduction, contextualization, operationalization). It is a large, complex, and varied epistemological and bureaucratic apparatus. It is an information processor, to be sure, but it is also more than that.

Were he alive today, Mumford would reject the creeping notion that the city is simply the internet writ large. He would remind us that the processes of city-making are more complicated than writing parameters for rapid spatial optimization. He would inject history and happenstance. The city is not a computer. This seems an obvious truth, but it is being challenged now (again) by technologists (and political actors) who speak as if they could reduce urban planning to algorithms.

Why should we care about debunking obviously false metaphors? It matters because the metaphors give rise to technical models, which inform design processes, which in turn shape knowledges and politics, not to mention material cities. The sites and systems where we locate the city’s informational functions — the places where we see information-processing, storage, and transmission “happening” in the urban landscape — shape larger understandings of urban intelligence….(More)”

The value of crowdsourcing in public policymaking: epistemic, democratic and economic value


Article in The Theory and Practice of Legislation: “While national and local governments increasingly deploy crowdsourcing in lawmaking as an open government practice, it remains unclear how crowdsourcing creates value when it is applied in policymaking. Therefore, in this article, we examine value creation in crowdsourcing for public policymaking. We introduce a framework for analysing value creation in public policymaking in the following three dimensions: democratic, epistemic and economic. Democratic value is created by increasing transparency, accountability, inclusiveness and deliberation in crowdsourced policymaking. Epistemic value is developed when crowdsourcing serves as a knowledge search mechanism and a learning context. Economic value is created when crowdsourcing makes knowledge search in policymaking more efficient and enables government to produce policies that better address citizens’ needs and societal issues. We show how these tenets of value creation are manifest in crowdsourced policymaking by drawing on instances of crowdsourced lawmaking, and we also discuss the contingencies and challenges preventing value creation…(More)”

Toward a User-Centered Social Sector


Tris Lumley at Stanford Social Innovation Review: “Over the last three years, a number of trends have crystallized that I believe herald the promise of a new phase—perhaps even a new paradigm—for the social sector. I want to explore three of the most exciting, and sketch out where I believe they might take us and why we’d all do well to get involved.

  • The rise of feedback
  • New forms of collaboration
  • Disruption through technology

Taken individually, these three themes are hugely significant in their potential impact on the work of nonprofits and those that invest in them. But viewed together, as interwoven threads, I believe they have the potential to transform both how we work and the underlying fundamental incentives and structure of the social sector.

The rise of feedback

The nonprofit sector is built on a deep and rich history of community engagement. Yet, in a funding market that incentivizes accountability to funders, this strong tradition of listening, engagement, and ownership by primary constituents—the people and communities nonprofits exist to serve—has sometimes faded. Opportunities for funding can drive strategies. Practitioner experience and research evidence can shape program designs. Engagement with service users can become tokenistic, or shallow….

In recognition of this growing momentum, Keystone Accountability and New Philanthropy Capital (NPC) published a paper in 2016 to explore the relationship between impact measurement and user voice. It is our shared belief that many of the recent criticisms of the impact movement—such as impact reporting being used primarily for fundraising rather than improving programs—would be addressed if impact evidence and user voice were seen as two sides of the same coin, and we more routinely sought to synthesize our understanding of nonprofits’ programs from both aspects at once…

New forms of collaboration

As recent critiques of collective impact have pointed out, the social sector has a long history of collaboration. Yet it has not always been the default operating model of nonprofits or their funders. The fragmented nature of the social sector today exposes an urgent imperative for greater focus on collaboration….

Yet the need for greater collaboration and new forms to incentivize and enable it is increasing. Deepening austerity policies, the shrinking of the state in many countries, and the sheer scale of the social issues we face have driven the “demand” side of collaboration. The collective impact movement has certainly been one driver of momentum on the “supply” side, and a number of other forms of collaboration are emerging.

The Young People’s Foundation model, developed in the UK by the John Lyons Charity, is one response to deepening cuts in nonprofit funding. Young People’s Foundations are new organizations that serve three purposes for nonprofits working with young people in the local area—creating a network, leading on collaborative funding bids and contracting processes, and sharing assets across the network.

Elsewhere, philanthropic donors and foundations are increasingly exploring collaboration in practical terms, through pooled grant funds that provide individual donors unrivalled leverage, and that allow groups of funders to benefit from each other’s strengths through coordination and shared strategies. The Dasra Girl Alliance in India is an example of a pooled fund that brings together philanthropic donors and institutional development funders, and fosters collaboration between the nonprofits it supports….

Disruption through technology

Technology might appear an incongruous companion to feedback and collaboration, which are both very human in nature, yet it’s likely to transform our sector….(More)”

State of Open Corporate Data: Wins and Challenges Ahead


Sunlight Foundation: “For many people working to open data and reduce corruption, the past year could be summed up in two words: “Panama Papers.” The transcontinental investigation by a team from the International Consortium of Investigative Journalists (ICIJ) blew open the murky world of offshore company registration. It put corporate transparency high on the agenda of countries all around the world and helped lead to some notable advances in access to official company register data….

While most companies are created and operated for legitimate economic activity, a small percentage are not. Entities involved in corruption, money laundering, fraud and tax evasion frequently use such companies as vehicles for their criminal activity. “The Idiot’s Guide to Money Laundering from Global Witness” shows how easy it is to use layer after layer of shell companies to hide the identity of the person who controls and benefits from the activities of the network. The World Bank’s “Puppet Masters” report found that over 70% of grand corruption cases, in fact, involved the use of offshore vehicles.

For years, OpenCorporates has advocated for company information to be in the public domain as open data, so it is usable and comparable. It was the public reaction to the Panama Papers, however, that made it clear that due diligence requires global data sets and that beneficial ownership registers are key for integrity and progress.

The call for accountability and action was clear from the aftermath of the leak. ICIJ, the journalists involved and advocates have called for tougher action on prosecutions and more transparency measures: open corporate registers and beneficial ownership registers. A series of workshops organized by the B20 showed that business also needed public beneficial ownership registers….

Last year the UK became the first country in the world to collect and publish who controls and benefits from companies in a structured format, and as open data. Just a few days later, we were able to add the information to OpenCorporates. The UK data, therefore, is one of a kind, and has been highly anticipated by transparency skeptics and advocates alike. So far, things are looking good. 15 other countries have committed to having a public beneficial ownership register, including Nigeria, Afghanistan, Germany, Indonesia, New Zealand and Norway. Denmark has announced its first public beneficial ownership data will be published in June 2017. It’s likely to be open data.

This progress isn’t limited to beneficial ownership. It is also being seen in the opening up of corporate registers. These are what OpenCorporates calls “core company data”. In 2016, more countries started releasing company registers as open data, including Japan, with over 4.4 million companies, Israel, Virginia, Slovenia, Texas, Singapore and Bulgaria. We’ve also had a great start to 2017, with France publishing their central company database as open data on January 5th.

As more states have embraced open data, the USA’s average score jumped from 19/100 to 30/100. Singapore rose from 0 to 20. The Slovak Republic went from 20 to 40. Bulgaria went from 35 to 90. Japan rose from 0 to 70 — the biggest increase of the year….(More)”