Reimagining Democracy: What if votes were a currency? A crypto-currency?


Opinion piece by Praphul Chandra: “… The first key tenet of this article is that the institution of representative democracy is a severely limited realization of democratic principles. These limitations span three dimensions:

First, citizen representation is extremely limited. The number of individuals whose preferences an elected representative is supposed to represent is so large as to be essentially meaningless.

The problem is exacerbated in a rapidly urbanizing world with increasing population densities but without a corresponding increase in the number of representatives. Furthermore, since urban settings often have individuals from very different cultural backgrounds, their preferences are diverse too.

Is it realistic to expect that a single individual would be able to represent the preferences of such large & diverse communities?

Second, elected representatives have limited accountability. The only opportunity that citizens have to hold elected representatives accountable is often years away — ample time for incidents to be forgotten and perceptions to be manipulated. Since human memory over-emphasizes the recent past, elected representatives manipulate perceptions of their performance with populist measures closer to forthcoming elections.

Third, citizen cognition is not leveraged. The current model, where default participation is limited to choosing representatives every few years, does not engage the intelligence of citizens in solving the societal challenges we face today. Instead, it treats citizens as consumers, offering them a menu card from which to choose their favourite representative.

To summarize, representative democracy does not scale well. With our societies becoming denser, more interconnected and more complex, the traditional tools of democracy are no longer effective.

Design Choices of Representative Democracy: Consider the following thought experiment: what would happen if we think of votes as a currency? Let’s call such a voting currency — GovCoin. In today’s representative democracy,

(i) GovCoins are in short supply — one citizen gets one GovCoin (vote) every 4–5 years.

(ii) GovCoins (Votes) have a very high negative interest rate: if you do not use them on election day, they lose all value.

(iii) GovCoins (Votes) are “accepted” by very few people: you can give your GovCoins only to pre-selected “candidates”.

These reflect fundamental design choices of representative democracy — choices that were well suited to the time when they were made:

Since governance needs continuity and elections were a costly and time-consuming exercise, citizens elected representatives once every 4–5 years. This also meant that elections had to be coordinated — so participation was coordinated around a particular election day, requiring citizens to vote simultaneously.

Since the number of people who were interested in politics as a full-time profession was limited, the choice set of representatives was limited to a few candidates.

Are these design choices valid today? Do we really need citizens physically travelling to polling booths? With today’s technology? Must the choice of citizen participation in governance be binary: either jump in full time or be limited to vote once every 4–5 years? Aren’t there other forms of participation in this spectrum? Is limiting participation the only way to ensure governance continuity?

Rethinking Democracy: What if we reconsider the design choices of democracy? Let’s say we:

(i) increase the supply of GovCoins so that every citizen gets one unit every month;

(ii) relax the negative rate so that even if you do not “use” your GovCoin, you do not lose it, i.e. you can accumulate GovCoins and use them at a later time;

(iii) enable you to give your GovCoins to anyone or any public issue / project.

What would be the impact of these design choices?

By increasing the supply of GovCoins, we inject liquidity into the system so that information (about citizens’ preferences & beliefs) can flow more fluidly. This effectively increases the participation potential of citizens in governance. Rather than limiting participation to once every 4–5 years, citizens can participate as much and as often as they want. This is a fundamental change when we consider institutions as information processing systems.

By enabling citizens to transfer GovCoins to anyone, we realize a form of liquid democracy where I can delegate my influence to you — maybe because I trust your judgement and believe that your choice will be beneficial to me as well. In effect, we have changed the default option of participation from ‘opt out’ to ‘opt in’ — every citizen can receive GovCoins from every other citizen. The total GovCoins a citizen holds is a measure of how much influence she holds in democratic decisions. We evolve from a binary system (elected representative or citizen) to a continuous spectrum where your GovCoin ‘wealth’ is a measure of your social capital.

By enabling citizens to transfer GovCoins directly to a policy decision, we realize a form of direct democracy where citizens can express their preferences (and the strength of their preferences) on an issue directly rather than relying on a representative to do so.

By allowing citizens to accumulate GovCoins, we allow them to participate when they want. If I feel strongly about an issue, I can spend my GovCoins and influence this decision; if I am indifferent about an issue, I hold on to my GovCoins so that I can have a larger influence on future decisions. A small negative interest rate on GovCoins may still be needed to ensure that (i) citizens do not hoard the currency and (ii) the net influence of any individual is finite and time-bounded.
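
To make these mechanics concrete, consider a minimal sketch of such a ledger in Python. It is entirely hypothetical (the article proposes no implementation) and models the relaxed design choices above: monthly issuance, delegation to other citizens, direct support for public issues, and a small demurrage rate:

```python
from collections import defaultdict

class GovCoinLedger:
    """Toy model of the voting currency sketched above; all parameters are hypothetical."""

    def __init__(self, monthly_issue=1.0, demurrage=0.01):
        self.balances = defaultdict(float)       # citizen -> GovCoins held
        self.issue_support = defaultdict(float)  # public issue/project -> GovCoins committed
        self.monthly_issue = monthly_issue       # design choice (i): one unit per citizen per month
        self.demurrage = demurrage               # design choice (ii), relaxed: a small negative rate

    def new_month(self, citizens):
        # Each citizen receives one GovCoin; unspent balances decay slightly,
        # so influence can be accumulated but stays finite and time-bounded.
        for c in citizens:
            self.balances[c] = self.balances[c] * (1 - self.demurrage) + self.monthly_issue

    def delegate(self, sender, receiver, amount):
        # Liquid democracy: transfer influence to anyone whose judgement you trust.
        assert self.balances[sender] >= amount, "insufficient GovCoins"
        self.balances[sender] -= amount
        self.balances[receiver] += amount

    def support(self, citizen, issue, amount):
        # Direct democracy: spend GovCoins on an issue, expressing both a
        # preference and its strength.
        assert self.balances[citizen] >= amount, "insufficient GovCoins"
        self.balances[citizen] -= amount
        self.issue_support[issue] += amount

ledger = GovCoinLedger()
ledger.new_month(["alice", "bob"])        # both now hold 1.0 GovCoin
ledger.delegate("alice", "bob", 0.5)      # alice delegates half her influence to bob
ledger.support("bob", "bike-lanes", 1.5)  # bob spends his combined influence on an issue
```

With the illustrative 1% monthly demurrage used here, a hoarded GovCoin would lose roughly 11% of its value per year (0.99^12 ≈ 0.886), gently discouraging hoarding without wiping out unspent votes the way election-day expiry does.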

Realizing Democracy: Given today’s technology landscape, realizing a democracy with new design choices is no longer a pipe dream. The potential to do this is here and now. A key enabling technology is blockchains (or Distributed Ledger Technologies) which allow the creation of new currencies. Implementing votes as a currency opens the door to realizing new forms of democracy….(More)”.

Digital platforms for facilitating access to research infrastructures


New OECD paper: “Shared research infrastructures are playing an increasingly important role in most scientific fields and represent a significant proportion of the total public investment in science. Many of these infrastructures have the potential to be used outside of their traditional scientific domain and outside of the academic community, but this potential is often not fully realised. A major challenge for potential users (and for policy-makers) is simply identifying what infrastructures are available under what conditions.

This report includes an analysis of 8 case studies of digital platforms that collate information and provide services to promote broader access to, and more effective use of, research infrastructures. Although there is considerable variety amongst the cases, a number of key issues are identified that can help guide policy-makers, funders, institutions and managers who are interested in developing or contributing to such platforms….(More)”.

Big Data and medicine: a big deal?


V. Mayer-Schönberger and E. Ingelsson in the Journal of Internal Medicine: “Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks, rather than conventional statistical methods, resulting in systems that over time capture insights implicit in data but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research.

Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data’s role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval….(More)”.

How Tenants Use Digital Mapping to Track Bad Landlords and Gentrification


Hannah Norman at Yes! Magazine: “When Teresa Salazar first encountered the notice posted to her front door—which offered tenants $10,000 to move out of their East Oakland, California, apartment building—she knew the place she called home was in jeopardy.

“All of us were surprised and afraid because it is not easy to move to some other place when the rents are so high,” Salazar said in a video produced by the Anti-Eviction Mapping Project (AEMP). The project uses mapping as well as data analysis and digital storytelling as organizing tools for low-income tenants to combat eviction and displacement amid the Bay Area’s raging housing crisis.

The jarring move-out offer was left by the Bay Area Property Group, founded by landlord attorney Daniel Bornstein—known for holding landlord workshops on how to evict tenants. The property management firm buys and flips apartment buildings, Salazar said, driving gentrification in neighborhoods like hers. In fear of being displaced, Salazar and other tenants from her building met with counselors from Causa Justa :: Just Cause, a community legal services group. There, they learned about their rights under Oakland’s Just Cause of Eviction Ordinance. With this information, they successfully stood their ground and remained in their homes.

But not all Bay Area tenants are as fortunate as Salazar. Between 2005 and 2015, Oakland witnessed 32,402 unlawful detainers, or eviction proceedings, according to data obtained by AEMP through record requests. AEMP hopes to change these statistics by arming tenants and housing advocates with map-based data to fight evictions and displacements and, ultimately, drive local and state policies on the issue. In addition to mapping, AEMP uses videos of tenants like Salazar to raise awareness of the human experience behind jaw-dropping statistics.

The project is part of a rising tide of social justice cartography, where maps are being harnessed for activism as the technology becomes more accessible….(More)”.

Artificial intelligence and smart cities


Essay by Michael Batty at Urban Analytics and City Sciences: “…The notion of the smart city of course conjures up images of such an automated future. Much of our thinking about this future, certainly in the more popular press, concerns everything from the latest app on our smartphones to driverless cars, while somewhat deeper concerns are about efficiency gains due to the automation of services ranging from transit to the delivery of energy. There is no doubt that routine and repetitive processes – algorithms if you like – are improving at an exponential rate in terms of the data they can process and the speed of execution, faithfully following Moore’s Law.

Pattern recognition techniques that lie at the basis of machine learning are highly routinized iterative schemes where the pattern in question – be it a signature, a face, the environment around a driverless car and so on – is computed as an elaborate averaging procedure which takes a series of elements of the pattern and weights them in such a way that the pattern can be reproduced perfectly by the combinations of elements of the original pattern and the weights. This is in essence the way neural networks work. When one says that they ‘learn’, and that the current focus is on ‘deep learning’, all that is meant is that with complex patterns and environments, many layers of neurons (elements of the pattern) are defined and the iterative procedures are run until there is convergence with the pattern that is to be explained. Such processes are iterative, additive and not much more than sophisticated averaging, but using machines that can operate virtually at the speed of light and thus process vast volumes of big data. When these kinds of algorithms can be run in real time, and many already can be, there is the prospect of many kinds of routine behaviour being displaced. It is in this sense that AI might herald an era of truly disruptive processes. This, according to Brynjolfsson and McAfee, is beginning to happen as we reach the second half of the chessboard.
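
This “sophisticated averaging” can be made concrete with a minimal sketch, not from the essay itself: a set of weights is adjusted iteratively until their weighted combination reproduces a target pattern, which is the core update behind the kind of learning Batty describes (here a single linear layer; deep learning stacks many such layers).

```python
import numpy as np

# Illustrative only: elements of a pattern are combined with weights, and the
# weights are nudged repeatedly until the combination reproduces the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # 100 observed patterns, 5 elements each
true_w = np.array([0.5, -1.0, 2.0, 0.0, 1.5])
y = X @ true_w                           # the "pattern" to be reproduced

w = np.zeros(5)                          # initial weights
lr = 0.01                                # step size for each iterative update
for step in range(1000):                 # iterative, additive updates
    error = X @ w - y                    # how far the reproduction is from the pattern
    w -= lr * (X.T @ error) / len(y)     # adjust each weight toward convergence
    if np.abs(error).mean() < 1e-6:      # stop once the pattern is reproduced
        break

print(np.round(w, 3))                    # converges toward the underlying weights
```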

The real issue in terms of AI involves problems that are peculiarly human. Much of our work is highly routinized, and many of our daily actions and decisions are based on relatively straightforward patterns of stimulus and response. The big question is the extent to which those of our behaviours that are not straightforward can be automated. In fact, although machines are able to beat human players in many board games, and there is now the prospect of machines beating the very machines that were originally designed to play against humans, the real power of AI may well come from collaboratives of man and machine working together, rather than from ever more powerful machines working by themselves. In the last 10 years, some of my editorials have tracked what is happening in the real-time city – the smart city as it is popularly called – which has become key to many new initiatives in cities. In fact, cities – particularly big cities, world cities – have become the flavour of the month, but the focus has not been on their long-term evolution but on how we use them on a minute-by-minute to week-by-week basis.

Many of the patterns that define the smart city on these short-term cycles can be predicted using AI, largely because they are highly routinized, but even for highly routine patterns there are limits on the extent to which we can explain and reproduce them. Much advancement in AI within the smart city will come from automation of the routine: the use of energy, the delivery of location-based services, transit using information fed to operators and travellers in real time, and so on. I think we will see some quite impressive advances in these areas in the next decade and beyond. But the key issue in urban planning is not just this short term but the long term, and it is here that the prospects for AI are more problematic….(More)”.

Toward Information Justice


Book by Jeffrey Alan Johnson: “…presents a theory of information justice that subsumes the question of control and relates it to other issues that influence just social outcomes. Data does not exist by nature. Bureaucratic societies must provide standardized inputs for governing algorithms, a problem that can be understood as one of legibility. This requires, though, converting what we know about social objects and actions into data, narrowing the many possible representations of the objects to a definitive one using a series of translations. Information thus exists within a nexus of problems, data, models, and actions that the social actors constructing the data bring to it.

This opens information to analysis from social and moral perspectives, while the scientistic view leaves us blind to the gains from such analysis—especially to the ways that embedded values and assumptions promote injustice. Toward Information Justice answers a key question for the 21st Century: how can an information-driven society be just?

Many of those concerned with the ethics of data focus on control over data, and argue that if data is only controlled by the right people then just outcomes will emerge. There are serious problems with this control metaparadigm, however, especially related to the initial creation of data and prerequisites for its use. This text is suitable for academics in the fields of information ethics, political theory, philosophy of technology, and science and technology studies, as well as policy professionals who rely on data to reach increasingly problematic conclusions about courses of action….(More)”.

Open Banking ‘revolution’ to challenge banks’ dominance


Shane Hickey at the Guardian: “This week sees the beginning of a quiet revolution in banking which some have championed as one of the greatest shake-ups in personal finance in years, while others have warned it could have serious implications for people’s private data.

It’s the start of a new series of rules concerning “open banking”, where customers will be able to share their personal financial information with companies other than their bank, opening up opportunities to get better deals on mortgages and overdrafts and to compare insurance and broadband deals.

For eager enthusiasts, it will revolutionise banking, make the system more competitive, and give consumers access to the best products for them.

For the more sceptical, among them consumer groups, it could cause problems with the security of previously private data….

Financial data about how you spend your money, how often you are overdrawn and other details are currently held by your bank.

Under the new rules, the ownership of this data will essentially be transferred to the consumer, meaning that account holders will be able to give companies, other than their own bank, permission to access their details. Underlying this new regulation, which will spread across Europe this month, are EU rules that mean financial institutions must let customers share their data easily and securely. Extra measures are being taken in the UK to push through the changes, with the setting up of standards so data can be shared securely….
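
From a third party’s perspective, this permissioned sharing might look like the following sketch. The endpoint URLs and field names here are invented for illustration; the real UK standard builds on OAuth 2.0-style authorization, but no specific bank’s API is assumed:

```python
import requests

# Hypothetical endpoints; no real bank API is referenced here.
BANK_AUTH_URL = "https://bank.example/oauth2/token"
BANK_API_URL = "https://bank.example/open-banking/v1"

def fetch_transactions(client_id, client_secret, customer_consent_code):
    # 1. The third party exchanges the customer's explicit consent for a
    #    scoped, time-limited access token. The bank, not the third party,
    #    authenticates the customer and records the consent.
    token = requests.post(BANK_AUTH_URL, data={
        "grant_type": "authorization_code",
        "code": customer_consent_code,
        "client_id": client_id,
        "client_secret": client_secret,
    }).json()["access_token"]

    # 2. The token grants access only to the data the customer agreed to share,
    #    so the third party never sees the customer's banking credentials.
    resp = requests.get(
        f"{BANK_API_URL}/accounts/transactions",
        headers={"Authorization": f"Bearer {token}"},
    )
    return resp.json()
```

The design point this illustrates is that the customer, rather than the bank, decides which company gets access, and that access is mediated by revocable tokens rather than shared passwords.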

While consumers may feel empowered that their data is now in their own hands to do with as they please, with the potential to save money, it could also lead to unease, particularly when dealing with third parties that have brand names they might not recognise, says Shaw.

“One of the things to be mindful of is that consumers could find themselves in a complicated chain of providers. If you authorise one third party to access your money, and if there are potential losses, where does that fall?

“I think data regulators and financial regulators need to be really clear with consumers about how that is going to work. In order for consumers to really engage with this, they need to be confident that there are safeguards in place to protect them. There has been good progress on that.”…(More)”.

Who Owns Urban Mobility Data?


David Zipper at City Lab: “How, exactly, should policymakers respond to the rapid rise of new private mobility services such as ride-hailing, dockless shared bicycles, and microtransit? … The most likely solution is via a data exchange that anonymizes rider data and gives public experts (and perhaps academic and private ones too) the ability to answer policy questions.
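
As a minimal sketch of what such an exchange might do (all names and thresholds here are hypothetical, not drawn from the article), raw rider traces could be reduced to aggregate counts per street segment and hour, with small cells suppressed so that individual riders cannot be re-identified:

```python
from collections import Counter

K_THRESHOLD = 5  # hypothetical minimum count before a cell is released

def aggregate_trips(trips):
    """trips: iterable of (street_segment_id, hour_of_day) from raw rider data."""
    counts = Counter(trips)
    # Only release cells with at least K_THRESHOLD trips (a k-anonymity-style rule),
    # so policymakers see traffic volumes without seeing any individual's movements.
    return {cell: n for cell, n in counts.items() if n >= K_THRESHOLD}

raw = [("seg-42", 8), ("seg-42", 8), ("seg-42", 8), ("seg-42", 8),
       ("seg-42", 8), ("seg-7", 23)]
print(aggregate_trips(raw))  # {('seg-42', 8): 5}; the lone late-night trip is suppressed
```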

This idea is starting to catch on. The World Bank’s OpenTraffic project, founded in 2016, initially developed ways to aggregate traffic information derived from commercial fleets. A handful of private companies like Grab and Easy Taxi pledged their support when OpenTraffic launched. This fall, the project became part of SharedStreets, a collaboration between the National Association of City Transportation Officials (NACTO), the World Resources Institute, and the OECD’s International Transport Forum to pilot new ways of collecting and sharing a variety of public and private transport data….(More)”.

Can Big Data Revolutionize International Human Rights Law?


Galit A. Sarfaty in the Journal of International Law: “International human rights efforts have been overly reliant on reactive tools and focused on treaty compliance, while often underemphasizing the prevention of human rights violations. I argue that data analytics can play an important role in refocusing the international human rights regime on its original goal of preventing human rights abuses, but it comes at a cost.

There are risks in advancing a data-driven approach to human rights, including the privileging of certain rights subject to quantitative measurement and the precipitation of further human rights abuses in the process of preventing other violations. Moreover, the increasing use of big data can ultimately privatize the international human rights regime by transforming the corporation into a primary gatekeeper of rights protection. Such unintended consequences need to be addressed in order to maximize the benefits and minimize the risks of using big data in this field….(More)”.

People-Led Innovation: Toward a Methodology for Solving Urban Problems in the 21st Century


New Methodology by Andrew Young, Jeffrey Brown, Hannah Pierce, and Stefaan G. Verhulst: “More and more people live in urban settings. At the same time, and often resulting from the growing urban population, cities worldwide are increasingly confronted with complex environmental, social, and economic shocks and stresses. When seeking to develop adequate and sustainable responses to these challenges, cities are realizing that traditional methods and existing resources often fall short.

Addressing 21st century challenges will require innovative approaches.

“People-Led Innovation: Toward a Methodology for Solving Urban Problems in the 21st Century” is a new methodology by The GovLab and the Bertelsmann Foundation aimed at empowering public entrepreneurs, particularly city-level government officials, to engage the capacity and expertise of people in solving major public challenges. This guide focuses on unlocking an undervalued asset for innovation and the co-creation of solutions: people and their expertise….

Designed for city officials, and others seeking ways to improve people’s lives, the methodology provides:

  • A phased approach that helps leaders develop solutions iteratively, making them more effective and legitimate by placing people, and groups of people, at the center of all stages of the problem-solving process: problem definition, ideation, experimentation, and iteration.
  • A flexible framework that, instead of rigid prescriptions, provides suggested checklists for probing a more people-led approach when developing innovative solutions to urban challenges.
  • A matrix to determine what kind of engagement (e.g., commenting, co-creating, reviewing, and/or reporting), and by whom (e.g., community-based organizations, residents, foundation partners, among others) is most appropriate at what stage of the innovation lifecycle.
  • A curation of inspirational examples, set at each phase of the methodology, where public entrepreneurs and others have sought to create positive impacts by engaging people in practice….(More)”.