SMS texts on corruption help Ugandan voters hold elected councillors accountable at the polls


Paper by Mark T. Buntaine, Ryan Jablonski, Daniel L. Nielson, and Paula M. Pickering: “Many politicians manipulate information to prevent voters from holding them accountable; however, mobile text messages may make it easier for nongovernmental organizations to credibly share information on official corruption that is difficult for politicians to counter directly.

We test the potential for texts on budget management to improve democratic accountability by conducting a large (n = 16,083) randomized controlled trial during the 2016 Ugandan district elections. In cooperation with a local partner, we compiled, simplified, and text-messaged official information on irregularities in local government budgets.

Verified recipients of messages that described more irregularities than expected reported voting for incumbent councillors 6% less often; verified recipients of messages conveying fewer irregularities than expected reported voting for incumbent councillors 5% more often. The messages had no observable effect on votes for incumbent council chairs, potentially due to voters’ greater reliance on other sources of information for higher profile elections.

These mixed results suggest that text messages on budget corruption help voters hold some politicians accountable in settings where elections are not free and fair….(More)”

Prizes are a powerful spur to innovation and breakthroughs


John Thornhill in the Financial Times: “…All too often today we leave research and innovation in the hands of the so-called professionals, often with disappointing results. Winning a prize often matters less than the stimulus it provides for innovators in neighbouring fields. In recent years, there has been an explosion in the number of professional scientists: Unesco estimates that there were 7.8m full-time researchers in 2013.

The number of scientific journals has also increased, making it difficult even for specialists to remain on top of all the latest advances in their field. In spite of this explosion of knowledge and research spending, there has been a striking lack of breakthrough innovations, as economists such as Robert Gordon and Tyler Cowen have noted.

Maybe this is because all the low-hanging technological fruit has been eaten. Or perhaps it is because our research and development methodology has gone awry.

Geoff Mulgan, chief executive of Nesta, is one of those who is trying to revive the concept of prizes as a means of encouraging innovation. His public foundation runs the Challenge Prize Centre, offering awards of up to £10m for innovation in the fields of energy and the environment, healthcare, and community wellbeing. “Setting a specific target, opening up to anyone to meet it, and providing a financial reward if they succeed is the opposite of how most R&D is done,” Mr Mulgan says. “We should all focus more on outcomes than inputs.”…
But these prizes are far from being a panacea. Indeed, they can sometimes lead to perverse results, encouraging innovators to fixate on a single original goal while ignoring serendipitous surprises along the way. Many innovations are the happy byproduct of research rather than its primary outcome. An academic paper on the effectiveness of innovation prizes concluded that they could be a useful addition to the armoury but were no substitute for other proven forms of research and development. The authors also warned that if prizes were poorly designed, managed, and awarded, they could prove “ineffective or even harmful”.

That makes it essential to design competitions in careful and precise detail. It also helps if there are periodic payouts along the way to encourage the most promising ideas. Many companies have embraced the concept of open innovation and increasingly look to collaborate with outside partners to develop fresh ideas, sometimes by means of corporate prizes….(More)”.

Virtualization of government‐to‐citizen engagement process: Enablers and constraints


Paper by Joshua Ofoeda et al.: “The purpose of this study is to investigate the factors that constrain or enable process virtualization in a government‐to‐citizen engagement process. Past research has established that most e‐government projects, especially in developing countries, are regarded as total or partial failures.

Citizens’ unwillingness to use government electronic services and lack of awareness are among some of the reasons why these electronic services fail.

Using the process virtualization theory (PVT) as a theoretical lens, the authors investigated the various activities within the driver license acquisition process at the Driver and Vehicle Licensing Authority.

The PVT helped in identifying factors which enable or inhibit the virtualization of the driver license acquisition process in Ghana. Based on survey data from 317 participants, we report that process characteristics in the form of relationship requirements affect citizens’ willingness to use government virtualized processes. Situating the PVT within a developing country context, our findings reveal that some cultural and behavioral attributes, such as socialization, hinder the virtualization of some activities within the driver licensing process….(More)”.

Sentiment Analysis of Big Data: Methods, Applications, and Open Challenges


Paper by Shahid Shayaa et al. at IEEE: “With the development of IoT technologies and the widespread acceptance of social media tools and applications, new doors of opportunity have been opened for using data analytics to gain meaningful insights from unstructured information. Opinion mining and sentiment analysis (OMSA) has been applied in the era of big data as a useful way to categorize opinion into different sentiments and, more generally, to evaluate the mood of the public. Moreover, different OMSA techniques have been developed over the years on different datasets and applied to various experimental settings. In this regard, this study presents a comprehensive systematic literature review that discusses both the technical aspects of OMSA (techniques, types) and its non-technical aspects in the form of application areas. Furthermore, the study highlights both the technical challenges in developing OMSA techniques and the non-technical challenges arising mainly from its application. These challenges are presented as future directions for research….(More)”.
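To make the simplest end of that technique spectrum concrete, here is a minimal lexicon-based sentiment scorer in Python. This is an illustrative sketch only: the tiny word lists and sample posts are invented, and the survey covers far more sophisticated machine-learning approaches to OMSA.

```python
# Minimal lexicon-based sentiment scoring, one classic OMSA technique.
# The lexicon and sample posts below are hypothetical toy data.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "poor", "hate", "terrible", "angry"}

def sentiment(text: str) -> str:
    tokens = text.lower().split()
    # Score = positive hits minus negative hits
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "I love this new service, it is excellent",
    "terrible support, very angry right now",
]
for post in posts:
    print(f"{sentiment(post):>8} | {post}")
```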

On the Rise of FinTechs – Credit Scoring using Digital Footprints


NBER Working Paper by Tobias Berg, Valentin Burg, Ana Gombović and Manju Puri: “We analyze the information content of the digital footprint – information that people leave online simply by accessing or registering on a website – for predicting consumer default. Using more than 250,000 observations, we show that even simple, easily accessible variables from the digital footprint equal or exceed the information content of credit bureau (FICO) scores. Furthermore, the discriminatory power for unscorable customers is very similar to that of scorable customers. Our results have potentially wide implications for financial intermediaries’ business models, for access to credit for the unbanked, and for the behavior of consumers, firms, and regulators in the digital sphere….(More)”.
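As a rough illustration of the mechanics (not the authors' actual model or variables), a handful of binary footprint features can feed a standard logistic regression to score default risk:

```python
# Illustrative default-risk scoring from digital-footprint variables.
# Features, data, and model choice are hypothetical, not the paper's.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [orders_from_mobile, email_contains_real_name, orders_at_night]
X = np.array([
    [1, 1, 0],
    [0, 1, 0],
    [1, 0, 1],
    [0, 0, 1],
    [1, 1, 1],
    [0, 1, 1],
])
y = np.array([0, 0, 1, 1, 0, 1])  # 1 = defaulted on payment

model = LogisticRegression().fit(X, y)
new_applicant = [[1, 0, 1]]
print("predicted default probability:", model.predict_proba(new_applicant)[0, 1])
```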

Blockchain Ethical Design Framework


Report by Cara LaPointe and Lara Fishbane: “There are dramatic predictions about the potential of blockchain to “revolutionize” everything from worldwide financial markets and the distribution of humanitarian assistance to the very way that we recognize human identity for billions of people around the globe. Some dismiss these claims as excessive technology hype, citing flaws in the technology or the robustness of incumbent solutions and infrastructure.

The reality will likely fall somewhere between these two extremes across multiple sectors. Where initial applications of blockchain were focused on the financial industry, current applications have rapidly expanded to address a wide array of sectors with major implications for social impact.

This paper aims to demonstrate the capacity of blockchain to create scalable social impact and to identify the elements that need to be addressed to mitigate challenges in its application. We are at a moment when technology is enabling society to experiment with new solutions and business models. Ubiquity and global reach, increased capabilities, and affordability have made technology a critical tool for solving problems, making this an exciting time to think about achieving greater social impact. We can address issues for underserved or marginalized people in ways that were previously unimaginable.

Blockchain is a technology that holds real promise for dealing with key inefficiencies and transforming operations in the social sector and for improving lives. Because of its immutability and decentralization, blockchain has the potential to create transparency, provide distributed verification, and build trust across multiple systems. For instance, blockchain applications could provide the means for establishing identities for individuals without identification papers, improving access to finance and banking services for underserved populations, and distributing aid to refugees in a more transparent and efficient manner. Similarly, national and subnational governments are putting land registry information onto blockchains to create greater transparency and avoid corruption and manipulation by third parties.
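The immutability mentioned above is what makes these registry applications plausible: each block commits to the hash of its predecessor, so silently rewriting an old record breaks every later link. A toy hash chain in Python illustrates the principle (this is a sketch of the data structure only, with no consensus protocol or network, and the land-registry records are hypothetical):

```python
# Toy hash chain illustrating blockchain's tamper-evidence property.
import hashlib, json

def make_block(record: dict, prev_hash: str) -> dict:
    block = {"record": record, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block({"parcel": "A-1", "owner": "Alice"}, prev_hash="0" * 64)
chain = [genesis, make_block({"parcel": "A-1", "owner": "Bob"}, genesis["hash"])]

# Tamper with the earlier record, then re-verify the link:
chain[0]["record"]["owner"] = "Mallory"
recomputed = hashlib.sha256(json.dumps(
    {"record": chain[0]["record"], "prev_hash": chain[0]["prev_hash"]},
    sort_keys=True).encode()).hexdigest()
print("chain still valid:", chain[1]["prev_hash"] == recomputed)  # False
```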

From increasing access to capital, to tracking health and education data across multiple generations, to improving voter records and voting systems, blockchain has countless potential applications for social impact. As developers take on building these types of solutions, the social effects of blockchain can be powerful and lasting. With the potential for such a powerful impact, the design, application, and approach to the development and implementation of blockchain technologies have long-term implications for society and individuals.

This paper outlines why intentionality of design, which is important with any technology, is particularly crucial with blockchain, and offers a framework to guide policymakers and social impact organizations. As social media, cryptocurrencies, and algorithms have shown, technology is not neutral. Values are embedded in the code. How the problem is defined and by whom, who is building the solution, how it gets programmed and implemented, who has access, and what rules are created have consequences, in intentional and unintentional ways. In the applications and implementation of blockchain, it is critical to understand that seemingly innocuous design choices have resounding ethical implications on people’s lives.

This white paper addresses why intentionality of design matters, identifies the key questions that should be asked, and provides a framework to approach the use of blockchain, especially as it relates to social impact. It examines the key attributes of blockchain, its broad applicability as well as its particular potential for social impact, and the challenges in fully realizing that potential. Social impact organizations and policymakers have an obligation to understand the ethical approaches used in designing blockchain technology, especially how they affect marginalized and vulnerable populations….(More)”

My City Forecast: Urban planning communication tool for citizen with national open data


Paper by Y. Hasegawa, Y. Sekimoto, T. Seto, Y. Fukushima et al. in Computers, Environment and Urban Systems: “In urban management, the importance of citizen participation is being emphasized more than ever before. This is especially true in countries where depopulation has become a major concern for urban managers and many local authorities are working on revising city master plans, often incorporating the concept of the “compact city.” In Japan, for example, the implementation of compact city plans means that each local government decides on how to designate residential areas and promotes citizens moving to these areas in order to improve budget effectiveness and the vitality of the city. However, implementing a compact city is possible in various ways. Given that there can be some designated withdrawal areas for budget savings, compact city policies can include disadvantages for citizens. At this turning point for urban structures, citizen–government mutual understanding and cooperation is necessary for every step of urban management, including planning.

Concurrently, along with the recent rapid growth of big data utilization and computer technologies, a new conception of cooperation between citizens and government has emerged. With emerging technologies based on civic knowledge, citizens have started to obtain the power to engage directly in urban management by obtaining information, thinking about their city’s problems, and taking action to help shape the future of their city themselves (Knight Foundation, 2013). This development is also supported by the open government data movement, which promotes the availability of government information online (Kingston, Carver, Evans, & Turton, 2000). CityDashboard is one well-known example of real-time visualization and distribution of urban information. CityDashboard, a web tool launched in 2012 by University College London, aggregates spatial data for cities around the UK and displays the data on a dashboard and a map. These new technologies are expected to enable both citizens and government to see their urban situation in an interface presenting an overhead view based on statistical information.

However, usage of statistics and governmental data is as yet limited in the actual process of urban planning…

To help improve this situation and increase citizen participation in urban management, we have developed a web-based urban planning communication tool that uses open government data to enhance citizen–government cooperation. The main aim of the present research is to evaluate the effect of our system on users’ awareness of and attitude toward the urban situation. We have designed and developed an urban simulation system, My City Forecast (http://mycityforecast.net), that enables citizens to understand how their environment and region are likely to change through urban management in the future (up to 2040)….(More)”.
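The published system runs a detailed simulation, but the core pattern, projecting an open-data indicator forward to 2040, can be sketched with a naive linear trend. The census counts below are invented for illustration and the straight-line model is far cruder than what My City Forecast actually does:

```python
# Naive linear projection of a district's population to 2040.
# Hypothetical census counts; an illustrative sketch only.
census = {2000: 52_000, 2005: 50_100, 2010: 48_300, 2015: 46_200}

years = sorted(census)
n = len(years)
mean_x = sum(years) / n
mean_y = sum(census[y] for y in years) / n
# Ordinary least-squares slope and intercept
slope = (sum((y - mean_x) * (census[y] - mean_y) for y in years)
         / sum((y - mean_x) ** 2 for y in years))
intercept = mean_y - slope * mean_x

for year in range(2020, 2041, 5):
    print(year, round(slope * year + intercept))
```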

Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information


Paper by Guido Noto La Diega: “Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way.

This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy.

The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing.

To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms.

First, copyright and patent exceptions, as well as trade secrets, are discussed.

Second, the GDPR is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision.
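As a concrete, purely hypothetical illustration of “meaningful information about the logic involved,” a controller using a simple linear scoring rule could report each feature's contribution alongside the decision. The weights, features, and threshold below are invented, and real Article 22 compliance requires much more than this sketch (human review, contestation channels, and so on):

```python
# Toy sketch: a linear decision rule that reports per-feature contributions,
# one possible form of "meaningful information about the logic involved".
WEIGHTS = {"income": 0.8, "arrears": -1.5, "tenure_years": 0.3}
THRESHOLD = 1.0

def decide_and_explain(applicant: dict) -> tuple:
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    approved = sum(contributions.values()) >= THRESHOLD
    return approved, contributions

approved, why = decide_and_explain({"income": 2.1, "arrears": 1.0, "tenure_years": 4.0})
print("approved:", approved)
for feature, contribution in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {feature}: {contribution:+.2f}")
```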

Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm.

Only an integrated approach – one that takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights….(More)”.

Who wants to know?: The Political Economy of Statistical Capacity in Latin America


IADB paper by Dargent, Eduardo; Lotta, Gabriela; Mejía-Guerra, José Antonio; Moncada, Gilberto: “Why is there such heterogeneity in the level of technical and institutional capacity of national statistical offices (NSOs)? Although there is broad consensus about the importance of statistical information as an essential input for decision making in the public and private sectors, this does not generally translate into a recognition of the importance of the institutions responsible for the production of data. In the context of the role of NSOs in government and society, this study seeks to explain the variation in regional statistical capacity by comparing historical processes and political economy factors in 10 Latin American countries. To do so, it proposes a new theoretical and methodological framework and offers recommendations to strengthen the institutionality of NSOs….(More)”.

Preprints: The What, The Why, The How.


Center for Open Science: “The use of preprint servers by scholarly communities is definitely on the rise. Many developments in the past year indicate that preprints will be a huge part of the research landscape. Developments with DOIs, changes in funder expectations, and the launch of many new services indicate that preprints will become much more pervasive and reach beyond the communities where they started.

From funding agencies that want to realize impact from their efforts sooner to researchers’ desire to disseminate their research more quickly, the growth of these servers and of the number of works being shared has been substantial. At COS, we already host twenty different organizations’ services via the OSF Preprints platform.

So what’s a preprint and what is it good for? A preprint is a manuscript submitted to a dedicated repository (like OSF Preprints, PeerJ, bioRxiv, or arXiv) prior to peer review and formal publication. Some of those repositories may also accept other types of research outputs, like working papers, posters, or conference proceedings. Getting a preprint out there has a variety of benefits for authors and other stakeholders in the research:

  • They increase the visibility of research, and sooner. While traditional papers can languish in the peer review process for months, even years, a preprint is live the minute it is submitted and moderated (if the service moderates). This means your work gets indexed by Google Scholar and Altmetric, and discovered by more relevant readers than ever before.
  • You can get feedback on your work and make improvements prior to journal submission. Many authors have publicly commented about the recommendations for improvements they’ve received on their preprint that strengthened their work and even led to finding new collaborators.
  • Papers with an accompanying preprint get cited 30% more often than papers without. This research from PeerJ sums it up, but that’s a big benefit for scholars looking to get more visibility and impact from their efforts.
  • Preprints get a permanent DOI, which makes them part of the freely accessible scientific record forever. This means others can rely on that permanence when citing your work in their research (a DOI can even be resolved programmatically, as sketched after this list). It also means that your idea, developed by you, has a “stake in the ground” where potential scooping and intellectual theft are concerned.
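Because the DOI is a stable identifier, a preprint's metadata can be fetched programmatically. Here is a minimal sketch against the public Crossref REST API; the DOI below is a placeholder, not a real record:

```python
# Minimal sketch: resolve a preprint DOI to its metadata via the public
# Crossref REST API. The DOI below is a placeholder, not a real record.
import requests

doi = "10.31219/osf.io/xxxxx"  # hypothetical OSF-style DOI
resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
if resp.ok:
    work = resp.json()["message"]
    print(work.get("title"), "|", work.get("publisher"))
else:
    print("lookup failed:", resp.status_code)
```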

So, preprints can really help lubricate scientific progress. But there are some things to keep in mind before you post. Usually, you can’t post a preprint of an article that’s already been submitted to a journal for peer review. Policies among journals vary widely, so it’s important to check with the journal you’re interested in sending your paper to BEFORE you submit a preprint that might later be published. A good resource for doing this is JISC’s SHERPA/RoMEO database. It’s also a good idea to understand the licensing choices available. At OSF Preprints, we recommend the CC-BY license suite, but you can check choosealicense.com or https://osf.io/6uupa/ for good overviews on how best to license your submissions….(More)”.