From smart city to open city: Lessons from Jakarta Smart City


Putri, D.A., CH Karlina, M., Tanaya, J., at the Centre for Innovation Policy and Governance, Indonesia: "In 2011, Indonesia started its Open Government journey when, along with seven other countries, it initiated the Open Government Partnership. Following the global declaration, Indonesia launched Open Government Indonesia (OGI) in January 2012 with the aim of introducing open government reforms, including open data. This initiative is supported by Law No. 14/2008 on Freedom of Information. Despite its early stage, the implementation of Open Government in Indonesia has shown promising developments, with three action plans enacted in the last four years. In the Southeast Asian region, Indonesia could be considered a pioneer in implementing the open data initiative at national as well as sub-national levels. In some cases, the open data initiative at sub-national level has even surpassed progress at the national level. Jakarta, for example, became the first city to have its own gubernatorial bylaw on data and system management, which requires the city administration and agencies to open their public data, thus leading to the birth of open data initiatives in the city. The city also has Jakarta Smart City, which connects sub-district officials with citizens. Jakarta Smart City is an initiative that promotes openness in government through public service delivery. This paper aims to take a closer look at the dynamics of citizen-generated data in Jakarta and how the Jakarta Smart City program contributes to the implementation of open data….(More)"

Open data and its usability: an empirical view from the Citizen’s perspective


Paper by Weerakkody, V., Irani, Z., Kapoor, K. et al. in Information Systems Frontiers: "Government legislation and calls for greater levels of oversight and transparency are leading public bodies to publish their raw datasets online. Policy makers and elected officials anticipate that the accessibility of open data through online Government portals for citizens will enable public engagement in policy making through increased levels of fact-based content elicited from open data. The usability and benefits of such open data are argued to contribute positively towards public sector reforms, which are under extreme pressures driven by extended periods of austerity. However, there are very few scholarly studies that have attempted to empirically evaluate the performance of government open data websites and the acceptance and use of these data from a citizen perspective. Given this research void, an adjusted diffusion of innovation model based on Rogers' diffusion of innovations theory (DOI) is proposed and used in this paper to empirically determine the predictors influencing the use of public sector open data. A good understanding of these predictors affecting the acceptance and use of open data will likely assist policy makers and public administrations in determining the policy instruments that can increase the acceptance and use of open data through an active promotion campaign to engage-contribute-use….(More)"

Use of big data risks making some people uninsurable


Oliver Ralph at the Financial Times: "More sophisticated use of data could create an "underclass" of people who cannot afford insurance. According to a new report from the Chartered Insurance Institute, consumers could miss out on some types of cover altogether if insurers deem them too risky.

Big data is one of the insurance industry's great hopes for the future. Established insurers and a host of start-ups are investing millions in new systems to better understand the information they hold about customers, and to collect more data. They hope that by better analysing the risks that each policyholder faces, they can not only price their products more accurately but also advise customers on how to avoid problems.

However, the CII paper warns that using data in this way threatens the concept of pooling risk on which the industry was founded.

“Data is a double-edged sword,” said David Thomson, director of policy and public affairs at the CII. “The insurance sector needs to be careful about moving away from pooled risk into individual pricing. They need to think about the broader public interest.”

The report says that the concept of pooling risk “underpins the effectiveness of insurance cover”.

It adds: “Some people may be identified as such high risk to insurers that they are priced out of insurance altogether. Big data could, in effect, create groups of ‘uninsurable’ people. While in some cases this may be to do with modifiable behaviour, like driving style, it could easily be due to factors that people can’t control, such as where they live, age, genetic conditions or health problems.”

The issue of genetic data is a particularly contentious one.

In theory, genetic data could be useful to insurers when deciding how to price life or health insurance. Because of the ethical questions this poses, an agreement signed in 2000 between the government and the Association of British Insurers stops the industry from using predictive genetic test results. The agreement runs until 2019, although a review is due this year.

“You could price people out of the market for health products. There’s a danger insurers will not offer health cover to some people. The government would intervene if people are doing social sorting,” said Mr Thomson.

Better use of data in other areas has already forced the government to act. Improved mapping and data analysis have allowed insurers to more accurately assess which homes and businesses run a high risk of flooding. Many people complained that the resulting prices made cover unaffordable for people living in areas at risk….(More)”.

Inside Government: The role of policy actors in shaping e-democracy in the UK


Thesis by Mary Houston: "The thesis focuses on the emergence of e-democracy in the UK between 1999 and 2013. It examines the part that policy actors have played in shaping the agenda. Emphasis is placed on how e-democracy is understood by those charged with developing initiatives and implementing government policy on e-democracy. Previous research on e-democracy has focused largely on the impact of Web technologies on political systems and/or on how, why and to what degree citizens participate. Less attention is paid to what happens inside government, in how policy actors conceive of public engagement in the policy process. Their perceptions and shared understandings are crucial to the commissioning, implementation, or deflection of participatory opportunities. This thesis is concerned with exploring how policy actors experience, interpret and negotiate e-democracy policy and practices and their perceptions of citizen involvement in the policy process. Competing discourses shape institutional expectations of e-democracy in the UK. The research examines how policy actors draw upon wider discourses such as the modernisation of government and the emphasis on transparency. It analyses understandings of technologies in government and the effects of relational interactions and linkages in policy and practice….(More)"

Democracy Does Not Cause Growth: The Importance of Endogeneity Arguments


IADB Working Paper: "This article challenges recent findings that democracy has sizable effects on economic growth. As extensive political science research indicates that economic turmoil is responsible for causing or facilitating many democratic transitions, the paper focuses on this endogeneity concern. Using a worldwide survey of 165 country-specific democracy experts conducted for this study, the paper separates democratic transitions into those occurring for reasons related to economic turmoil, here called endogenous, and those grounded in reasons more exogenous to economic growth. The behavior of economic growth following these more exogenous democratizations strongly indicates that democracy does not cause growth. Consequently, the common positive association between democracy and economic growth is driven by endogenous democratization episodes (i.e., due to faulty identification)….(More)"

Data at the Speed of Life


Marc Gunther at The Chronicle of Philanthropy: “Can pregnant women in Zambia be persuaded to deliver their babies in hospitals or clinics rather than at home? How much are villagers in Cambodia willing to pay for a simple latrine? What qualities predict success for a small-scale entrepreneur who advises farmers?

Governments, foundations, and nonprofits that want to help the world’s poor regularly face questions like these. Answers are elusive. While an estimated $135 billion in government aid and another $15 billion in charitable giving flow annually to developing countries, surprisingly few projects benefit from rigorous evaluations. Those that do get scrutinized in academic studies often don’t see the results for years, long after the projects have ended.

IDinsight puts data-driven research on speed. Its goal is to produce useful, low-cost research results fast enough that nonprofits can use them to make midcourse corrections to their programs….

IDinsight calls this kind of research “decision-focused evaluation,” which sets it apart from traditional monitoring and evaluation (M&E) and academic research. M&E, experts say, is mostly about accountability and outputs — how many training sessions were held, how much food was distributed, and so on. Usually, it occurs after a program is complete. Academic studies are typically shaped by researchers’ desire to break new ground and publish on topics of broad interest. The IDinsight approach aims instead “for contemporaneous decision-making rather than for publication in the American Economic Review,” says Ruth Levine, who directs the global development program at the William and Flora Hewlett Foundation.

A decade ago, Ms. Levine and William Savedoff, a senior fellow at the Center for Global Development, wrote an influential paper entitled “When Will We Ever Learn? Improving Lives Through Impact Evaluation.” They lamented that an “absence of evidence” for the effectiveness of global development programs “not only wastes money but denies poor people crucial support to improve their lives.”

Since then, impact evaluation has come a “huge distance,” Ms. Levine says….

Actually, others are. Innovations for Poverty Action recently created the Goldilocks Initiative to do what it calls "right fit" evaluations leading to better policy and programs, according to Thoai Ngo, who leads the effort. Its first clients include GiveDirectly, which facilitates cash transfers to the extreme poor, and Splash, a water charity….

All this focus on data has generated pushback. Many nonprofits don’t have the resources to do rigorous research, according to Debra Allcock Tyler, chief executive at Directory of Social Change, a British charity that provides training, data, and other resources for social enterprises.

“A great deal of the time, data is pointless,” Allcock Tyler said last year at a London seminar on data and nonprofits. “Very often it is dangerous and can be used against us, and sometimes it takes away precious resources from other things that we might more usefully do.”

A bigger problem may be that the accumulation of knowledge does not necessarily lead to better policies or practices.

“People often trust their experience more than a systematic review,” says Ms. Levine of the Hewlett Foundation. IDinsight’s Esther Wang agrees. “A lot of our frustration is looking at the development world and asking why are we not accountable for the money that we are spending,” she says. “That’s a waste that none of us really feels is justifiable.”…(More)”

Mapping and Comparing Responsible Data Approaches


New report by Jos Berens, Ulrich Mans and Stefaan Verhulst: “Recent years have witnessed something of a sea-change in the way humanitarian organizations consider and use data. Growing awareness of the potential of data has led to new enthusiasm and new, innovative applications that seek to respond to and mitigate crises in fresh ways. At the same time, it has become apparent that the potential benefits are accompanied by risks. A new framework is needed that can help balance the benefits and risks, and that can aid humanitarian organizations and others (e.g., policymakers) develop a more responsible approach to data collection and use in their efforts to combat natural and man-made crises around the world. …

The report we are releasing today, "Mapping and Comparing Responsible Data Approaches", attempts to guide the first steps toward such a framework by learning from current approaches and principles. It is the outcome of a joint research project commissioned by UNOCHA and conducted in collaboration between the GovLab at NYU and Leiden University. In an effort to better understand the landscape, we have considered existing data use policies and principles from 17 organizations. These include 7 UN agencies, 7 International Organizations, 2 government agencies and 1 research institute. Our study of these organizations' policies allowed us to extract a number of key takeaways that, together, amount to something like a roadmap for responsible data use for any humanitarian organization considering using data in new ways.

We began our research by closely mapping the existing responsible data use policies. To do this, we developed a template with eight broad themes that determine the key ingredients of a responsible data framework. This use of a consistent template across organizations permits us to study and compare the 17 data use policies in a structured and systematic manner. Based on this template, we were able to extract 7 key takeaways for what works best when using data in a humanitarian context – presented in the conclusion to the paper being released today. They are designed to be general enough to be broadly applicable, yet specific enough to be operational and actually usable….(More)"

OpenData.Innovation: an international journey to discover innovative uses of open government data


Nesta: "This paper by Mor Rubinstein (Open Knowledge International) and Josh Cowls and Corinne Cath (Oxford Internet Institute) explores the methods and motivations behind innovative uses of open government data in five specific country contexts – Chile, Argentina, Uruguay, Israel, and Denmark – and considers how the insights it uncovers might be adopted in a UK context.

Through a series of interviews with ‘social hackers’ and open data practitioners and experts in countries with recognised open government data ‘hubs’, the authors encountered a diverse range of practices and approaches in how actors in different sectors of society make innovative uses of open government data. This diversity also demonstrated how contextual factors shape the opportunities and challenges for impactful open government data use.

Based on insights from these international case studies, the paper offers a number of recommendations – around community engagement, data literacy and practices of opening data – which aim to support governments and citizens unlock greater knowledge exchange and social impact through open government data….(More)”

Privacy concerns in smart cities


Liesbet van Zoonen in Government Information Quarterly: "In this paper a framework is constructed to hypothesize if and how smart city technologies and urban big data produce privacy concerns among the people in these cities (as inhabitants, workers, visitors, and otherwise). The framework is built on the basis of two recurring dimensions in research about people's concerns about privacy: one dimension represents that people perceive particular data as more personal and sensitive than others; the other represents that people's privacy concerns differ according to the purpose for which data is collected, with the contrast between service and surveillance purposes being paramount. These two dimensions produce a 2 × 2 framework that hypothesizes which technologies and data-applications in smart cities are likely to raise people's privacy concerns, ranging from raising hardly any concern (impersonal data, service purpose) to raising controversy (personal data, surveillance purpose). Specific examples from the city of Rotterdam are used to further explore and illustrate the academic and practical usefulness of the framework. It is argued that the general hypothesis of the framework offers clear directions for further empirical research and theory building about privacy concerns in smart cities, and that it provides a sensitizing instrument for local governments to identify the absence, presence, or emergence of privacy concerns among their citizens….(More)"

Crowdsourcing privacy policy analysis: Potential, challenges and best practices


Paper: "Privacy policies are supposed to provide transparency about a service's data practices and help consumers make informed choices about which services to entrust with their personal information. In practice, those privacy policies are typically long and complex documents that are largely ignored by consumers. Even for regulators and data protection authorities, privacy policies are difficult to assess at scale. Crowdsourcing offers the potential to scale the analysis of privacy policies with microtasks, for instance by assessing how specific data practices are addressed in privacy policies or extracting information about data practices of interest, which can then facilitate further analysis or be provided to users in more effective notice formats. Crowdsourcing the analysis of complex privacy policy documents to non-expert crowdworkers poses particular challenges. We discuss best practices, lessons learned and research challenges for crowdsourcing privacy policy analysis….(More)"