Book edited by Gil-Garcia, J. Ramon, Pardo, Theresa A., Nam, Taewoo: “This book will provide one of the first comprehensive approaches to the study of smart city governments, with theories and concepts for understanding and researching 21st century city governments and innovative methodologies for the analysis and evaluation of smart city initiatives. The term “smart city” is now generally used to represent efforts that in different ways describe a comprehensive vision of a city for the present and future. A smarter city infuses information into its physical infrastructure to improve conveniences, facilitate mobility, add efficiencies, conserve energy, improve the quality of air and water, identify problems and fix them quickly, recover rapidly from disasters, collect data to make better decisions, deploy resources effectively, and share data to enable collaboration across entities and domains. These and other similar efforts are expected to make cities more intelligent in terms of efficiency, effectiveness, productivity, transparency, and sustainability, among other important aspects. Given this changing social, institutional, and technological environment, it seems feasible and desirable to attain smarter cities and, by extension, smarter governments: virtually integrated, networked, interconnected, responsive, and efficient. This book will help build the bridge between sound research and practical expertise in the area of smarter cities, and will be of interest to researchers and students in e-government, public administration, political science, communication, information science, administrative sciences and management, sociology, computer science, and information technology, as well as government officials and public managers, who will find practical recommendations based on rigorous studies containing insights and guidance for the development, management, and evaluation of complex smart city and smart government initiatives.…(More)”
The Impact of Open Data
GovLab/Omidyar Network: “…share insights gained from our current collaboration with Omidyar Network on a series of open data case studies. These case studies – 19, in total – are designed to provide a detailed examination of the various ways open data is being used around the world, across geographies and sectors, and to draw some over-arching lessons. The case studies are built from extensive research, including in-depth interviews with key participants in the various open data projects under study….
Ways in which open data impacts lives
Broadly, we have identified four main ways in which open data is transforming economic, social, cultural and political life, and hence improving people’s lives.
- First, open data is improving government, primarily by helping tackle corruption, improving transparency, and enhancing public services and resource allocation.
- Open data is also empowering citizens to take control of their lives and demand change; this dimension of impact is mediated by more informed decision making and new forms of social mobilization, both facilitated by new ways of communicating and accessing information.
- Open data is also creating new opportunities for citizens and groups, by stimulating innovation and promoting economic growth and development.
- Finally, open data is playing an increasingly important role in solving big public problems, primarily by allowing citizens and policymakers to engage in new forms of data-driven assessment and data-driven engagement.
Enabling Conditions
While these are the four main ways in which open data is driving change, we have seen wide variability in the amount and nature of impact across our case studies. Put simply, some projects are more successful than others; or some projects might be more successful in a particular dimension of impact, and less successful in others.
As part of our research, we have therefore tried to identify some enabling conditions that maximize the positive impact of open data projects. These four stand out:
- Open data projects are most successful when they are built not from the efforts of single organizations or government agencies, but when they emerge from partnerships across sectors (and even borders). The roles of intermediaries (e.g., the media and civil society groups) and of “data collaboratives” are particularly important.
- Several of the projects we have seen have emerged on the back of what we might think of as an open data public infrastructure – i.e., the technical backend and organizational processes necessary to enable the regular release of potentially impactful data to the public.
- Clear open data policies, including well-defined performance metrics, are also essential; policymakers and political leaders have an important role in creating an enabling (yet flexible) legal environment that includes mechanisms for project assessments and accountability, as well as providing the type of high-level political buy-in that can empower practitioners to work with open data.
- We have also seen that the most successful open data projects tend to be those that target a well-defined problem or issue. In other words, projects with maximum impact often meet a genuine citizen need.
Challenges
Impact is also determined by the obstacles and challenges that a project confronts. Some regions and some projects face a greater number of hurdles. These also vary, but we have found four challenges that appear most often in our case studies:
- Projects in countries or regions with low capacity or “readiness” (indicated, for instance, by low Internet penetration rates or hostile political environments) typically fare less well.
- Projects that are unresponsive to feedback and user needs are less likely to succeed than those that are flexible and able to adapt to what their users want.
- Open data often exists in tension with privacy and security risks; often, a project’s impact is limited or harmed when it fails to take these risks into account and mitigate them.
- Although open data projects are often “hackable” and cheap to get off the ground, the most successful do require investments – of time and money – after their launch; inadequate resource allocation is one of the most common reasons for a project to fail.
These lists of impacts, enabling factors and challenges are, of course, preliminary. We continue to refine our research and will include a final set of findings along with our final report….(More)
Open Budget Data: Mapping the Landscape
Jonathan Gray at Open Knowledge: “We’re pleased to announce a new report, “Open Budget Data: Mapping the Landscape” undertaken as a collaboration between Open Knowledge, the Global Initiative for Financial Transparency and the Digital Methods Initiative at the University of Amsterdam.
The report offers an unprecedented empirical mapping and analysis of the emerging issue of open budget data, which has appeared as ideals from the open data movement have begun to gain traction amongst advocates and practitioners of financial transparency.
In the report we chart the definitions, best practices, actors, issues and initiatives associated with the emerging issue of open budget data in different forms of digital media.
In doing so, our objective is to enable practitioners – in particular civil society organisations, intergovernmental organisations, governments, multilaterals and funders – to navigate this developing field and to identify trends, gaps and opportunities for supporting it.
How public money is collected and distributed is one of the most pressing political questions of our time, influencing the health, well-being and prospects of billions of people. Decisions about fiscal policy affect everyone, determining everything from the resourcing of essential public services, to the capacity of public institutions to take action on global challenges such as poverty, inequality or climate change.
Digital technologies have the potential to transform the way that information about public money is organised, circulated and utilised in society, which in turn could shape the character of public debate, democratic engagement, governmental accountability and public participation in decision-making about public funds. Data could play a vital role in tackling the democratic deficit in fiscal policy and in supporting better outcomes for citizens….(More)”
Governance Networks in the Public Sector
New book by E.H. Klijn and J. Koppenjan: “Governance Networks in the Public Sector presents a comprehensive study of governance networks and the management of complexities in network settings. Public, private and non-profit organizations are increasingly faced with complex, wicked problems when making decisions, developing policies or delivering services in the public sector. These activities take place in networks of interdependent actors guided by diverging and sometimes conflicting perceptions and strategies. As a result these networks are dominated by cognitive, strategic and institutional complexities. Dealing with these complexities requires sophisticated forms of coordination: network governance.
This book presents the most recent theoretical and empirical insights into governance networks. It provides a conceptual framework and analytical tools to study the complexities involved in handling wicked problems in governance networks in the public sector. The book also discusses strategies and management recommendations for governments, business and third sector organisations operating in and governing networks….(More)”
A new journal wants to publish your research ideas
ScienceInsider: “Do you have a great idea for a study that you want to share with the world? A new journal will gladly publish it. Research Ideas and Outcomes (RIO) will also publish papers on your methods, workflows, data, reports, and software—in short, “all outputs of the research cycle.” RIO, an open-access (OA) journal, was officially launched today and will start accepting submissions in November.
“We’re interested in making the full process of science open,” says RIO founding editor Ross Mounce, a researcher at the Natural History Museum in London. Many good research proposals fall by the wayside because funding agencies have limited budgets, Mounce says; RIO is a way to give them another chance. Mounce hopes that funders will use the journal to spot interesting new projects.
Publishing proposals can also help create links between research teams, Mounce says. “Let’s say you’re going to Madagascar for 6 months to sample turtle DNA,” he suggests. “If you can let other researchers know ahead of time, you can agree to do things together.”
RIO’s idea to publish research proposals is “exactly what we need if we really want to have open science,” says Iryna Kuchma, the OA program manager at the nonprofit organization Electronic Information for Libraries in Rome. Pensoft, the publishing company behind RIO, is a “strong open-access publishing venue” that has proven its worth with more than a dozen journals in the biodiversity field, Kuchma says.
The big question is, of course: Will researchers want to share promising ideas, at the risk that rivals run with them?…(More)”
A data revolution is underway. Will NGOs miss the boat?
Opinion by Sophia Ayele at Oxfam: “The data revolution has arrived. … The UN has even launched a Data Revolution Group (to ensure that the revolution penetrates into international development). The Group’s 2014 report suggests that harnessing the power of newly available data could ultimately lead to, “more empowered people, better policies, better decisions and greater participation and accountability, leading to better outcomes for people and the planet.”
But where do NGOs fit in?
Over the last two decades, NGOs have been collecting increasing amounts of research and evaluation data, largely driven by donor demands for more rigorous evaluations of programs. The quality and efficiency of data collection has also been enhanced by mobile data collection. However, a quick scan of UK development NGOs reveals that few, if any, are sharing the data that they collect. This means that NGOs are generating dozens (if not hundreds) of datasets every year that aren’t being fully exploited and analysed. Working on tight budgets, with limited capacity, it’s not surprising that NGOs often shy away from sharing data without a clear mandate.
But change is in the air. Several donors have begun requiring NGOs to publicise data and others appear to be moving in that direction. Last year, USAID launched its Open Data Policy which requires that grantees “submit any dataset created or collected with USAID funding…” Not only does USAID stipulate this requirement, it also hosts this data on its Development Data Library (DDL) and provides guidance on anonymisation to depositors. Similarly, the Gates Foundation’s 2015 Open Access Policy stipulates that, “Data underlying published research results will be accessible and open immediately.” However, they are allowing a two-year transition period… Here at Oxfam, we have been exploring ways to begin sharing research and evaluation data. We aren’t being required to do this – yet – but we realise that the data that we collect is a public good with the potential to improve lives through more effective development programmes and to raise the voices of those with whom we work. Moreover, organizations like Oxfam can play a crucial role in highlighting issues facing women and other marginalized communities that aren’t always captured in national statistics. Sharing data is also good practice and would increase our transparency and accountability as an organization.
However, Oxfam also bears a huge responsibility to protect the rights of the communities that we work with. This involves ensuring informed consent when gathering data, so that communities are fully aware that their data may be shared, and de-identifying data to a level where individuals and households cannot be easily identified.
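The kind of de-identification described here can be sketched in a few lines: drop direct identifiers, coarsen quasi-identifiers, and suppress records that would still sit in very small groups (a simple k-anonymity-style check). The field names, age bands, and threshold below are illustrative assumptions, not Oxfam's actual procedure.

```python
def deidentify(records, k=3):
    """Drop direct identifiers, coarsen quasi-identifiers, and suppress any
    record whose coarsened group has fewer than k members, so individuals
    and households cannot be easily singled out."""
    DIRECT_IDENTIFIERS = {"name", "phone", "gps"}  # hypothetical field names

    coarsened = []
    for r in records:
        r = {f: v for f, v in r.items() if f not in DIRECT_IDENTIFIERS}
        r["age_band"] = f"{(r.pop('age') // 10) * 10}s"  # e.g. 34 -> "30s"
        coarsened.append(r)

    # Count how many records share each (district, age_band) combination.
    counts = {}
    for r in coarsened:
        key = (r["district"], r["age_band"])
        counts[key] = counts.get(key, 0) + 1

    # Keep only records whose group is large enough to hide an individual.
    return [r for r in coarsened if counts[(r["district"], r["age_band"])] >= k]
```

Real de-identification of household survey data involves considerably more care (rare attribute combinations, small geographies, linkage with other published datasets); this sketch only illustrates the shape of the procedure.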
As Oxfam has outlined in our recently adopted Responsible Data Policy, “Using data responsibly is not just an issue of technical security and encryption but also of safeguarding the rights of people to be counted and heard, ensuring their dignity, respect and privacy, enabling them to make an informed decision and protecting their right to not be put at risk… (More)”
The Art of Managing Complex Collaborations
Eric Knight, Joel Cutcher-Gershenfeld, and Barbara Mittleman at MIT Sloan Management Review: “It’s not easy for stakeholders with widely varying interests to collaborate effectively in a consortium. The experience of the Biomarkers Consortium offers five lessons on how to successfully navigate the challenges that arise….
Society’s biggest challenges are also its most complex. From shared economic growth to personalized medicine to global climate change, few of our most pressing problems are likely to have simple solutions. Perhaps the only way to make progress on these and other challenges is by bringing together the important stakeholders on a given issue to pursue common interests and resolve points of conflict.
However, it is not easy to assemble such groups or to keep them together. Many initiatives have stumbled and disbanded. The Biomarkers Consortium might have been one of them, but this consortium beat the odds, in large part due to the founding parties’ determination to make it work. Nine years after it was founded, this public-private partnership, which is managed by the Foundation for the National Institutes of Health and based in Bethesda, Maryland, is still working to advance the availability of biomarkers (biological indicators for disease states) as tools for drug development, including applications at the frontiers of personalized medicine.
The Biomarkers Consortium’s mandate — to bring together, in the group’s words, “the expertise and resources of various partners to rapidly identify, develop, and qualify potential high-impact biomarkers particularly to enable improvements in drug development, clinical care, and regulatory decision-making” — may look simple. However, the reality has been quite complex. The negotiations that led to the consortium’s formation in 2006 were complicated, and the subsequent balancing of common and competing interests remains challenging….
Many in the biomedical sector had seen the need to tackle drug discovery costs for a long time, with multiple companies concurrently spending millions, sometimes billions, of dollars only to hit common dead ends in the drug development process. In 2004 and 2005, then National Institutes of Health director Elias Zerhouni convened key people from the U.S. Food and Drug Administration, the NIH, and the Pharmaceutical Research and Manufacturers of America to create a multistakeholder forum.
Every member knew from the outset that their fellow stakeholders represented many divergent and sometimes opposing interests: large pharmaceutical companies, smaller entrepreneurial biotechnology companies, FDA regulators, NIH science and policy experts, university researchers and nonprofit patient advocacy organizations….(More)”
The Silo Effect – The Peril of Expertise and the Promise of Breaking Down Barriers
Book by Gillian Tett: “From award-winning columnist and journalist Gillian Tett comes a brilliant examination of how our tendency to create functional departments—silos—hinders our work…and how some people and organizations can break those silos down to unleash innovation.
One of the characteristics of industrial age enterprises is that they are organized around functional departments. This organizational structure results in both limited information and restricted thinking. The Silo Effect asks these basic questions: why do humans working in modern institutions collectively act in ways that sometimes seem stupid? Why do normally clever people fail to see risks and opportunities that later seem blindingly obvious? Why, as psychologist Daniel Kahneman put it, are we sometimes so “blind to our own blindness”?
Gillian Tett, journalist and senior editor for the Financial Times, answers these questions by plumbing her background as an anthropologist and her experience reporting on the financial crisis in 2008. In The Silo Effect, she shares eight different tales of the silo syndrome, spanning Bloomberg’s City Hall in New York, the Bank of England in London, Cleveland Clinic hospital in Ohio, UBS bank in Switzerland, Facebook in San Francisco, Sony in Tokyo, the BlueMountain hedge fund, and the Chicago police. Some of these narratives illustrate how foolishly people can behave when they are mastered by silos. Others, however, show how institutions and individuals can master their silos instead. These are stories of failure and success.
From ideas about how to organize office spaces to how to lead teams of people with disparate expertise, Tett lays bare the silo effect, explaining how the ways people organize themselves, interact with each other, and imagine the world can take hold of an organization and lead it from institutional blindness to 20/20 vision. – (More)”
5 Tips for Designing a Data for Good Initiative
Mitul Desai at Mastercard Center for Inclusive Growth: “The transformative impact of data on development projects, captured in the hashtag #DATARevolution, offers the social and private sectors alike a rallying point to enlist data in the service of high-impact development initiatives.
To help organizations design initiatives that are authentic to their identity and capabilities, we’re sharing what’s necessary to navigate the deeply interconnected organizational, technical and ethical aspects of creating a Data for Good initiative.
1) Define the need
At the center of a Data for Good initiative are the individual beneficiaries you are seeking to serve. This is the foundation on which the “Good” of Data for Good rests.
Understanding the data and expertise needed to better serve such individuals will bring into focus the areas where your organization can contribute and the partners you might engage. As we’ve covered in past posts, collaboration between agents who bring different layers of expertise to Data for Good projects is a powerful formula for change….
2) Understand what data can make a difference
Think about what kind of data can tell a story that’s relevant to your mission. Claudia Perlich of Dstillery says: “The question is first and foremost, what decision do I have to make and which data can tell me something about that decision.” This great introduction to what different kinds of data are relevant in different settings can give you concrete examples.
3) Get the right tools for the job
By one estimate, some 90% of business-relevant data are unstructured or semi-structured (think texts, tweets, images, audio) as opposed to structured data like numbers that easily fit into the lines of a spreadsheet. Perlich notes that while it’s more challenging to mine this unstructured data, it can yield especially powerful insights with the right tools—which thankfully aren’t that hard to identify…
4) Build a case that moves your organization
“While our programs are designed to serve organizations no matter what their capacity, we do find that an organization’s clarity around mission and commitment to using data to drive decision-making are two factors that can make or break a project,” says Jake Porway, founder and executive director of DataKind, a New York-based data science nonprofit that helps organizations develop Data for Good initiatives…..
5) Make technology serve people-centric ethics
The two most critical ethical factors to consider are informed consent and privacy—both require engaging the community you wish to serve as individual actors….
“Employ data-privacy walls, mask the data from the point of collection and encrypt the data you store. Ensure that appropriate technical and organizational safeguards are in place to verify that the data can’t be used to identify individuals or target demographics in a way that could harm them,” recommends Quid’s Pedraza. To understand the technology of data encryption and masking, check out this post. (More)”
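Pedraza's advice to mask data "from the point of collection" can be sketched as replacing raw identifiers with keyed pseudonyms before anything is stored. The field name and the HMAC-based scheme below are illustrative assumptions, not any particular organization's implementation.

```python
import hashlib
import hmac

def pseudonymize(record, secret_key):
    """Replace the raw identifier with a keyed hash (HMAC-SHA256) at the
    point of collection, so the stored record never contains the identifier
    itself. The same person always maps to the same pseudonym (allowing
    longitudinal analysis), but the mapping cannot be reversed without the
    key, which is held separately from the data store."""
    masked = dict(record)  # leave the caller's record untouched
    raw_id = masked.pop("respondent_id").encode()  # hypothetical field name
    masked["pseudonym"] = hmac.new(secret_key, raw_id, hashlib.sha256).hexdigest()[:16]
    return masked
```

A keyed hash rather than a plain one matters here: with a plain SHA-256 of, say, a phone number, anyone could enumerate likely identifiers and match them against the published pseudonyms, whereas the HMAC key acts as the "data-privacy wall" between collection and storage.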
The Last Mile: Creating Social and Economic Value from Behavioral Insights
New book by Dilip Soman: “Most organizations spend much of their effort on the start of the value creation process: namely, creating a strategy, developing new products or services, and analyzing the market. They pay a lot less attention to the end: the crucial “last mile” where consumers come to their website, store, or sales representatives and make a choice.
In The Last Mile, Dilip Soman shows how to use insights from behavioral science in order to close that gap. Beginning with an introduction to the last mile problem and the concept of choice architecture, the book takes a deep dive into the psychology of choice, money, and time. It explains how to construct behavioral experiments and understand the data on preferences that they provide. Finally, it provides a range of practical tools with which to overcome common last mile difficulties.
The Last Mile helps lay readers not only understand behavioral science, but also apply its lessons to their own organizations’ last mile problems, whether they work in business, government, or the nonprofit sector. Appealing to anyone who was fascinated by Dan Ariely’s Predictably Irrational, Richard Thaler and Cass Sunstein’s Nudge, or Daniel Kahneman’s Thinking, Fast and Slow but was not sure how those insights could be practically used, The Last Mile is full of solid, practical advice on how to put the lessons of behavioral science to work….(More)”