Strengthening the Connective Links in Government


John M. Kamensky at the IBM Center for The Business of Government: “Over the past five years, the Obama administration has pursued a host of innovation-fostering initiatives that work to strengthen the connective links among and within federal agencies.

Many factors contribute to the rise of such efforts, including presidential support, statutory encouragement, and an ongoing evolution in the way government does its business. The challenge now is how to solidify the best of them so they remain in place beyond the upcoming 2017 presidential transition.

Increased Use of Collaborative Governance

Dr. Rosemary O’Leary, an astute observer of trends in government, describes how government has steadily increased its use of collaborative approaches in lieu of the traditional hierarchical, bureaucratic approach. According to O’Leary, there are several explanations for this shift:

  • First, “most public challenges are larger than one organization, requiring new approaches to addressing public issues” such as housing, pollution, transportation, and healthcare.
  • Second, collaboration helps to improve the effectiveness and performance of programs “by encouraging new ways of providing services.”
  • Third, technology advances in recent years have helped “organizations and their employees to share information in a way that is integrative and interoperable.”
  • Finally, “citizens are seeking additional avenues for engaging in governance, resulting in new and different forms of collaborative problem solving and decision making.”

Early in his administration, President Barack Obama publicly placed a premium on the use of collaboration. One of his first directives to federal agencies set the tone for how he envisioned his administration would govern, directing agencies to be “collaborative” and “use innovative tools, methods, and systems to cooperate among themselves, across levels of government, and with nonprofits, businesses and individuals.” To that end, the Obama administration undertook a series of supporting actions, including establishing cross-agency priority goals around issues such as reducing veteran homelessness, data sharing, and streamlining the sharing of social media licenses between agencies. Tackling many of these issues successfully involved the transformative intersection of innovation and technology.

In 2010, when Congress passed a series of amendments to the Government Performance and Results Act (GPRA), it provided the statutory basis for a broader, more consistent use of collaboration as a way of implementing policies and programs. These changes put in place a series of administrative processes:

  • The designation of agency and cross-agency priority goals
  • The naming of goal leaders
  • The convening of a set of regular progress reviews

Taken together, these legislative changes embedded the value of collaboration into the administrative fabric of the governing bureaucracy. In addition, the evolution of technology tools and advances in the use of social media have dramatically lowered the technical and bureaucratic barriers to working in a more collaborative environment….(More)”

Slowly but surely, government IT enters the 21st century


Jon Brodkin at Ars Technica: “Government IT departments have a mostly deserved reputation for being behind the times. While private companies keep giving customers new and better ways to buy products and learn about their services, government agencies have generally made it difficult for residents to interact with them via the Internet.

But this is slowly changing, with agencies from the local level to the federal level focusing on fixing broken websites and building new tools for Americans to get what they need from the government….

“Improve Detroit,” a smartphone app launched in April this year using technology from SeeClickFix, has helped Detroiters find out how to get things done. In its first six months of availability, 10,000 complaints were resolved in an average of nine days, “a vast improvement from when problems often languished for years,” the city said in an announcement this month.

Improve Detroit was used to get “more than 3,000 illegal dumping sites cleaned up; 2,092 potholes repaired; 991 complaints resolved related to running water in an abandoned structure; 565 abandoned vehicles removed; 506 water main breaks taken care of; [and] 277 traffic signal issues fixed,” Detroit said….

At the municipal level, Oakland is also planning to pass its hard-earned wisdom on to other cities. “Our goal is to create a roadmap for cities big and small,” Oakland Communications Director Karen Boyd told Ars.

Like Detroit, Oakland partnered with SeeClickFix and Code for America after experiencing tough economic times. “Oakland was particularly hard hit by the mortgage crisis [in 2008], a lot of predatory loans were made to our low-income folks,” Boyd said.

Property tax revenue plummeted and the city lost about a quarter of its government workforce, Boyd said.

“Governments were finding themselves way behind the curve on technology. We looked up and realized this was no longer sensible to try to do more with less. We have to do things differently, and technology is an opportunity,” she said.

Working with Code for America in 2013, Oakland made RecordTrac, a website for requesting public records and tracking records requests. Obtaining government documents is often a convoluted process, but Ars Technica’s own Freedom of Information Act enthusiast Cyrus Farivar told me that RecordTrac “is the best (albeit imperfect) public records process I’ve ever used.”…

One of the best examples of a government agency using the Internet to engage residents comes from NASA. The space agency has had an online presence since the early years of the Web, said Brian Dunbar, who has been the content manager for nasa.gov since 1995.

The website has allowed NASA to distribute huge amounts of photos and videos from missions and broadcast an online TV service. There’s even live video feed of the Earth from the International Space Station.

NASA is all over social media, with nearly 700,000 subscribers to its YouTube channel, 13 million Twitter followers, 13 million Facebook likes, 5.4 million Instagram followers, and a big presence on several other social networks. That’s not even including individuals like astronaut Scott Kelly, who has been tweeting from the International Space Station.

NASA has nearly 100 people editing its website, with content generally capitalizing on current events such as the recent Pluto flyby. NASA gets a lot of feedback when there are video problems, “but we’ve been lucky in that the problems have not been overwhelming in either number or size, and we get a lot of positive feedback from the public,” Dunbar said.

This is all a natural extension of NASA’s core mission because the legislation that created the agency in 1958 charged it “with disseminating information about its programs to the widest extent practicable,” Dunbar said….(More)”

 

How Big Data Could Open The Financial System For Millions Of People


But that’s changing as the poor start leaving data trails on the Internet and on their cell phones. Now that data can be mined for what it says about someone’s creditworthiness, likeliness to repay, and all that hardcore stuff lenders want to know.

“Every time these individuals make a phone call, send a text, browse the Internet, engage social media networks, or top up their prepaid cards, they deepen the digital footprints they are leaving behind,” says a new report from the Omidyar Network. “These digital footprints are helping to spark a new kind of revolution in lending.”

The report, called “Big Data, Small Credit,” looks at the potential to expand credit access by analyzing mobile and smartphone usage data, utility records, Internet browsing patterns and social media behavior….
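The report itself does not spell out any particular model, but the general mechanics it describes (scoring “thin-file” borrowers from footprint signals such as prepaid top-ups or call activity) can be sketched roughly as below. The feature names and training data are illustrative assumptions, not anything taken from the report.

```python
# Illustrative sketch only: scoring "thin-file" borrowers from digital-footprint
# features (e.g. prepaid top-ups, call/text activity). Feature names and data
# are hypothetical; real lenders use far richer signals and validation.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical footprint features for past borrowers with known outcomes
history = pd.DataFrame({
    "monthly_topups":    [2, 8, 5, 1, 9, 6, 3, 7],
    "avg_call_minutes":  [30, 220, 90, 15, 310, 150, 45, 200],
    "mobile_money_txns": [0, 12, 4, 0, 20, 9, 1, 15],
    "repaid":            [0, 1, 1, 0, 1, 1, 0, 1],   # 1 = loan repaid
})

features = ["monthly_topups", "avg_call_minutes", "mobile_money_txns"]
model = LogisticRegression().fit(history[features], history["repaid"])

# Score a new applicant who has no formal credit history
applicant = pd.DataFrame([{"monthly_topups": 6, "avg_call_minutes": 180,
                           "mobile_money_txns": 10}])
print("Estimated repayment probability:",
      round(model.predict_proba(applicant)[0, 1], 2))
```

The point of the sketch is only that behavioural signals can stand in for a missing credit history; the firms described in the report work with far larger datasets and more sophisticated algorithms.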

“In the last few years, a cluster of fast-emerging and innovative firms has begun to use highly predictive technologies and algorithms to interrogate and generate insights from these footprints,” the report says.

“Though these are early days, there is enough to suggest that hundreds of millions of mass-market consumers may not have to remain ‘invisible’ to formal, unsecured credit for much longer.”…(More)

What is Citizensourcing?


Citizensourcing is the crowdsourcing practice applied by governments with the goal of tapping into the collective intelligence of the citizens. Through citizensourcing, governments can collect ideas, suggestions and opinions from their citizens — thereby creating a permanent feedback loop of communication.
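The text names no particular platform design, but as a rough sketch of what that feedback loop amounts to in data terms, a citizensourcing tool essentially stores ideas, lets other citizens vote on them, and records the city’s response. The field names below are purely illustrative assumptions.

```python
# Minimal sketch of a citizensourcing record: an idea, community votes, and the
# city's follow-up. Field names are illustrative, not taken from any real platform.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Idea:
    author: str
    title: str
    description: str
    votes: int = 0
    status: str = "submitted"          # e.g. submitted -> under review -> adopted
    city_responses: List[str] = field(default_factory=list)

    def vote(self) -> None:
        self.votes += 1

    def respond(self, message: str, new_status: str) -> None:
        # The city's reply is what closes the loop back to citizens
        self.city_responses.append(message)
        self.status = new_status

idea = Idea("resident42", "More bike racks", "Add racks near the central market")
idea.vote()
idea.respond("Scheduled for review by the mobility department", "under review")
print(idea.status, idea.votes)
```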

Cities are a powerhouse of collective intelligence. Thanks to modern technologies, the time has come to unlock the wisdom of the crowd.

Yesterday

The current means of engaging citizens in public policy have been in place since the 18th century: town hall meetings, in-person visits, phone calls and bureaucratic forms for submitting an idea. All of these forms of engagement are time-consuming, ineffective and expensive.

Great ideas and valuable feedback get lost because those forms of engagement take too much effort for both citizens and cities. Moreover, communication happens privately between the city government and individual citizens; citizens cannot discuss with one another how they want to improve their city.

Today

Advances in technology have restructured the way societies are organised; we’re living in a digital age in which citizens are connected over networks. This creates unprecedented opportunities for cities to get closer to their citizens and serve them better. In recent years, we’ve seen several cities try to build a strong online presence on social media channels.

Yet, they have discovered that communicating with their citizens over Twitter and Facebook is far from optimal. Messages get lost in the information overload that characterises those platforms, resulting in a lack of structured communication.

Tomorrow

Imagine that your town hall meetings could be held online: open 24/7 and accessible from every possible device. Citizensourcing on a dedicated platform is an inexpensive way for cities to get valuable input in the form of ideas, feedback and opinions from their citizens.

Whereas only a very small proportion of citizens engage in time-consuming offline participation, an online platform allows you to multiply your reach tenfold. You reach an audience of citizens that you couldn’t reach before, which makes an online platform a much-needed complement to the existing offline channels in every city.

When citizens can share their ideas in an easy and fun way and get rewarded for their valuable input, that’s when the wisdom of the crowd gets truly unlocked.

The most direct benefit for cities is clear: crowdsourcing new urban ideas drives superior innovations. At least as important as offering a new channel for proposals is that engagement leads to a better understanding of the different needs citizens have…..

There are, though, several early success stories that show the enormous potential:

  • The Colombian city Medellín has its own crowdsourcing platform, MiMedellín, on which citizens share their solutions to the urban problems the city faces. It has turned out to be a big success: having collected more than 2,300 (!) posted ideas, the government is already developing policies drawing on citizens’ creativity.
  • In the Icelandic capital, Reykjavik, the city council succeeded in getting its citizensourcing website Better Reykjavik used by over 60% of citizens. Since implementing the city platform, Reykjavik has spent €1.9 million on developing more than 200 projects based on ideas from citizens.
  • Paris held a participatory budgeting process, called ‘Madame Mayor, I have an idea’, that brought forward wonderful projects. To name one: after receiving well over 20,000 votes, the city government announced it would invest €2 million in vertical garden projects. Other popular ideas included gardens in schools, neighbourhood recycling centers and co-working spaces for students and entrepreneurs….(More)”

Simpler, smarter and innovative public services


Northern Future Forum: “How can governments deliver services better and more efficiently? This is one of the key questions governments all over the world are constantly dealing with. In recent years, countries have had to cut back government spending at the same time as citizens’ demand for high-quality services is increasing. Public institutions, just like companies, must adapt and develop over time. Rapid technological advancements and societal changes have forced the public sector to reform the way it operates and delivers services. The public sector needs to innovate to adapt and advance in the 21st century.
There are a number of reasons why public sector innovation matters (Potts and Kastelle 2010):

  • The size of the public sector, as a percentage of GDP, makes it a large component of the macro economy in many countries. Public sector innovation can affect productivity growth by reducing input costs, improving organisation and increasing the value of outputs.
  • The need for evolving policy to match evolving economies.
  • The public sector sets the rules of the game for private sector innovation.

As pointed out, there is clearly an imperative to innovate. However, public sector innovation can be difficult, as public services deal with complex problems, face contradictory and diverse demands, and need to respond quickly whilst being transparent and accountable. Public sector innovation has a part to play in growing future economies, but also in developing solutions to the biggest challenges facing most western nations today. These problems won’t be solved without strong leadership from the public sector and governments of the future. These issues are (Pollitt 2013):

  • Demographic change. The effects that the ageing of the general population will have on public services.
  • Climate change.
  • Economic trajectories, especially the effects of the current period of austerity.
  • Technological developments.
  • Public trust in government.
  • The changing nature of politics, with declining party loyalty, personalisation of politics, new parties, more media coverage etc.

According to the publications of national governments, the OECD, World Bank and the big international management consultancies, these issues will have major long-term impacts and implications (Pollitt 2013).
The essence of this background paper is to look at how governments can use innovation to help grow their economies and solve some of the biggest challenges of this generation, and to determine what is essential to make that happen. Firstly, a difficult economic environment in many countries tends to constrain the capacity of governments to deliver quality public services. Fiscal pressures, demographic changes, and diverse public and private demands all challenge traditional approaches and call for a rethinking of the way governments operate. There is a growing recognition that the complexity of the challenges facing the public sector cannot be solved by public sector institutions working alone, and that innovative solutions to public challenges require improved internal collaboration, as well as the involvement of external stakeholders partnering with public sector organisations (OECD 2015 a).
Willingness to solve some of these problems is not enough. The system that most western countries have created is in many ways a barrier to innovation. For instance, the public sector can lack innovative leaders and champions (Bason 2010, European Commission 2013), the way money is allocated, and reward and incentive systems can often hinder innovative performance (Kohli and Mulgan 2010), there may be limited knowledge of how to apply innovation processes and methods (European Commission 2013), and departmental silos can create significant challenges to ‘joined up’ problem solving (Carstensen and Bason 2012, Queensland Public Service Commission 2009).
There is no established definition of innovation in the public sector. However, some common elements have emerged from national and international research projects. The OECD has identified the following characteristics of public sector innovation:

  • Novelty: Innovations introduce new approaches, relative to the context where they are introduced.
  • Implementation: Innovations must be implemented, not remain just an idea.
  • Impact: Innovations aim to result in better public results including efficiency, effectiveness, and user or employee satisfaction.

Public sector innovation does not happen in a vacuum: problems need to be identified; ideas translated into projects which can be tested and then scaled up. For this to happen public sector organisations need to identify the processes and structures which can support and accelerate the innovation activity.
Figure 1. Key components for successful public sector innovation.
The barriers to public sector innovation are, in many ways, also the keys to its success. This background paper discusses four key components of public sector innovation success and the ways to turn them from barriers into supporters of innovation. These framework conditions and policy levers can play a key role in enabling and sustaining the innovation process. The levers are:

  • Institutions. Innovation is likely to emerge from the interactions between different bodies.
  • Human Resources. Build capability, motivate, and provide the right opportunities.
  • Funding. Increase flexibility in allocating and managing financial resources.
  • Regulations. Processes need to be shortened and made more efficient.

Realising the potential of innovation means understanding which factors are most effective in creating the conditions for innovation to flourish, and assessing their relative impact on the capacity and performance of public sector organisations….(More). PDF: Simpler, smarter and innovative public services

Open data, open mind: Why you should share your company data with the world


Mark Samuels at ZDnet: “If information really is the lifeblood of modern organisations, then CIOs could create huge benefits from opening their data to new, creative pairs of eyes. Research from consultant McKinsey suggests that seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data: that is, taking previously proprietary data (often starting with public sector data) and opening up access.

So, should your business consider giving outsiders access to insider information? ZDNet speaks to three experts.

More viewpoints can mean better results

Former Tullow Oil CIO Andrew Marks says debates about the potential openness of data in a private sector context are likely to be dominated by one major concern: information security.

“It’s a perfectly reasonable debate until people start thinking about privacy,” he says. “Putting information at risk, both in terms of customer data and competitive advantage, will be a risk too far for many senior executives.”

But what if CIOs could allay c-suite peers’ concerns and create a new opportunity? Marks points to the Goldcorp Challenge, which saw the mining specialist share its proprietary geological data to allow outside experts to pick likely spots for mining. The challenge, which included prize money of $575,000, helped identify more than 110 sites, 50 per cent of which were previously unknown to the company. The value of gold found through the competition exceeded $6bn. Marks wonders whether other firms could take similarly brave steps.
“There is a period of time when information is very sensitive,” he says. “Once the value of data starts to become finite, then it might be beneficial for businesses to open the doors and to let outsiders play with the information. That approach, in terms of gamification, might lead to the creation of new ideas and innovations.”…

Marks says these projects help prove that, when it comes to data, more is likely to mean different – and possibly better – results. “Whether using big data algorithms or the human touch, the more viewpoints you bring together, the more you increase the chances of success and reduce risk,” he says.

“There is, therefore, always likely to be value in seeking an alternative perspective. Opening access to data means your firm is going to get more ideas, but CIOs and other senior executives need to think very carefully about what such openness means for the business, and the potential benefits.”…

Some leading firms are already taking steps towards openness. Take Christina Scott, chief product and information officer at the Financial Times, who says the media organisation has used data analysts to help push the benefits of information-led insight across the business.

Her team has democratised data in order to make sure that all parts of the organisation can get the information they need to complete their day-to-day jobs. Scott says the approach is best viewed as an open data strategy, but within the safe confines of the existing enterprise firewall. While the tactic is internally focused currently, Scott says the FT is keen to find ways to make the most of external talent in the future.

“We’re starting to consider how we might open data beyond the organisation, too,” she says. “Our data holds a lot of value and insight, including across the metadata we’ve created. So it would be great to think about how we could use that information in a more open way.” Part of the FT’s business includes trade-focused magazines. Scott says opening the data could provide new insight to its B2B customers across a range of sectors. In fact, the firm has already dabbled at a smaller scale.

“We’ve run hackathons, where we’ve exposed our APIs and given people the chance to come up with some new ideas,” she says. “But I don’t think we’ve done as much work on open data as we could. And I think that’s the direction in which better organisations are moving. They recognise that not all innovation is going to happen within the company.”…

CIO Omid Shiraji is another IT expert who recognises that there is a general move towards a more open society. Any executive who expects to work within a tightly defined enterprise firewall is living in cloud cuckoo land, he argues. More to the point, they will miss out on big advantages.
“If you can expose your sources to a range of developers, you can start to benefit from massive innovation,” he says. “You can get really big benefits from opening your data to external experts who can focus on areas that you don’t have the capability to develop internally.”

Many IT leaders would like to open data to outside experts, suggests Shiraji. For CIOs who are keen to expose their sources, he suggests letting small-scale developers take a close look at in-house data silos in an attempt to discover what relationships might exist and what advantages could accrue….(More)”

The big cost of using big data in elections


Michael McDonald, Peter Licari and Lia Merivaki in the Washington Post: “In modern campaigns, buzzwords like “microtargeting” and “big data” are often bandied about as essential to victory. These terms refer to the practice of analyzing (or “microtargeting”) millions of voter registration records (“big data”) to predict who will vote and for whom.

If you’ve ever gotten a message from a campaign, there’s a good chance you’ve been microtargeted. Serious campaigns use microtargeting to persuade voters through mailings, phone calls, knocking on doors, and — in our increasingly connected world — social media.
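Campaigns do not publish their actual models, but the basic mechanics the authors describe (fitting a turnout or support model to voter-file attributes and past behaviour, then scoring every registrant to decide who gets a mailer, a call or a knock) can be sketched as follows. The columns and records here are hypothetical.

```python
# Illustrative microtargeting sketch: predict turnout from voter-file fields.
# Columns and records are hypothetical; real files hold millions of rows and
# many more attributes (vote history, demographics, consumer data, etc.).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

voter_file = pd.DataFrame({
    "age":                [22, 67, 45, 31, 58, 73, 29, 50],
    "years_registered":   [1, 30, 12, 5, 20, 40, 2, 18],
    "voted_last_midterm": [0, 1, 1, 0, 1, 1, 0, 1],
    "voted_this_cycle":   [0, 1, 1, 0, 1, 1, 1, 1],   # known outcome (training label)
})

features = ["age", "years_registered", "voted_last_midterm"]
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(voter_file[features], voter_file["voted_this_cycle"])

# Score voters the campaign has not yet contacted
prospects = pd.DataFrame([
    {"age": 35, "years_registered": 8, "voted_last_midterm": 1},
    {"age": 21, "years_registered": 0, "voted_last_midterm": 0},
])
turnout_scores = model.predict_proba(prospects)[:, 1]
print(turnout_scores)  # campaigns prioritise contact by predicted likelihood to vote
```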

But the big data that fuels such efforts comes at a big price, which can create a serious barrier to entry for candidates and groups seeking to participate in elections — that is, if they are allowed to buy the data at all.

When we asked state election officials about prices and restrictions on who can use their voter registration files, we learned that the rules are unsettlingly arbitrary.

Contrast Arizona and Washington. Arizona sells its statewide voter file for an estimated $32,500, while Washington gives its file away for free. Before jumping to the conclusion that this is a red-state/blue-state thing, consider that Oklahoma gives its file away, too.

A number of states base their prices on a per-record formula, which can massively drive up the price despite the fact that files are often delivered electronically. Alabama sells its records for 1 cent per voter, which yields an approximately $30,000 charge for the lot. Seriously, in this day and age, who prices an electronic database by the record?
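To make the per-record arithmetic concrete, here is a minimal sketch. Only Alabama’s 1-cent rate and the roughly $30,000 total come from the article (implying on the order of three million records); the exact registrant count and the second entry are placeholder assumptions.

```python
# Sketch of how per-record pricing drives up the cost of an electronic file.
# Only Alabama's 1-cent rate comes from the article; the registrant counts
# and the second entry are illustrative placeholders.
state_pricing = {
    # state: (registered voters, price per record in dollars)
    "Alabama": (3_000_000, 0.01),        # roughly $30,000 for the full file
    "ExampleState": (5_000_000, 0.005),  # hypothetical
}

for state, (voters, per_record) in state_pricing.items():
    print(f"{state}: ${voters * per_record:,.0f} for {voters:,} records")
```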

Some states will give more data to candidates than to outside groups. Delaware will provide phone numbers to candidates but not to nonprofit organizations doing nonpartisan voter mobilization.

In some states, the voter file is not even available to the general public. States such as South Carolina and Maryland permit access only to residents who are registered voters. States including Kentucky and North Dakota grant access only to campaigns, parties and other political organizations.

We estimate that it would cost roughly $140,000 for an independent presidential campaign or national nonprofit organization to compile a national voter file, and this would not be a one-time cost. Voter lists frequently change as voters are added and deleted.

Guess who most benefits from all the administrative chaos? Political parties and their candidates. Not only are they capable of raising the vast amounts of money needed to purchase the data, but, adding insult to injury, they sometimes don’t even have to. Some states literally bequeath the data to parties at no cost. Alabama goes so far as to give parties a free statewide copy for every election.

Who is hurt by this? Independent candidates and nonprofit organizations that want to run national campaigns but don’t have deep pockets. If someone like Donald Trump launched an independent presidential run, he could buy the necessary data without much difficulty. But a nonprofit focused on mobilizing low-income voters could be stretched thin….(More)”

Handbook of Digital Politics


Book edited by Stephen Coleman: “Politics continues to evolve in the digital era, spurred in part by the accelerating pace of technological development. This cutting-edge Handbook includes the very latest research on the relationship between digital information, communication technologies and politics.

Written by leading scholars in the field, the chapters explore, in seven parts, theories of digital politics, government and policy, collective action and civic engagement, political talk, journalism, internet governance and new frontiers in digital politics research. The contributors focus on the politics behind the implementation of digital technologies in society today.

All students in the fields of politics, media and communication studies, journalism, science and sociology will find this book to be a useful resource in their studies. Political practitioners seeking digital strategies, as well as web and other digital practitioners wanting to know more about political applications for their work will also find this book to be of interest….(More)”

Testing governance: the laboratory lives and methods of policy innovation labs


Ben Williamson at Code Acts in Education: “Digital technologies are increasingly playing a significant role in techniques of governance in sectors such as education as well as healthcare, urban management, and in government innovation and citizen engagement in government services. But these technologies need to be sponsored and advocated by particular individuals and groups before they are embedded in these settings.

I have produced a working paper entitled Testing governance: the laboratory lives and methods of policy innovation labs which examines the role of innovation labs as sponsors of new digital technologies of governance. By combining resources and practices from politics, data analysis, media, design, and digital innovation, labs act as experimental R&D labs and practical ideas organizations for solving social and public problems, located in the borderlands between sectors, fields and disciplinary methodologies. Labs are making methods such as data analytics, design thinking and experimentation into a powerful set of governing resources. They are, in other words, making digital methods into key techniques for understanding social and public issues, and in the creation and circulation of solutions to the problems of contemporary governance – in education and elsewhere.

The working paper analyses the key methods and messages of the labs field, in particular by investigating the documentary history of Futurelab, a prototypical lab for education research and innovation that operated in Bristol, UK, between 2002 and 2010, and tracing methodological continuities through the current wave of lab development. Centrally, the working paper explores Futurelab’s contribution to the production and stabilization of a ‘sociotechnical imaginary’ of the future of education specifically, and to the future of public services more generally. It offers some preliminary analysis of how such an imaginary was embedded in the ‘laboratory life’ of Futurelab, established through its organizational networks, and operationalized in its digital methods of research and development as well as its modes of communication….(More)”

In post-earthquake Nepal, open data accountability


Deepa Rai at the World Bank blog: “….Following the earthquake, there was an overwhelming response from technocrats and data crunchers to use data visualizations for disaster risk assessment. The Government of Nepal made datasets available through its Disaster Data Portal and many organizations and individuals also pitched in and produced visual data platforms.
However, the use of open data has not been limited to disaster response. It was, and still is, instrumental in tracking how much funding has been received and how it’s being allocated. Through the use of open data, people can make their own analysis based on the information provided online.
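As an illustration of the kind of do-it-yourself analysis described here, and with a made-up file layout since the portal’s actual schema is not given in the post, pledged versus disbursed relief funding can be compared in a few lines:

```python
# Illustrative sketch: comparing pledged vs. disbursed relief funding by donor.
# The donors, figures and column names are hypothetical stand-ins for the kind
# of data published on open portals after the earthquake.
import pandas as pd

funding = pd.DataFrame({
    "donor":     ["Donor A", "Donor B", "Donor C"],
    "pledged":   [50_000_000, 20_000_000, 5_000_000],
    "disbursed": [12_000_000, 20_000_000, 1_000_000],
})

funding["disbursement_rate"] = funding["disbursed"] / funding["pledged"]
print(funding.sort_values("disbursement_rate"))  # who has actually paid out?
```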

Direct Relief, a not-for-profit company, has collected such information, helped gather data from the Prime Minister’s relief fund, and then created infographics which have been useful for the media and for immediate distribution on social platforms. MapJournal’s visual maps became vital during the Post Disaster Needs Assessment (PDNA) to assess and map areas where relief and reconstruction efforts were urgently needed.

Figure: Direct Relief medical relief partner locations in context of population affected and injuries by district (Photo Credit: Data Relief Services).

Open data and accountability
However, the work of open data doesn’t end with relief distribution and disaster risk assessment. It is also hugely impactful in keeping track of how relief money is pledged, allocated, and spent. One such web application, openenet.net, is making this possible by aggregating post-disaster funding data from international and national sources into infographics. “The objective of the system,” reads the website, “is to ensure transparency and accountability of relief funds and resources to ensure that it reaches to targeted beneficiaries. We believe that transparency of funds in an open and accessible manner within a central platform is perhaps the first step to ensure effective mobilization of available resources.”
Four months after the earthquake, Nepali media have already started to report on aid spending — or the lack of it. This has been made possible by the use of open data from the Ministry of Home Affairs (MoHA) and illustrates how critical data is for the effective use of aid money.
Open data platforms emerging after the quakes have been crucial in questioning the accountability of aid provisions and ultimately resulting in more successful development outcomes….(More)”