Code for America Summit YouTube: “Michael Flowers, former Chief Analytics Officer for New York City, explores the organizational opportunities and challenges of supporting an outcomes-driven approach to service delivery and problem-solving in government and what he’s learned from doing this in the NYC Mayor’s Office of Data Analytics…”
Brazilian Government Develops Toolkit to Guide Institutions in both Planning and Carrying Out Open Data Initiatives
The toolkit focuses on the Plano de Dados Abertos – PDA (Open Data Plan) as the guiding instrument where commitments, agenda and policy implementation cycles in the institution are registered. We believe that making each public agency build its own PDA is a way to perpetuate the open data policy, making it a state policy and not just a transitory governmental action.
It is organized to facilitate the implementation of the main activity cycles that must be observed in an institution and provides links and manuals to assist in these activities. Emphasis is given to the actors/roles involved in each step and their responsibilities. It also helps to define a central person to monitor and maintain the PDA. A diagram in the toolkit summarizes the macro steps of implementing an open data policy in an institution.
How to upgrade democracy for the Internet era
TED Talk: “Pia Mancini and her colleagues want to upgrade democracy in Argentina and beyond. Through their open-source mobile platform they want to bring citizens inside the legislative process, and run candidates who will listen to what they say.”
HopeLab
Press Release from the Drucker Institute: “Today, we announced that HopeLab is the winner of the 2014 Peter F. Drucker Award for Nonprofit Innovation.
The judges recognized HopeLab for its pioneering work in creating products that help people tap into their innate resilience and respond to life’s adversity in healthy ways….
The judges noted that they were particularly impressed with the way that HopeLab met a key criterion for the award—showing how its programming makes a real difference in the lives of the people it serves.
For example, its Re-Mission video games for adolescents and young adults with cancer address the problem of poor treatment adherence by putting players inside the body to battle the disease with weapons like chemotherapy, antibiotics and the body’s natural defenses. Working with hospitals and clinics, HopeLab has distributed more than 210,000 copies of the game in 81 countries. And research published in the medical journal Pediatrics found that playing Re-Mission significantly improved key behavioral and psychological factors associated with successful cancer treatment. In fact, in the largest randomized controlled study of a video-game intervention ever conducted, participants who were given Re-Mission took their chemotherapy and antibiotics more consistently, showed faster acquisition of cancer-related knowledge and increased their self-efficacy.
Building on the success of this founding project, HopeLab has since launched the Re-Mission 2 online games and mobile app, the Zamzee program to boost physical activity and combat sedentary behavior in children, and a number of other mobile apps and social technologies that support resilience and improve health….”
Hey Uncle Sam, Eat Your Own Dogfood
It’s been five years since Tim O’Reilly published his screed on Government as Platform. In that time, we’ve seen “civic tech” and “open data” gain in popularity and acceptance. The Federal Government has an open data platform, data.gov. And so too do states and municipalities across America. Code for America is the hottest thing around, and the healthcare.gov fiasco landed fixing public technology as a top concern in government. We’ve successfully laid the groundwork for a new kind of government technology. We’re moving towards a day when, rather than building user-facing technology, the government opens up interfaces to data that allow the private sector to create applications and websites that consume public data and surface it to users.
However, we appear to have stalled out a bit in our progress towards government as platform. It’s incredibly difficult to ingest the data for successful commercial products. The kaleidoscope of data formats in open data portals like data.gov might politely be called ‘obscure’, and perhaps more accurately, ‘perversely unusable’. Some of the data hasn’t been updated since first publication, and is quite positively too stale to use. If documentation exists, most of the time it’s incomprehensible….
What we actually need is for Uncle Sam to start dogfooding his own open data.
For those of you who aren’t familiar with the term, dogfooding is engineers’ slang for using their own product. So, for example, Google employees use Gmail and Google Drive to organize their own work. The term also applies to engineering teams that consume their own public APIs to access internal data. Dogfooding helps teams deeply understand their own work from the same perspective as external users. It also provides a keen incentive to make products work well.
Dogfooding is the golden rule of platforms. And currently, open government portals are flagrantly violating this golden rule. I’ve asked around, and I can’t find a single example of a government entity consuming the data they publish…”
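To make the dogfooding idea concrete, here is a minimal sketch of what it might look like in practice: an agency script that reads the agency’s own published data through the same public API that outside developers use, instead of querying an internal database. The portal endpoint and dataset identifier below are hypothetical placeholders, not real data.gov resources.

```python
# A minimal sketch of "dogfooding" an open data portal: the publishing agency
# consumes its own dataset through the public API it offers to outside users.
# The endpoint and resource id are hypothetical placeholders for illustration.
import json
import urllib.request

PORTAL_API = "https://data.example.gov/api/3/action/datastore_search"  # hypothetical CKAN-style endpoint
RESOURCE_ID = "permit-applications-2014"                               # hypothetical dataset id

def fetch_published_records(limit=100):
    """Read the agency's own published data via the public API."""
    url = f"{PORTAL_API}?resource_id={RESOURCE_ID}&limit={limit}"
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    return payload["result"]["records"]

if __name__ == "__main__":
    records = fetch_published_records()
    # If this internal report breaks because the published data is stale or
    # badly formatted, the publishing team feels the pain before outside users do.
    print(f"{len(records)} records retrieved from the public portal")
```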
Killer Apps in the Gigabit Age
New Pew report by Lee Rainie, Janna Anderson and Jennifer Connolly: “The age of gigabit connectivity is dawning and will advance in coming years. The only question is how quickly it might become widespread. A gigabit connection can deliver 1,000 megabits of information per second (Mbps). Globally, cloud service provider Akamai reports that the average global connection speed in quarter one of 2014 was 3.9 Mbps, with South Korea reporting the highest average connection speed, 23.6 Mbps, and the US at 10.5 Mbps.1
In some respects, gigabit connectivity is not a new development. The US scientific community has been using hyper-fast networks for several years, changing the pace of data sharing and enabling levels of collaboration in scientific disciplines that were unimaginable a generation ago.
Gigabit speeds for the “average Internet user” are just arriving in select areas of the world. In the US, Google ran a competition in 2010 for communities to pitch themselves for the construction of the first Google Fiber network running at 1 gigabit per second—Internet speeds 50-100 times faster than the majority of Americans now enjoy. Kansas City was chosen among 1,100 entrants and residents are now signing up for the service. The firm has announced plans to build a gigabit network in Austin, Texas, and perhaps 34 other communities. In response, AT&T has said it expects to begin building gigabit networks in up to 100 US cities.2 The cities of Chattanooga, Tennessee; Lafayette, Louisiana; and Bristol, Virginia, have super speedy networks, and pockets of gigabit connectivity are in use in parts of Las Vegas, Omaha, Santa Monica, and several Vermont communities.3 There are also other regional efforts: Falcon Broadband in Colorado Springs, Colorado; Brooklyn Fiber in New York; Monkey Brains in San Francisco; MINET Fiber in Oregon; Wicked Fiber in Lawrence, Kansas; and Sonic.net in California, among others.4 NewWave expects to launch gigabit connections in 2015 in Poplar Bluff, Missouri; Monroe, Rayville, Delhi, and Tallulah, Louisiana; and Suddenlink Communications has launched Operation GigaSpeed.5
In 2014, Google and Verizon were among the innovators announcing that they are testing the capabilities for currently installed fiber networks to carry data even more efficiently—at 10 gigabits per second—to businesses that handle large amounts of Internet traffic.
To explore the possibilities of the next leap in connectivity we asked thousands of experts and Internet builders to share their thoughts about likely new Internet activities and applications that might emerge in the gigabit age. We call this a canvassing because it is not a representative, randomized survey. Its findings emerge from an “opt in” invitation to experts, many of whom play active roles in Internet evolution as technology builders, researchers, managers, policymakers, marketers, and analysts. We also invited comments from those who have made insightful predictions to our previous queries about the future of the Internet. (For more details, please see the section “About this Canvassing of Experts.”)…”
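A quick back-of-the-envelope calculation using the speeds quoted in the report shows what the jump to gigabit means in practice. The 5 GB file size is an assumption for illustration, not a figure from the report.

```python
# Rough download-time comparison at the connection speeds quoted above.
# The 5 GB file size is an assumed example; speeds are from the report.
FILE_SIZE_GB = 5
FILE_SIZE_MEGABITS = FILE_SIZE_GB * 8 * 1000  # 1 GB (decimal) ~= 8,000 megabits

for label, mbps in [("Global average (3.9 Mbps)", 3.9),
                    ("US average (10.5 Mbps)", 10.5),
                    ("Gigabit (1,000 Mbps)", 1000.0)]:
    seconds = FILE_SIZE_MEGABITS / mbps
    print(f"{label}: {seconds / 60:.1f} minutes")

# 1,000 / 10.5 is roughly 95, consistent with the report's
# "50-100 times faster" framing for gigabit service.
```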
A taxonomy of crowdsourcing based on task complexity
Paper by Robbie T. Nakatsu et al. in the Journal of Information Science: “Although a great many different crowdsourcing approaches are available to those seeking to accomplish individual or organizational tasks, little research attention has yet been given to characterizing how those approaches might be based on task characteristics. To that end, we conducted an extensive review of the crowdsourcing landscape, including a look at what types of taxonomies are currently available. Our review found that no taxonomy explored the multidimensional nature of task complexity. This paper develops a taxonomy whose specific intent is the classification of approaches in terms of the types of tasks for which they are best suited. To develop this task-based taxonomy, we followed an iterative approach that considered over 100 well-known examples of crowdsourcing. The taxonomy considers three dimensions of task complexity: (1) task structure – is the task well-defined, or does it require a more open-ended solution; (2) task interdependence – can the task be solved by an individual, or does it require a community of problem solvers; and (3) task commitment – what level of commitment is expected from crowd members? Based on this taxonomy, we identify seven categories of crowdsourcing and discuss prototypical examples of each approach. Furnished with such an understanding, one should be able to determine which crowdsourcing approach is most suitable for a particular task situation.”
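One way to picture the taxonomy’s three dimensions is as fields on a task record. The sketch below is purely illustrative: the dimension values paraphrase the abstract, while the example task and its classification are assumptions, not the paper’s seven categories.

```python
# Illustrative encoding of the abstract's three task-complexity dimensions.
# Dimension names follow the abstract; the example task is an assumption.
from dataclasses import dataclass
from enum import Enum

class TaskStructure(Enum):
    WELL_DEFINED = "well-defined"
    OPEN_ENDED = "requires an open-ended solution"

class TaskInterdependence(Enum):
    INDEPENDENT = "solvable by an individual"
    INTERDEPENDENT = "requires a community of problem solvers"

class TaskCommitment(Enum):
    LOW = "low commitment expected from crowd members"
    HIGH = "high commitment expected from crowd members"

@dataclass
class CrowdsourcingTask:
    name: str
    structure: TaskStructure
    interdependence: TaskInterdependence
    commitment: TaskCommitment

# Hypothetical example: image labeling as a well-defined, individual, low-commitment task.
image_labeling = CrowdsourcingTask(
    name="image labeling",
    structure=TaskStructure.WELL_DEFINED,
    interdependence=TaskInterdependence.INDEPENDENT,
    commitment=TaskCommitment.LOW,
)
print(image_labeling)
```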
Government CX: Where Do You Find the Right Foundational Metrics?
Stephanie Thum at Digital Gov: “Customer service. Customer satisfaction. Improving the customer experience.
These buzzwords have become well-trodden territory among government strategists as a new wave of agencies attempt to ignite—or reignite—a focus on customers.
Of course, putting customers first is a worthy goal. But what, exactly, do we mean when we use words like “service” and “satisfaction”? These terms are easily understood in the abstract; however, precisely because of their broad, abstract nature, they can also become roadblocks for pinpointing the specific metrics—and sparking the right strategic conversations—that lead to true customer-oriented improvements.
To find the right foundational customer metrics, begin by looking at your agency’s strategic plan. Examine the publicly-stated goals that guide the entire organization. At Export-Import Bank (Ex-Im Bank), for example, one of our strategic goals is to improve the ease of doing business for customers. Because of this, the Customer Effort Score has become a key external measurement for the Bank in determining customers’ perceptions about our performance toward that goal. Our surveys ask customers: “How much effort did you personally have to put forth to complete your transaction with Ex-Im Bank?” Results are then shared, along with other, supplementary, survey results, within the Bank….”
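As a rough illustration of how such a survey question becomes a foundational metric, the sketch below averages hypothetical Customer Effort Score responses. The 1-5 scale and the sample data are assumptions for illustration, not Ex-Im Bank’s actual methodology.

```python
# Minimal sketch of turning Customer Effort Score responses into a metric.
# Assumes a 1-5 scale (1 = very low effort, 5 = very high effort); the
# responses below are invented sample data.
from statistics import mean

responses = [1, 2, 2, 3, 1, 4, 2, 5, 1, 2]  # hypothetical survey answers

average_effort = mean(responses)
low_effort_share = sum(1 for r in responses if r <= 2) / len(responses)

print(f"Average effort score: {average_effort:.1f} (lower is better)")
print(f"Share reporting low effort: {low_effort_share:.0%}")
```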
Open Data Beyond the Big City
Mark Headd at PBS MediaShift: “…Open data is the future — of how we govern, of how public services are delivered, of how governments engage with those that they serve. And right now, it is unevenly distributed. I think there is a strong argument to be made that data standards can provide a number of benefits to small and midsized municipal governments and could provide a powerful incentive for these governments to adopt open data.
One way we can use standards to drive the adoption of open data is to partner with companies like Yelp, Zillow, Google and others that can use open data to enhance their services. But how do we get companies with tens and hundreds of millions of users to take an interest in data from smaller municipal governments?
In a word – standards.
Why do we care about cities?
When we talk about open data, it’s important to keep in mind that there is a lot of good work happening at the federal, state and local levels all over the country — plenty of states and even counties doing good things on the open data front, but for me it’s important to evaluate where we are on open data with respect to cities.
States typically occupy a different space in the service delivery ecosystem than cities, and the kinds of data that they typically make available can be vastly different from city data. State capitals are often far removed from our daily lives and we may hear about them only when a budget is adopted or when the state legislature takes up a controversial issue.
In cities, the people that represent and serve us can be our neighbors — the guy behind you at the car wash, or the woman whose child is in your son’s preschool class. Cities matter.
As cities go, we need to consider carefully the importance of smaller cities — there are a lot more of them than large cities and a non-trivial number of people live in them….”
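The case for standards can be sketched in a few lines of code: if many cities publish the same kind of data in one agreed schema, a single consumer script works for all of them. The schema and the city feeds below are invented for illustration and do not correspond to any real standard.

```python
# Hypothetical sketch of why standards matter to large consumers of city data:
# when every city publishes, say, restaurant inspections with the same fields,
# one parser serves all of them. Schema and feeds are invented for illustration.
SHARED_SCHEMA = ("business_name", "inspection_date", "score")

city_feeds = {
    "Springfield": [("Joe's Diner", "2014-10-01", 92)],
    "Riverton": [("Cafe Central", "2014-09-15", 88)],
}

def parse_feed(rows):
    """Same code path for every city, because every city uses the same fields."""
    return [dict(zip(SHARED_SCHEMA, row)) for row in rows]

for city, rows in city_feeds.items():
    for record in parse_feed(rows):
        print(city, record)
```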
Trust: A History
New book by Geoffrey Hosking: “Today there is much talk of a ‘crisis of trust’; a crisis which is almost certainly genuine, but usually misunderstood. Trust: A History offers a new perspective on the ways in which trust and distrust have functioned in past societies, providing an empirical and historical basis against which the present crisis can be examined, and suggesting ways in which the concept of trust can be used as a tool to understand our own and other societies.
Geoffrey Hosking argues that social trust is mediated through symbolic systems, such as religion and money, and the institutions associated with them, such as churches and banks. Historically these institutions have nourished trust, but the resulting trust networks have tended to create quite tough boundaries around themselves, across which distrust is projected against outsiders. Hosking also shows how nation-states have been particularly good at absorbing symbolic systems and generating trust among large numbers of people, while also erecting distinct boundaries around themselves, despite an increasingly global economy. He asserts that in the modern world it has become common to entrust major resources to institutions we know little about, and suggests that we need to learn from historical experience and temper this with more traditional forms of trust, or become an ever more distrustful society, with potentially very destabilising consequences.”