New book edited by Lapo Mola, Ferdinando Pennarola, and Stefano Za: “This book presents a collection of research papers focusing on issues emerging from the interaction of information technologies and organizational systems. In particular, the individual contributions examine digital platforms and artifacts currently adopted in both the business world and society at large (people, communities, firms, governments, etc.). The topics covered include: virtual organizations, virtual communities, smart societies, smart cities, ecological sustainability, e-healthcare, e-government, and interactive policy-making (IPM)…”
A Guide to Making Innovation Offices Work
In this report for the IBM Center for The Business of Government, Burstein and Black identify six structural models for government innovation offices:
- Laboratory
- Facilitator
- Advisor
- Technology build-out
- Liaison
- Sponsored offices
Burstein and Black then present examples of each of these structural models.
In addition to describing models for innovation offices, the authors identify issues that government leaders should consider in their decision to create a new innovation office, along with critical success factors for building and sustaining effective innovation offices. The authors emphasize that government leaders should not make the decision to set up an innovation office lightly, and should not create an innovation office for symbolic reasons. Rather, moving forward with setting up a center of gravity for innovation should follow a careful assessment of the mission of the new office, financial resources available, and support from key partners.
This report continues the IBM Center’s long interest in the subject of innovation. The creation of dedicated innovation offices gives government a new tool for stimulating innovation. Previous IBM Center reports have examined other tools in government’s innovation portfolio, for example:
- Gwanhoo Lee examined federal ideation programs now in place throughout government in which ideas from government employees are sought and processed (Federal Ideation Programs: Challenges and Best Practices).
- Kevin Desouza examined the use of the Challenge.gov platform in which federal government agencies sponsor challenges with financial rewards to find innovative solutions to government problems (Challenge.gov: Using Competitions and Awards to Spur Innovation).
- Sandford Borins examined the use of awards to stimulate innovation in government (The Persistence of Innovation in Government: A Guide for Public Servants).
We hope that government leaders interested in innovation at the federal, state, and local levels will find the models and success factors described in this report helpful as they consider future innovation initiatives or expand upon current innovation activities.”
Training Students to Extract Value from Big Data
The nation’s ability to make use of data depends heavily on the availability of a workforce that is properly trained and ready to tackle high-need areas. Training students to be capable of exploiting big data requires experience with statistical analysis, machine learning, and computational infrastructure that permits the real problems associated with massive data to be revealed and, ultimately, addressed. Analysis of big data requires cross-disciplinary skills, including the ability to make modeling decisions while balancing trade-offs between optimization and approximation, all while being attentive to useful metrics and system robustness. To develop those skills in students, it is important to identify whom to teach, that is, the educational background, experience, and characteristics of a prospective data-science student; what to teach, that is, the technical and practical content that should be taught to the student; and how to teach, that is, the structure and organization of a data-science program.
Training Students to Extract Value from Big Data summarizes a workshop convened in April 2014 by the National Research Council’s Committee on Applied and Theoretical Statistics to explore how best to train students to use big data. The workshop explored the need for such training and the curricula and coursework it should include. One impetus for the workshop was the current fragmented view of what is meant by analysis of big data, data analytics, or data science. New graduate programs are introduced regularly, and they have their own notions of what is meant by those terms and, most important, of what students need to know to be proficient in data-intensive work. This report provides a variety of perspectives about those elements and about their integration into courses and curricula…”
Innovation in Philanthropy is not a Hack-a-thon
Sam McAfee on Medium: “…Antiquated funding models and lack of a rapid data-driven evaluation process aren’t the only issues, though. Most of the big ideas in the technology-for-social-impact space are focused either on incremental improvements to existing service models, maybe leveraging online services or mobile applications to improve cost-efficiency marginally, or they solve only a very narrow niche problem for a small audience, often applying a technology that was already in development and just happened to find a solution in the field.
Innovation Requires Disruption
When you look at innovation in the commercial world, like the Ubers and AirBnBs of the world, what you see is a clear and substantive break from previous modes of thinking about transportation and accommodation. And it’s not the technology itself that is all that impressive. There is nothing ground-breaking technically under the hood of either of those products that wasn’t already lying around for a decade. What makes them different is that they created business models that stepped completely out of the existing taxi and hotel verticals, and simply used technology to leverage existing frustrations with those antiquated models and harness latent demands, to produce a new, vibrant commercial ecosystem.
Now, let’s imagine the same framework in the social sector, where there are equivalent long-standing traditional modes of providing resources. To find new ways of meeting human needs that disrupt those models requires both safe-to-fail experimentation and rapid feedback and iteration in the field, with clear success criteria. Such rapid development can only be accomplished by a sharp, nimble and multifaceted team of thinkers and doers who are passionate about the problem, yes, but also empowered and enabled to break a few institutional eggs on the way to the creative omelet.
Agile and Lean are Proven Methods
It turns out that there are proven working models for cultivating and fostering this kind of innovative thinking and experimentation. As I mentioned above, agile and lean are probably the single greatest contribution to the world by the tech sector, far more impactful than any particular technology produced by it. Small, cross-functional teams working on tight, iterative timeframes, using an iterative data-informed methodology, can create new and disruptive solutions to big, difficult problems. They are able to do this precisely because they are unhindered by the hulking bureaucratic structures of the old guard. This is precisely why so many Fortune 500 companies are experimenting with innovation and R&D laboratories: they know their existing staff, structures, and processes cannot produce innovation within those constraints. Only small, nimble teams can do it, and they can only do it if they are kept separate from, even protected from, the traditional production systems of the previous product cycle.
Yet big philanthropy has barely experimented with this model, trying it in only a few isolated instances. Here at Neo, for example, we are working on a project for teachers funded by a forward-thinking foundation. What our client is trying to disrupt is no less than the entire US education system, and with goals and measurements developed by teachers for teachers, not by Silicon Valley hotshots who have no clue how to fix education.
To start with, the project was funded in iterations of six weeks at a time, each with a distinct and measurable goal. We built a small cross-functional team to tackle some of the tougher issues faced by teachers trying to raise the level of excellence in their classrooms. The team was empowered to talk directly to teachers, and incorporate their feedback into new versions of the project, released on almost a daily basis. We have iterated the design more than sixteen times in less than four months, and it’s starting to really take shape.
We have no idea whether this particular project will be successful in the long run. But what we do know is that the client and their funder have had the courage to step out of the traditional project funding models and apply agile and lean thinking to a very tough problem. And we’re proud to be invited along for the ride.
The vast majority of the social sector is still trying to tackle social problems with program and funding models that were pioneered early in the last century. Agile and lean methods hold the key to finally breaking the mold of the old, traditional model of resourcing social change initiatives. The philanthropic community should be interested in the agile and lean methods produced by the technology sector, not the money produced by it, and start reorganizing project teams, resource allocation strategies, and timelines in line with this proven innovation model.
Only then will we be in a position to really innovate for social change.”
Government CX: Where Do You Find the Right Foundational Metrics?
Stephanie Thum at Digital Gov: “Customer service. Customer satisfaction. Improving the customer experience.
These buzzwords have become well-trodden territory among government strategists as a new wave of agencies attempt to ignite—or reignite—a focus on customers.
Of course, putting customers first is a worthy goal. But what, exactly, do we mean when we use words like “service” and “satisfaction”? These terms are easily understood in the abstract; however, precisely because of their broad, abstract nature, they can also become roadblocks for pinpointing the specific metrics—and sparking the right strategic conversations—that lead to true customer-oriented improvements.
To find the right foundational customer metrics, begin by looking at your agency’s strategic plan. Examine the publicly-stated goals that guide the entire organization. At Export-Import Bank (Ex-Im Bank), for example, one of our strategic goals is to improve the ease of doing business for customers. Because of this, the Customer Effort Score has become a key external measurement for the Bank in determining customers’ perceptions about our performance toward that goal. Our surveys ask customers: “How much effort did you personally have to put forth to complete your transaction with Ex-Im Bank?” Results are then shared, along with other, supplementary, survey results, within the Bank….”
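As a rough illustration of how a survey question like this can be turned into a trackable number (not Ex-Im Bank’s actual methodology), the sketch below assumes a hypothetical 1-to-5 effort scale and reports both the average score and the share of low-effort responses:

```python
from statistics import mean

def customer_effort_summary(responses, low_effort_max=2):
    """Summarize hypothetical Customer Effort Score survey responses.

    responses: integers on an assumed 1 (very low effort) to 5
    (very high effort) scale; the real survey scale may differ.
    """
    scores = list(responses)
    if not scores:
        raise ValueError("no survey responses to summarize")
    return {
        "average_effort": round(mean(scores), 2),
        "low_effort_share": round(
            sum(1 for s in scores if s <= low_effort_max) / len(scores), 2
        ),
        "n": len(scores),
    }

# Example with a small batch of made-up responses.
print(customer_effort_summary([1, 2, 2, 3, 5, 1, 2]))
```

Tracking a number like this over time is what ties the abstract goal of “ease of doing business” to concrete operational changes.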
The Web Observatory: A Middle Layer for Broad Data
New paper by Thanassis Tiropanis, Wendy Hall, James Hendler, and Christian de Larrinaga in Big Data: “The Web Observatory project is a global effort that is being led by the Web Science Trust, its network of WSTnet laboratories, and the wider Web Science community. The goal of this project is to create a global distributed infrastructure that will foster communities exchanging and using each other’s web-related datasets as well as sharing analytic applications for research and business web applications. It will provide the means to observe the digital planet, explore its processes, and understand their impact on different sectors of human activity.
The project is creating a network of separate web observatories, collections of datasets and tools for analyzing data about the Web and its use, each with its own user community. This allows researchers across the world to develop and share data, analytic approaches, publications related to their datasets, and tools (Fig. 1). The network of web observatories aims to bridge the gap that currently exists between big data analytics and the rapidly growing web of “broad data,” a gap that makes it difficult for a large number of people to engage with these data….”
New Data for a New Energy Future
(This post originally appeared on the blog of the U.S. Chamber of Commerce Foundation.)
Two growing concerns—climate change and U.S. energy self-sufficiency—have accelerated the search for affordable, sustainable approaches to energy production and use. In this area, as in many others, data-driven innovation is a key to progress. Data scientists are working to help improve energy efficiency and make new forms of energy more economically viable, and are building new, profitable businesses in the process.
In the same way that government data has been used by other kinds of new businesses, the Department of Energy is releasing data that can help energy innovators. At a recent “Energy Datapalooza” held by the department, John Podesta, counselor to the President, summed up the rationale: “Just as climate data will be central to helping communities prepare for climate change, energy data can help us reduce the harmful emissions that are driving climate change.” With electric power accounting for one-third of greenhouse gas emissions in the United States, the opportunities for improvement are great.
The GovLab has been studying the business applications of public government data, or “open data,” for the past year. The resulting study, the Open Data 500, now provides structured, searchable information on more than 500 companies that use open government data as a key business driver. A review of those results shows four major areas where open data is creating new business opportunities in energy and is likely to build many more in the near future.
Commercial building efficiency
Commercial buildings are major energy consumers, and energy costs are a significant business expense. Despite programs like LEED Certification, many commercial buildings waste large amounts of energy. Now a company called FirstFuel, based in Boston, is using open data to drive energy efficiency in these buildings. At the Energy Datapalooza, Swap Shah, the company’s CEO, described how analyzing energy data together with geospatial, weather, and other open data can give a very accurate view of a building’s energy consumption and ways to reduce it. (Sometimes the solution is startlingly simple: According to Shah, the largest source of waste is running heating and cooling systems at the same time.) Other companies are taking on the same kind of task – like Lucid, which provides an operating system that can track a building’s energy use in an integrated way.
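A minimal sketch of the kind of analysis described here, combining a building’s metered consumption with open weather data (the numbers and column choices are hypothetical, not FirstFuel’s actual models): regress daily energy use against heating and cooling degree-days to separate weather-driven load from the always-on baseline.

```python
import numpy as np

# Hypothetical daily data: outdoor temperature (F) and metered energy use (kWh).
temps_f = np.array([30, 40, 55, 65, 72, 85, 95], dtype=float)
kwh = np.array([820, 700, 540, 480, 500, 640, 760], dtype=float)

base_f = 65.0  # common balance-point assumption
hdd = np.maximum(base_f - temps_f, 0)   # heating degree-days
cdd = np.maximum(temps_f - base_f, 0)   # cooling degree-days

# Least-squares fit: kwh ~ baseline + b_heat*HDD + b_cool*CDD
X = np.column_stack([np.ones_like(hdd), hdd, cdd])
baseline, b_heat, b_cool = np.linalg.lstsq(X, kwh, rcond=None)[0]

print(f"weather-independent baseline: {baseline:.0f} kWh/day")
print(f"heating sensitivity: {b_heat:.1f} kWh per degree-day")
print(f"cooling sensitivity: {b_cool:.1f} kWh per degree-day")
```

A baseline that is large relative to the weather-sensitive terms is one signal of the kind of always-on waste Shah describes, such as heating and cooling systems running at the same time.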
Home energy use
A number of companies are finding data-driven solutions for homeowners who want to save money by reducing their energy usage. A key to success is putting together measurements of energy use in the home with public data on energy efficiency solutions. PlotWatt, for example, promises to help consumers “save money with real-time energy tracking” through the data it provides. One of the best-known companies in this area, Opower, uses a psychological strategy: it simultaneously gives people access to their own energy data and lets them compare their energy use to their neighbors’ as an incentive to save. Opower partners with utilities to provide this information, and the Virginia-based company has been successful enough to open offices in San Francisco, London, and Singapore. Soon more and more people will have access to data on their home energy use: Green Button, a government-promoted program implemented by utilities, now gives about 100 million Americans data about their energy consumption.
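A toy sketch of the neighbor-comparison idea (not Opower’s actual algorithm, and with made-up numbers): rank a household’s monthly consumption against comparable homes and report the result as a percentile.

```python
def usage_percentile(my_kwh, neighbor_kwh):
    """Percentage of comparable homes that used more energy than this one."""
    if not neighbor_kwh:
        raise ValueError("need at least one comparable home")
    better_than = sum(1 for kwh in neighbor_kwh if kwh > my_kwh)
    return 100.0 * better_than / len(neighbor_kwh)

# Illustrative monthly totals (kWh) for comparable nearby homes.
neighbors = [950, 1020, 880, 1100, 990, 870, 1200]
pct = usage_percentile(my_kwh=900, neighbor_kwh=neighbors)
print(f"You used less energy than {pct:.0f}% of similar homes this month.")
```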
Solar power and renewable energy
As solar power becomes more efficient and affordable, a number of companies are emerging to support this energy technology. Clean Power Finance, for example, uses its database to connect solar entrepreneurs with sources of capital. In a different way, a company called Solar Census is analyzing publicly available data to find exactly where solar power can be produced most efficiently. The kind of analysis that used to require an on-site survey over several days can now be done in less than a minute with their algorithms.
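As a simplified illustration of this kind of site screening (hypothetical sites and figures, not Solar Census’s algorithm), candidate locations can be scored from public irradiance and land-use data and ranked by expected output:

```python
# Hypothetical candidate sites with publicly sourced attributes:
# average solar irradiance (kWh/m^2/day) and usable roof or land area (m^2).
sites = [
    {"site": "warehouse-A", "irradiance": 5.8, "area_m2": 4000},
    {"site": "school-B",    "irradiance": 4.9, "area_m2": 2500},
    {"site": "depot-C",     "irradiance": 6.2, "area_m2": 1800},
]

PANEL_EFFICIENCY = 0.18  # assumed fraction of sunlight converted to electricity

def daily_output_kwh(site):
    """Rough expected daily generation if the usable area were covered in panels."""
    return site["irradiance"] * site["area_m2"] * PANEL_EFFICIENCY

for site in sorted(sites, key=daily_output_kwh, reverse=True):
    print(f"{site['site']}: ~{daily_output_kwh(site):,.0f} kWh/day")
```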
Other kinds of geospatial and weather data can support other forms of renewable energy. The data will make it easier to find good sites for wind power stations, water sources for small-scale hydroelectric projects, and the best opportunities to tap geothermal energy.
Supporting new energy-efficient vehicles
The Tesla and other electric vehicles are becoming commercially viable, and we will soon see even more efficient vehicles on the road. Toyota has announced that its first fuel-cell cars, which run on hydrogen, will be commercially available by mid-2015, and other auto manufacturers have announced plans to develop fuel-cell vehicles as well. But these vehicles can’t operate without a network to supply power, be it electricity for a Tesla battery or hydrogen for a fuel cell.
It’s a chicken-and-egg problem: People won’t buy large numbers of electric or fuel-cell cars unless they know they can power them, and power stations will be scarce until there are enough vehicles to support their business. Now some new companies are facilitating this transition by giving drivers data-driven tools to find and use the power sources they need. Recargo, for example, provides tools to help electric car owners find charging stations and operate their vehicles.
The development of new energy sources will involve solving social, political, economic, and technological issues. Data science can help develop solutions and bring us more quickly to a new kind of energy future.
Joel Gurin is senior advisor at the GovLab and project director of the Open Data 500. He also currently serves as a fellow of the U.S. Chamber of Commerce Foundation.
Driving Innovation With Open Data
Research Article by The GovLab’s Joel Gurin (Chapter 6 in the report, “The Future of Data-Driven Innovation.”): The chapters in this report provide ample evidence of the power of data and its business potential. But like any business resource, data is only valuable if the benefit of using it outweighs its cost. Data collection, management, distribution, quality control, and application all come at a price—a potential obstacle for companies of any size, though especially for small and medium-sized enterprises.
Over the last several years, however, the “I” of data’s return on investment (ROI) has become less of a hurdle, and new data-driven companies are developing rapidly as a result. One major reason is that governments at the federal, state, and local level are making more data available at little or no charge for the private sector and the public to use. Governments collect data of all kinds—including scientific, demographic, and financial data—at taxpayer expense.
Now, public sector agencies and departments are increasingly repaying that public investment by making their data available to all for free or at a low cost. This is Open Data. While there are still costs in putting the data to use, the growing availability of this national resource is becoming a significant driver for hundreds of new businesses. This chapter describes the growing potential of Open Data and the data-driven innovation it supports, the types of data and applications that are most promising, and the policies that will encourage innovation going forward. Read and download this article in PDF format.”
Google’s Waze announces government data exchange program with 10 initial partners
For the program, Waze will provide real-time anonymized crowdsourced traffic data to government departments in exchange for information on public projects like construction, road sensors, and pre-planned road closures.
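As an illustration only (not Waze’s actual feed specification, and with hypothetical field names), a city’s side of such an exchange might amount to publishing pre-planned closures as simple structured records:

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class RoadClosure:
    """Hypothetical record a city might publish for a pre-planned closure."""
    closure_id: str
    street: str
    reason: str           # e.g. "construction", "parade"
    start: datetime
    end: datetime
    lat: float
    lon: float

closure = RoadClosure(
    closure_id="2014-1042",
    street="Main St between 3rd Ave and 5th Ave",
    reason="construction",
    start=datetime(2014, 10, 20, 7, 0),
    end=datetime(2014, 10, 24, 18, 0),
    lat=42.3601,
    lon=-71.0589,
)

# Serialize for exchange; datetimes are rendered as strings.
print(json.dumps(asdict(closure), default=str, indent=2))
```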
The first 10 partners include:
- Rio de Janeiro, Brazil
- Barcelona, Spain and the Government of Catalonia
- Jakarta, Indonesia
- Tel Aviv, Israel
- San Jose, Costa Rica
- Boston, USA
- State of Florida, USA
- State of Utah, USA
- Los Angeles County
- The New York Police Department (NYPD)
Waze has also signed on five other government partners and has received applications from more than 80 municipal groups. The company ran an initial pilot program in Rio de Janeiro where it partnered with the city’s traffic control center to supplement the department’s sensor data with reports from Waze users.
At an event celebrating the launch, Di-Ann Eisnor, head of Growth at Waze, noted that the data exchange will only include public alerts, such as accidents and closures.
“We don’t share anything beyond that, such as where individuals are located and who they are,” she said.
Eisnor also made it clear that Waze isn’t selling the data. GPS maker TomTom came under fire several years ago after customers learned that the company had sold their data to police departments to help find the best places to put speed traps.
“We keep [the data] clean by making sure we don’t have a business model around it,” Eisnor added.
Waze requires that new Connected Citizens partners “prove their dedication to citizen engagement and commit to use Waze data to improve city efficiency.”…”
Open Data as Universal Service. New perspectives in the Information Profession
Paper by L. Fernando Ramos Simón et al. in Procedia – Social and Behavioral Sciences: “The Internet provides a global information flow, which improves living conditions in poor countries as well as in rich countries. Owing to its abundance and quality, public information (meteorological, geographic, and transport information, as well as the content managed in libraries, archives, and museums) is an incentive for change, becoming invaluable and accessible to all citizens. However, it is clear that Open Data plays a significant role and provides a business service in the digital economy. Nevertheless, it remains unknown how this wealth of public data might be provided as a universal service, available to all citizens in matters of education, health, and culture: a function that has traditionally been assumed by libraries. In addition, information professionals will have to acquire new skills that enable them to assume a new role in information management: data management (Open Data) and content management (Open Content). Thus, this study analyzes the new roles that information professionals will assume in areas such as metadata, interoperability, access licenses, information search and retrieval tools, and applications for data queries…”