Unleashing the Power of Data to Serve the American People


Memorandum: Unleashing the Power of Data to Serve the American People
To: The American People
From: Dr. DJ Patil, Deputy U.S. CTO for Data Policy and Chief Data Scientist

…While there is a rich history of companies using data to their competitive advantage, the disproportionate beneficiaries of big data and data science have been Internet technologies like social media, search, and e-commerce. Yet transformative uses of data in other spheres are just around the corner. Precision medicine and other forms of smarter health care delivery, individualized education, and the “Internet of Things” (which refers to devices like cars or thermostats communicating with each other using embedded sensors linked through wired and wireless networks) are just a few of the ways in which innovative data science applications will transform our future.

The Obama administration has embraced the use of data to improve the operation of the U.S. government and the interactions that people have with it. On May 9, 2013, President Obama signed Executive Order 13642, which made open and machine-readable data the new default for government information. Over the past few years, the Administration has launched a number of Open Data Initiatives aimed at scaling up open data efforts across the government, helping make troves of valuable data — data that taxpayers have already paid for — easily accessible to anyone. In fact, I used data made available by the National Oceanic and Atmospheric Administration to improve numerical methods of weather forecasting as part of my doctoral work. So I know firsthand just how valuable this data can be — it helped get me through school!

Given the substantial benefits that responsibly and creatively deployed data can provide to us and our nation, it is essential that we work together to push the frontiers of data science. Given the importance this Administration has placed on data, along with the momentum that has been created, now is a unique time to establish a legacy of data supporting the public good. That is why, after a long time in the private sector, I am returning to the federal government as the Deputy Chief Technology Officer for Data Policy and Chief Data Scientist.

Organizations are increasingly realizing that in order to maximize their benefit from data, they require dedicated leadership with the relevant skills. Many corporations, local governments, federal agencies, and others have already created such a role, which is usually called the Chief Data Officer (CDO) or the Chief Data Scientist (CDS). The role of an organization’s CDO or CDS is to help their organization acquire, process, and leverage data in a timely fashion to create efficiencies, iterate on and develop new products, and navigate the competitive landscape.

The Role of the First-Ever U.S. Chief Data Scientist

Similarly, my role as the U.S. CDS will be to responsibly source, process, and leverage data in a timely fashion to enable transparency, provide security, and foster innovation for the benefit of the American public, in order to maximize the nation’s return on its investment in data.

So what specifically am I here to do? As I start, I plan to focus on these four activities:

…(More)”

Access to Scientific Data in the 21st Century: Rationale and Illustrative Usage Rights Review


Paper by James Campbell in Data Science Journal: “Making scientific data openly accessible and available for re-use is desirable to encourage validation of research results and/or economic development. Understanding what users may, or may not, do with data in online data repositories is key to maximizing the benefits of scientific data re-use. Many online repositories that allow access to scientific data indicate that data is “open,” yet specific usage conditions reviewed on 40 “open” sites suggest that there is no agreed upon understanding of what “open” means with respect to data. This inconsistency can be an impediment to data re-use by researchers and the public. (More)”

Big Data Now


at Radar – O’Reilly: “In the four years we’ve been producing Big Data Now, our wrap-up of important developments in the big data field, we’ve seen tools and applications mature, multiply, and coalesce into new categories. This year’s free wrap-up of Radar coverage is organized around eight themes:

  • Cognitive augmentation: As data processing and data analytics become more accessible, jobs that can be automated will go away. But to be clear, there are still many tasks where the combination of humans and machines produces superior results.
  • Intelligence matters: Artificial intelligence is now playing a bigger and bigger role in everyone’s lives, from sorting our email to rerouting our morning commutes, from detecting fraud in financial markets to predicting dangerous chemical spills. The computing power and algorithmic building blocks to put AI to work have never been more accessible.
  • The convergence of cheap sensors, fast networks, and distributed computation: The amount of quantified data available is increasing exponentially — and aside from tools for centrally handling huge volumes of time-series data as it arrives, devices and software are getting smarter about placing their own data accurately in context, extrapolating without needing to ‘check in’ constantly.
  • Reproducing, managing, and maintaining data pipelines: The coordination of processes and personnel within organizations to gather, store, analyze, and make use of data.
  • The evolving, maturing marketplace of big data components: Open-source components like Spark, Kafka, Cassandra, and ElasticSearch are reducing the need for companies to build in-house proprietary systems. On the other hand, vendors are developing industry-specific suites and applications optimized for the unique needs and data sources in a field.
  • The value of applying techniques from design and social science: While data science knows human behavior in the aggregate, design works in the particular, where A/B testing won’t apply — you only get one shot to communicate your proposal to a CEO, for example. Similarly, social science enables extrapolation from sparse data. Both sets of tools enable you to ask the right questions, and scope your problems and solutions realistically.
  • The importance of building a data culture: An organization that is comfortable with gathering data, curious about its significance, and willing to act on its results will perform demonstrably better than one that doesn’t. These priorities must be shared throughout the business.
  • The perils of big data: From poor analysis (driven by false correlation or lack of domain expertise) to intrusiveness (privacy invasion, price profiling, self-fulfilling predictions), big data has negative potential.

Download our free snapshot of big data in 2014, and follow the story this year on Radar.”

Mastering ’Metrics: The Path from Cause to Effect


Book by Joshua D. Angrist & Jörn-Steffen Pischke : “Applied econometrics, known to aficionados as ‘metrics, is the original data science. ‘Metrics encompasses the statistical methods economists use to untangle cause and effect in human affairs. Through accessible discussion and with a dose of kung fu–themed humor, Mastering ‘Metrics presents the essential tools of econometric research and demonstrates why econometrics is exciting and useful.
The five most valuable econometric methods, or what the authors call the Furious Five–random assignment, regression, instrumental variables, regression discontinuity designs, and differences in differences–are illustrated through well-crafted real-world examples (vetted for awesomeness by Kung Fu Panda’s Jade Palace). Does health insurance make you healthier? Randomized experiments provide answers. Are expensive private colleges and selective public high schools better than more pedestrian institutions? Regression analysis and a regression discontinuity design reveal the surprising truth. When private banks teeter, and depositors take their money and run, should central banks step in to save them? Differences-in-differences analysis of a Depression-era banking crisis offers a response. Could arresting O. J. Simpson have saved his ex-wife’s life? Instrumental variables methods instruct law enforcement authorities in how best to respond to domestic abuse….(More).”

Selected Readings on Cities and Civic Technology


By Julia Root and Stefaan Verhulst

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of civic innovation was originally published in 2014.

The last five years have seen a wave of new organizations, entrepreneurs and investment in cities and the field of civic innovation. Two subfields, Civic Tech and Government Innovation, are particularly aligned with GovLab’s interest in the ways in which technology is and can be deployed to redesign public institutions and re-imagine governance.

The emerging field of civic technology, or “Civic Tech,” champions new digital platforms, open data and collaboration tools for transforming government service delivery and engagement with citizens. Government Innovation, while not a new field, has seen in the last five years a proliferation of new structures (e.g. Mayor’s Office of New Urban Mechanics), roles (e.g. Chief Technology/Innovation Officer) and public/private investment (e.g. Innovation Delivery Teams and Code for America Fellows) that are building a world-wide movement for transforming how government thinks about and designs services for its citizens.

There is no set definition for “civic innovation.” However, broadly speaking, it is about improving our cities through the implementation of tools, ideas and engagement methods that strengthen the relationship between government and citizens. The civic innovation field encompasses diverse actors from across the public, private and nonprofit spectrums. These can include government leaders, nonprofit and foundation professionals, urbanists, technologists, researchers, business leaders and community organizers, each of whom may use the term in a different way, but ultimately are seeking to disrupt how cities and public institutions solve problems and invest in solutions.

Selected Reading List (in alphabetical order)

Annotated Selected Readings (in alphabetical order)

Books

Goldsmith, Stephen, and Susan Crawford. The Responsive City: Engaging Communities Through Data-Smart Governance. 1 edition. San Francisco, CA: Jossey-Bass, 2014. http://bit.ly/1zvKOL0.

  • The Responsive City, a guide to civic engagement and governance in the digital age, is the culmination of research originating from the Data-Smart City Solutions initiative, an ongoing project at Harvard Kennedy School working to catalyze adoption of data projects on the city level.
  • The “data smart city” is one that is responsive to citizens, engages them in problem solving and finds new innovative solutions for dismantling entrenched bureaucracy.
  • The authors document case studies from New York City, Boston and Chicago to explore the following topics:
    • Building trust in the public sector and fostering a sustained, collective voice among communities;
    • Using data-smart governance to preempt and predict problems while improving quality of life;
    • Creating efficiencies and saving taxpayer money with digital tools; and
    • Spearheading these new approaches to government with innovative leadership.

Townsend, Anthony M. Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia. 1 edition. New York: W. W. Norton & Company, 2013. http://bit.ly/17Y4G0R.

  • In this book, Townsend illustrates how “cities worldwide are deploying technology to address both the timeless challenges of government and the mounting problems posed by human settlements of previously unimaginable size and complexity.”
  • He also considers “the motivations, aspirations, and shortcomings” of the many stakeholders involved in the development of smart cities, and poses a new civics to guide these efforts.
  • He argues that smart cities are not made smart by various, soon-to-be-obsolete technologies built into their infrastructure; instead, it is how citizens are using ever-changing and grassroots technologies to be “human-centered, inclusive and resilient” that will make cities ‘smart.’

Reports + Journal Articles

Black, Alissa, and Rachel Burstein. “The 2050 City – What Civic Innovation Looks Like Today and Tomorrow.” White Paper. New America Foundation – California Civic Innovation Project, June 2013. https://bit.ly/2GohMvw.

  • Through their interviews, the authors determine that civic innovation is not just a “compilation of projects” but that it can inspire institutional structural change.
  • Civic innovation projects that have a “technology focus can sound very different than process-related innovations”; however, the outcomes are actually quite similar, as both disrupt how citizens and government engage with one another.
  • Technology is viewed by some of the experts as an enabler of civic innovation – not necessarily the driver of innovation itself. What constitutes innovation is how new tools are implemented by government or by civic groups in ways that change the governing dynamic.

Patel, Mayur, Jon Sotsky, Sean Gourley, and Daniel Houghton. “Knight Foundation Report on Civic Technology.” Presentation. Knight Foundation, December 2013. http://slidesha.re/11UYgO0.

  • This report aims to advance the field of civic technology, which, compared to the tech industry as a whole, is relatively young. It maps the field, creating a starting place for understanding activity and investment in the sector.
  • It defines two themes, Open Government and Civic Action, and identifies 11 clusters of civic tech innovation that fall into the two themes. For each cluster, the authors describe the types of activities and highlight specific organizations.
  • The report identified more than $430 million of private and philanthropic investment directed to 102 civic tech organizations from January 2011 to May 2013.

Open Plans. “Field Scan on Civic Technology.” Living Cities, November 2012. http://bit.ly/1HGjGih.

  • Commissioned by Living Cities and authored by Open Plans, the Field Scan investigates the emergent field of civic technology and generates the first analysis of the field’s potential impact, as well as a critique of how tools and new methods need to be more inclusive of low-income communities in their use and implementation.
  • Respondents generally agreed that the tools developed and in use in cities so far are demonstrations of the potential power of civic tech, but that these tools don’t yet go far enough.
  • Civic tech tools have the potential to improve the lives of low-income people in a number of ways. However, these tools often fail to reach the population they are intended to benefit. To better understand this challenge, civic tech for low-income people must be considered in the broader context of their interactions with technology and with government.
  • Although hackathons are popular, their approach to problem solving is not always driven by community needs, and hackathons often do not produce useful material for governments or citizens in need.

Goldberg, Jeremy M. “Riding the Second Wave of Civic Innovation.” Governing, August 28, 2014. http://bit.ly/1vOKnhJ.

  • In this piece, Goldberg argues that innovation and entrepreneurship in local government increasingly require mobilizing talent from many sectors and skill sets.

Black, Alissa, and Rachel Burstein. “A Guide for Making Innovation Offices Work.” IBM Center for the Business of Government, October 2014. http://bit.ly/1vOFZP4.

  • In this report, Burstein and Black examine the recent trend toward the creation of innovation offices across the nation at all levels of government to understand the structural models now being used to stimulate innovation—both internally within an agency, and externally for the agency’s partners and communities.
  • The authors conducted interviews with the leadership of innovation offices in cities including Philadelphia, Austin, Kansas City, Chicago, Davis, Memphis and Los Angeles.
  • The report cites examples of offices, generates a typology for the field, links to projects and highlights success factors.

Mulholland, Jessica, and Noelle Knell. “Chief Innovation Officers in State and Local Government (Interactive Map).” Government Technology, March 28, 2014. http://bit.ly/1ycArvX.

  • This article provides an overview of how different cities structure their Chief Innovation Officer positions and provides links to offices, projects and additional editorial content.
  • Some innovation officers find their duties merged with traditional CIO responsibilities, as is the case in Chicago, Philadelphia and New York City. Others, like those in Louisville and Nashville, have titles that reveal a link to their jurisdiction’s economic development endeavors.

Toolkits

Bloomberg Philanthropies. January 2014. “Transform Your City through Innovation: The Innovation Delivery Model for Making It Happen.” New York: Bloomberg Philanthropies. http://bloombg.org/120VrKB.

  • In 2011, Bloomberg Philanthropies funded a three-year innovation capacity program in five major United States cities (Atlanta, Chicago, Louisville, Memphis, and New Orleans) in which cities could hire top-level staff to develop and implement solutions to top mayoral priorities such as customer service, murder, homelessness, and economic development, using a sequence of steps.
  • The Innovation Delivery Team Playbook describes the Innovation Delivery Model and describes each aspect of the model from how to hire and structure the team, to how to manage roundtables and run competitions.

Training Students to Extract Value from Big Data


New report by the National Research Council: “As the availability of high-throughput data-collection technologies, such as information-sensing mobile devices, remote sensing, internet log records, and wireless sensor networks, has grown, science, engineering, and business have rapidly transitioned from striving to develop information from scant data to a situation in which the challenge is now that the amount of information exceeds a human’s ability to examine, let alone absorb, it. Data sets are increasingly complex, and this potentially increases the problems associated with missing information and other quality concerns, data heterogeneity, and differing data formats.
The nation’s ability to make use of data depends heavily on the availability of a workforce that is properly trained and ready to tackle high-need areas. Training students to be capable of exploiting big data requires experience with statistical analysis, machine learning, and computational infrastructure that permits the real problems associated with massive data to be revealed and, ultimately, addressed. Analysis of big data requires cross-disciplinary skills, including the ability to make modeling decisions while balancing trade-offs between optimization and approximation, all while being attentive to useful metrics and system robustness. To develop those skills in students, it is important to identify whom to teach, that is, the educational background, experience, and characteristics of a prospective data-science student; what to teach, that is, the technical and practical content that should be taught to the student; and how to teach, that is, the structure and organization of a data-science program.
Training Students to Extract Value from Big Data summarizes a workshop convened in April 2014 by the National Research Council’s Committee on Applied and Theoretical Statistics to explore how best to train students to use big data. The workshop explored the need for training and curricula and coursework that should be included. One impetus for the workshop was the current fragmented view of what is meant by analysis of big data, data analytics, or data science. New graduate programs are introduced regularly, and they have their own notions of what is meant by those terms and, most important, of what students need to know to be proficient in data-intensive work. This report provides a variety of perspectives about those elements and about their integration into courses and curricula…”

New Data for a New Energy Future


(This post originally appeared on the blog of the U.S. Chamber of Commerce Foundation.)

Two growing concerns—climate change and U.S. energy self-sufficiency—have accelerated the search for affordable, sustainable approaches to energy production and use. In this area, as in many others, data-driven innovation is a key to progress. Data scientists are working to help improve energy efficiency and make new forms of energy more economically viable, and are building new, profitable businesses in the process.
In the same way that government data has been used by other kinds of new businesses, the Department of Energy is releasing data that can help energy innovators. At a recent “Energy Datapalooza” held by the department, John Podesta, counselor to the President, summed up the rationale: “Just as climate data will be central to helping communities prepare for climate change, energy data can help us reduce the harmful emissions that are driving climate change.” With electric power accounting for one-third of greenhouse gas emissions in the United States, the opportunities for improvement are great.
The GovLab has been studying the business applications of public government data, or “open data,” for the past year. The resulting study, the Open Data 500, now provides structured, searchable information on more than 500 companies that use open government data as a key business driver. A review of those results shows four major areas where open data is creating new business opportunities in energy and is likely to build many more in the near future.

Commercial building efficiency
Commercial buildings are major energy consumers, and energy costs are a significant business expense. Despite programs like LEED Certification, many commercial buildings waste large amounts of energy. Now a company called FirstFuel, based in Boston, is using open data to drive energy efficiency in these buildings. At the Energy Datapalooza, Swap Shah, the company’s CEO, described how analyzing energy data together with geospatial, weather, and other open data can give a very accurate view of a building’s energy consumption and ways to reduce it. (Sometimes the solution is startlingly simple: According to Shah, the largest source of waste is running heating and cooling systems at the same time.) Other companies are taking on the same kind of task – like Lucid, which provides an operating system that can track a building’s energy use in an integrated way.
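The waste pattern Shah singles out lends itself to a very simple check. The following is a toy sketch of that idea only, not FirstFuel’s actual method; the field names, threshold, and sample readings are all illustrative.

```python
# Toy sketch (not FirstFuel's method): flag intervals where a building's
# heating and cooling systems draw power at the same time -- the waste
# pattern Shah describes. Field names and thresholds are illustrative.

def simultaneous_waste(readings, threshold_kw=1.0):
    """Return the hours where heating and cooling both exceed a minimum
    load, suggesting the two systems are fighting each other."""
    flagged = []
    for r in readings:
        if r["heating_kw"] > threshold_kw and r["cooling_kw"] > threshold_kw:
            flagged.append(r["hour"])
    return flagged

# Made-up interval meter data for one building.
meter_data = [
    {"hour": "08:00", "heating_kw": 12.0, "cooling_kw": 0.2},
    {"hour": "12:00", "heating_kw": 6.5, "cooling_kw": 8.1},  # both running
    {"hour": "16:00", "heating_kw": 0.3, "cooling_kw": 9.4},
]

print(simultaneous_waste(meter_data))  # ['12:00']
```

A real analysis would, as the post notes, fold in geospatial, weather, and other open data to put each reading in context; the sketch only shows why interval data alone can already surface obvious waste.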

Home energy use
A number of companies are finding data-driven solutions for homeowners who want to save money by reducing their energy usage. A key to success is putting together measurements of energy use in the home with public data on energy efficiency solutions. PlotWatt, for example, promises to help consumers “save money with real-time energy tracking” through the data it provides. One of the best-known companies in this area, Opower, uses a psychological strategy: it simultaneously gives people access to their own energy data and lets them compare their energy use to their neighbors’ as an incentive to save. Opower partners with utilities to provide this information, and the Virginia-based company has been successful enough to open offices in San Francisco, London, and Singapore. Soon more and more people will have access to data on their home energy use: Green Button, a government-promoted program implemented by utilities, now gives about 100 million Americans data about their energy consumption.
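The neighbor-comparison feedback described above can be reduced to a one-line statistic. This is an illustrative sketch of the general idea Opower popularized, not its actual algorithm; the usage figures are invented.

```python
# Illustrative sketch of neighbor-comparison feedback (not Opower's
# actual algorithm): rank a household's monthly usage against similar
# nearby homes and phrase the result as an incentive to save.

def neighbor_comparison(my_kwh, neighbor_kwh):
    """Return the share of comparable homes that used more energy."""
    if not neighbor_kwh:
        raise ValueError("need at least one comparable home")
    used_more = sum(1 for kwh in neighbor_kwh if kwh > my_kwh)
    return used_more / len(neighbor_kwh)

neighbors = [850, 920, 1010, 760, 1100, 980]  # kWh/month, made-up values
share = neighbor_comparison(890, neighbors)
print(f"You used less than {share:.0%} of comparable homes.")
```

The hard part in practice is not this arithmetic but choosing genuinely comparable homes (similar size, climate, heating type), which is where utility partnerships and data like Green Button come in.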

Solar power and renewable energy
As solar power becomes more efficient and affordable, a number of companies are emerging to support this energy technology. Clean Power Finance, for example, uses its database to connect solar entrepreneurs with sources of capital. In a different way, a company called Solar Census is analyzing publicly available data to find exactly where solar power can be produced most efficiently. The kind of analysis that used to require an on-site survey over several days can now be done in less than a minute with their algorithms.
Other kinds of geospatial and weather data can support other forms of renewable energy. The data will make it easier to find good sites for wind power stations, water sources for small-scale hydroelectric projects, and the best opportunities to tap geothermal energy.

Supporting new energy-efficient vehicles
The Tesla and other electric vehicles are becoming commercially viable, and we will soon see even more efficient vehicles on the road. Toyota has announced that its first fuel-cell cars, which run on hydrogen, will be commercially available by mid-2015, and other auto manufacturers have announced plans to develop fuel-cell vehicles as well. But these vehicles can’t operate without a network to supply power, be it electricity for a Tesla battery or hydrogen for a fuel cell.
It’s a chicken-and-egg problem: People won’t buy large numbers of electric or fuel-cell cars unless they know they can power them, and power stations will be scarce until there are enough vehicles to support their business. Now some new companies are facilitating this transition by giving drivers data-driven tools to find and use the power sources they need. Recargo, for example, provides tools to help electric car owners find charging stations and operate their vehicles.
The development of new energy sources will involve solving social, political, economic, and technological issues. Data science can help develop solutions and bring us more quickly to a new kind of energy future.
Joel Gurin is senior advisor at the GovLab and project director of the Open Data 500. He also currently serves as a fellow of the U.S. Chamber of Commerce Foundation.

DrivenData


DrivenData Blog: “As we begin launching our first competitions, we thought it would be a good idea to lay out what exactly we’re trying to do and why….
At DrivenData, we want to bring cutting-edge practices in data science and crowdsourcing to some of the world’s biggest social challenges and the organizations taking them on. We host online challenges, usually lasting 2-3 months, where a global community of data scientists competes to come up with the best statistical model for difficult predictive problems that make a difference.
Just like every major corporation today, nonprofits and NGOs have more data than ever before. And just like those corporations, they are trying to figure out how to make the best use of their data. We work with mission-driven organizations to identify specific predictive questions that they care about answering and can use their data to tackle.
Then we host the online competitions, where experts from around the world vie to come up with the best solution. Some competitors are experienced data scientists in the private sector, analyzing corporate data by day, saving the world by night, and testing their mettle on complex questions of impact. Others are smart, sophisticated students and researchers looking to hone their skills on real-world datasets and real-world problems. Still others have extensive experience with social sector data and want to bring their expertise to bear on new, meaningful challenges – with immediate feedback on how well their solution performs.
Like any data competition platform, we want to harness the power of crowds combined with the increasing prevalence of large, relevant datasets. Unlike other data competition platforms, our primary goal is to create actual, measurable, lasting positive change in the world with our competitions. At the end of each challenge, we work with the sponsoring organization to integrate the winning solutions, giving them the tools to drive real improvements in their impact….
We are launching soon and we want you to join us!
If you want to get updates about our launch this fall with exciting, real competitions, please sign up for our mailing list here and follow us on Twitter: @drivendataorg.
If you are a data scientist, feel free to create an account and start playing with our first sandbox competitions.
If you are a nonprofit or public sector organization, and want to squeeze every drop of mission effectiveness out of your data, check out the info on our site and let us know!”

What Is Big Data?


datascience@berkeley Blog: ““Big Data.” It seems like the phrase is everywhere. The term was added to the Oxford English Dictionary in 2013, appeared in Merriam-Webster’s Collegiate Dictionary by 2014, and Gartner’s just-released 2014 Hype Cycle shows “Big Data” passing the “Peak of Inflated Expectations” and on its way down into the “Trough of Disillusionment.” Big Data is all the rage. But what does it actually mean?
A commonly repeated definition cites the three Vs: volume, velocity, and variety. But others argue that it’s not the size of data that counts, but the tools being used, or the insights that can be drawn from a dataset.
To settle the question once and for all, we asked 40+ thought leaders in publishing, fashion, food, automobiles, medicine, marketing and every industry in between how exactly they would define the phrase “Big Data.” Their answers might surprise you! Take a look below to find out what big data is:

  1. John Akred, Founder and CTO, Silicon Valley Data Science
  2. Philip Ashlock, Chief Architect of Data.gov
  3. Jon Bruner, Editor-at-Large, O’Reilly Media
  4. Reid Bryant, Data Scientist, Brooks Bell
  5. Mike Cavaretta, Data Scientist and Manager, Ford Motor Company
  6. Drew Conway, Head of Data, Project Florida
  7. Rohan Deuskar, CEO and Co-Founder, Stylitics
  8. Amy Escobar, Data Scientist, 2U
  9. Josh Ferguson, Chief Technology Officer, Mode Analytics
  10. John Foreman, Chief Data Scientist, MailChimp

FULL LIST at datascience@berkeley Blog”

Riding the Second Wave of Civic Innovation


Jeremy Goldberg at Governing: “Innovation and entrepreneurship in local government increasingly require mobilizing talent from many sectors and skill sets. Fortunately, the opportunities for nurturing cross-pollination between the public and private sectors have never been greater, thanks in large part to the growing role of organizations such as Bayes Impact, Code for America, Data Science for Social Good and Fuse Corps.
Indeed, there’s reason to believe that we might be entering an even more exciting period of public-private collaboration. As one local-government leader recently put it to me when talking about the critical mass of pro-bono civic-innovation efforts taking place across the San Francisco Bay area, “We’re now riding the second wave of civic pro-bono and civic innovation.”
As an alumnus of Fuse Corps’ executive fellows program, I’m convinced that the opportunities initiated by it and similar organizations are integral to civic innovation. Fuse Corps brings civic entrepreneurs with experience across the public, private and nonprofit sectors to work closely with government employees to help them negotiate project design, facilitation and management hurdles. The organization’s leadership training emphasizes “smallifying” — building innovation capacity by breaking big challenges down into smaller tasks in a shorter timeframe — and making “little bets” — low-risk actions aimed at developing and testing an idea.
Since 2012, I have managed programs and cross-sector networks for the Silicon Valley Talent Partnership. I’ve witnessed a groundswell of civic entrepreneurs from across the region stepping up to participate in discussions and launch rapid-prototyping labs focused on civic innovation.
Cities across the nation are creating new roles and programs to engage these civic start-ups. They’re learning that what makes these projects, and specifically civic pro-bono programs, work best is a process of designing, building, operationalizing and bringing them to scale. If you’re setting out to create such a program, here’s a short list of best practices:
Assets: Explore existing internal resources and knowledge to understand the history, departmental relationships and overall functions of the relevant agencies or departments. Develop a compendium of current service/volunteer programs.
City policies/legal framework: Determine what the city charter, city attorney’s office or employee-relations rules and policies say about procurement, collective bargaining and public-private partnerships.
Leadership: The support of the city’s top leadership is especially important during the formative stages of a civic-innovation program, so it’s important to understand how the city’s form of government will impact the program. For example, in a “strong mayor” government, a definitive decision on a public-private collaboration may not face the same scrutiny as it would under a “council/mayor” government.
Cross-departmental collaboration: This is essential. Without the support of city staff across departments, innovation projects are unlikely to take off. Convening a “tiger team” of individuals who are early adopters of such initiatives is an important step. Ultimately, city staffers best understand the needs and demands of their departments or agencies.
Partners from corporations and philanthropy: Leveraging existing partnerships will help to bring together an advisory group of cross-sector leaders and executives to participate in the early stages of program development.
Business and member associations: For the Silicon Valley Talent Partnership, the Silicon Valley Leadership Group has been instrumental in advocating for pro-bono volunteerism with the cities of Fremont, San Jose and Santa Clara….”