Your City Needs a Local Data Intermediary Now


Matt Lawyue and Kathryn Pettit at Next City: “Imagine if every community nationwide had access to their own data — data on which children are missing too many days of school, which neighborhoods are becoming unaffordable, or where more mothers are getting better access to prenatal care.

This is a reality in some areas, where neighborhood data is analyzed to evaluate community health and to promote development. Cleveland is studying cases of lead poisoning and the impact on school readiness and educational outcomes for children. Detroit is tracking the extent of property blight and abandonment.

But good data doesn’t just happen.

These activities are possible because of local intermediaries, groups that bridge the gap between data and local stakeholders: nonprofits, government agencies, foundations and residents. These groups access data that are often confidential and indecipherable to the public and make them accessible and useful. And with the support of the National Neighborhood Indicators Partnership (NNIP), groups around the country are championing community development at the local level.

Without a local data intermediary in Baltimore, we might know less about what happened there last year and why.

Freddie Gray’s death prompted intense discussion about police brutality and discrimination against African-Americans. But the Baltimore Neighborhood Indicators Alliance (BNIA) helped root this incident and others like it within a particular place, highlighting what can happen when disadvantage is allowed to accumulate over decades.

BNIA, an NNIP member, was formed in 2000 to help community organizations use data shared by government agencies. By the time of Gray’s death, BNIA had 15 years of data across more than 150 indicators that demonstrated clear socioeconomic disadvantages for residents of Gray’s neighborhood, Sandtown-Winchester. The neighborhood had a 34 percent housing vacancy rate and 23 percent unemployment; it lacked highway access and was poorly served by public transit, leaving residents cut off from jobs and services.

With BNIA’s help, national and local media outlets, including the New York Times, MSNBC and the Baltimore Sun, portrayed a community beset by concentrated poverty, while other Baltimore neighborhoods benefited from economic investment and rising incomes. BNIA data, which is updated yearly, has also been used to develop policy ideas to revitalize the neighborhood, from increasing the use of housing choice vouchers to tackling unemployment.

Local data intermediaries like BNIA harness neighborhood data to make underserved people and unresolved issues visible. They work with government agencies to access raw data (e.g., crime reports, property records, and vital statistics) and facilitate their use to improve quality of life for residents.

But it’s not easy. Uncovering useful, actionable information requires trust, technical expertise, knowledge of the local context and coordination among multiple stakeholders.

This is why the NNIP is vital. NNIP is a peer network of more than two dozen local data intermediaries and the Urban Institute, working to democratize data by building local capacity and planning joint activities. Before NNIP’s founding partners began their work, there were no advanced information systems documenting and tracking neighborhood indicators. Since 1996, NNIP has been a platform for sharing best practices, providing technical assistance, managing cross-site projects and analysis, and expanding the outreach of local data intermediaries to national networks and federal agencies. The partnership continues to grow. In order to foster this capacity in more places, NNIP has just released a guide for local communities to start a data intermediary….(More)”

Soon Your City Will Know Everything About You


Currently, the biggest users of these sensor arrays are in cities, where city governments use them to collect large amounts of policy-relevant data. In Los Angeles, the crowdsourced traffic and navigation app Waze collects data that helps residents navigate the city’s choked highway networks. In Chicago, an ambitious program makes public data available to startups eager to build apps for residents. The city’s 49th ward has been experimenting with participatory budgeting and online voting to take the pulse of the community on policy issues. Chicago has also been developing the “Array of Things,” a network of sensors that track, among other things, the urban conditions that affect bronchitis.

Edmonton uses the cloud to track the condition of playground equipment. And a growing number of countries have purpose-built smart cities, like South Korea’s high tech utopia city of Songdo, where pervasive sensor networks and ubiquitous computing generate immense amounts of civic data for public services.

The drive for smart cities isn’t restricted to the developed world. Rio de Janeiro coordinates the information flows of 30 different city agencies. In Beijing and Da Nang (Vietnam), mobile phone data is actively tracked in the name of real-time traffic management. Urban sensor networks, in other words, are also developing in countries with few legal protections governing the usage of data.

These services are promising and useful. But you don’t have to look far to see why the Internet of Things has serious privacy implications. Public data is used for “predictive policing” in at least 75 cities across the U.S., including New York City, where critics maintain that using social media or traffic data to help officers evaluate probable cause is a form of digital stop-and-frisk. In Los Angeles, the security firm Palantir scoops up publicly generated data on car movements, merges it with license plate information collected by the city’s traffic cameras, and sells analytics back to the city so that police officers can decide whether or not to search a car. In Chicago, concern is growing about discriminatory profiling because so much information is collected and managed by the police department — an agency with a poor reputation for handling data in consistent and sensitive ways. In 2015, video surveillance of the police shooting Laquan McDonald outside a Burger King was erased by a police employee who ironically did not know his activities were being digitally recorded by cameras inside the restaurant.

Since most national governments have bungled privacy policy, cities — which have a reputation for being better with administrative innovations — will need to fill this gap. A few countries, such as Canada and the U.K., have independent “privacy commissioners” who are responsible for advocating for the public when bureaucracies must decide how to use or give out data. It is pretty clear that cities need such advocates too.

What would Urban Privacy Commissioners do? They would teach the public — and other government staff — about how policy algorithms work. They would evaluate the political context in which city agencies make big data investments. They would help a city negotiate contracts that protect residents’ privacy while providing effective analysis to policy makers and ensuring that open data is consistently serving the public good….(more)”.

The Values of Public Library in Promoting an Open Government Environment


Djoko Sigit Sayogo et al in the Proceedings of the 17th International Digital Government Research Conference on Digital Government Research: “Public participation has been less than ideal in many government-implemented ICT initiatives. Extant studies highlight the importance of public libraries as an intermediary between citizens and government. This study evaluates the role of public libraries as mediating the relationship between citizens and government in support of an open government environment. Using data from a national survey of “Library and Technology Use” conducted by Pew Internet in 2015, we test whether a citizen’s perception of public values provided by public libraries influences the likelihood of the citizen’s engagement within open-government environment contexts. The results signify a significant relationship between certain public values provided by public libraries with the propensity of citizens engaging government in an online environment. Our findings further indicate that varying public values generate different results in regard to the way citizens are stimulated to use public libraries to engage with government online. These findings imply that programs designed and developed to take into account a variety of values are more likely to effectively induce citizen engagement in an open government environment through the mediation of public libraries….(More)”

While governments talk about smart cities, it’s citizens who create them


Carlo Ratti at the Conversation: “The Australian government recently released an ambitious Smart Cities Plan, which suggests that cities should be first and foremost for people:

If our cities are to continue to meet their residents’ needs, it is essential for people to engage and participate in planning and policy decisions that have an impact on their lives.

Such statements are a good starting point – and should probably become central to Australia’s implementation efforts. A lot of knowledge has been collected over the past decade from successful and failed smart cities experiments all over the world; reflecting on them could provide useful information for the Australian government as it launches its national plan.

What is a smart city?

But, before embarking on such a review, it would help to start from a definition of “smart city”.

The term has been used and abused in recent years, so much so that today it has lost meaning. It is often used to encompass disparate applications: we hear people talk and write about “smart city” when they refer to anything from citizen engagement to Zipcar, from open data to Airbnb, from smart biking to broadband.

Where to start with a definition? It is a truism to say the internet has transformed our lives over the past 20 years. Everything in the way we work, meet, mate and so on is very different today than it was just a few decades ago, thanks to a network of connectivity that now encompasses most people on the planet.

In a similar way, we are today at the beginning of a new technological revolution: the internet is entering physical space – the very space of our cities – and is becoming the Internet of Things; it is opening the door to a new world of applications that, as with the first wave of the internet, can incorporate many domains….

What should governments do?

In the above technological context, what should governments do? Over the past few years, the first wave of smart city applications followed technological excitement.

For instance, some of Korea’s early experiments such as Songdo City were engineered by the likes of Cisco, with technology deployment assisted by top-down policy directives.

In a similar way, in 2010, Rio de Janeiro launched the Integrated Centre of Command and Control, engineered by IBM. It’s a large control room for the city, which collects real-time information from cameras and myriad sensors suffused in the urban fabric.

Such approaches revealed many shortcomings, most notably the lack of civic engagement. It is as if they thought of the city simply as a “computer in open air”. These approaches led to several backlashes in the research and academic community.

A more interesting lesson can come from the US, where the focus is more on developing a rich Internet of Things innovation ecosystem. There are many initiatives fostering spaces – digital and physical – for people to come together and collaborate on urban and civic innovations….

That isn’t to say that governments should take a completely hands-off approach to urban development. Governments certainly have an important role to play. This includes supporting academic research and promoting applications in fields that might be less appealing to venture capital – unglamorous but nonetheless crucial domains such as municipal waste or water services.

The public sector can also promote the use of open platforms and standards in such projects, which would speed up adoption in cities worldwide.

Still, the overarching goal should always be to focus on citizens. They are in the best position to determine how to transform their cities and to make decisions that will have – as the Australian Smart Cities Plan puts it – “an impact on their lives”….(more)”

Foundation Transparency: Game Over?


Brad Smith at Glass Pockets (Foundation Center): “The tranquil world of America’s foundations is about to be shaken, but if you read the Center for Effective Philanthropy’s (CEP) new study — Sharing What Matters: Foundation Transparency — you would never know it.

Don’t get me wrong. That study, like everything CEP produces, is carefully researched, insightful and thoroughly professional. But it misses the single biggest change in foundation transparency in decades: the imminent release by the Internal Revenue Service of foundation 990-PF (and 990) tax returns as machine-readable open data.

Clara Miller, President of the Heron Foundation, writes eloquently in her manifesto, Building a Foundation for the 21st Century: “…the private foundation model was designed to be protective and separate, much like a terrarium.”

Terrariums, of course, are highly “curated” environments over which their creators have complete control. The CEP study proves that point: much of the study consists of interviews with foundation leaders and reviews of their websites, as if transparency were an optional endeavor in which foundations may choose whether, and to what degree, to participate.

To be fair, CEP also interviewed the grantees of various foundations (sometimes referred to as “partners”), which helps convey the reality that foundations have stakeholders beyond their four walls. However, the terrarium metaphor is about to become far more relevant as the release of 990 tax returns as open data will literally make it possible for anyone to look right through those glass walls to the curated foundation world within.

What Is Open Data?

It is safe to say that most foundation leaders and a fair majority of their staff do not understand what open data really is. Open data is free, yes, but more importantly it is digital and machine-readable. This means it can be consumed in enormous volumes at lightning speed, directly by computers.

Once consumed, open data can be tagged, sorted, indexed and searched using statistical methods to make obvious comparisons while discovering previously undetected correlations. Anyone with a computer, some coding skills and a hard drive or cloud storage can access open data. In today’s world, a lot of people meet those requirements, and they are free to do whatever they please with your information once it is, as open data enthusiasts like to say, “in the wild.”

What is the Internal Revenue Service Releasing?

Thanks to the Aspen Institute’s leadership of a joint effort – funded by foundations and including Foundation Center, GuideStar, the National Center for Charitable Statistics, the Johns Hopkins Center for Civil Society Studies, and others – the IRS has started to make some 1,000,000 Form 990s and 40,000 Form 990-PFs available as machine-readable open data.

Previously, all Form 990s had been released as image (TIFF) files, essentially pictures, making it both time-consuming and expensive to extract useful data from them. Credit where credit is due: a kick in the butt in the form of a lawsuit from open data crusader Carl Malamud helped speed the process along.

The current test phase includes only those tax returns that were digitally filed by nonprofits and community foundations (990s) and private foundations (990-PFs). Over time, the IRS will phase in a mandatory digital filing requirement for all Form 990s, and the intent is to release them all as open data. In other words, that which is born digital will be opened up to the public in digital form. Because of variations in the 990 forms, getting the information from them into a database will still require some technical expertise, but will be far more feasible and faster than ever before.
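To illustrate why machine-readable filings matter, here is a minimal sketch of extracting fields from a digitally filed return. The XML layout below is invented for illustration; real IRS e-file 990 schemas are versioned, namespaced, and considerably more complex:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified 990-PF fragment. Real IRS e-file XML uses
# versioned schemas with namespaces and different element names.
sample_990pf = """
<Return>
  <Filer><Name>Example Foundation</Name><EIN>123456789</EIN></Filer>
  <TotalAssets>25000000</TotalAssets>
  <GrantsPaid>1200000</GrantsPaid>
</Return>
"""

def summarize(xml_text):
    """Extract a few fields into a plain dict, ready to load as a database row."""
    root = ET.fromstring(xml_text)
    return {
        "name": root.findtext("Filer/Name"),
        "ein": root.findtext("Filer/EIN"),
        "total_assets": int(root.findtext("TotalAssets")),
        "grants_paid": int(root.findtext("GrantsPaid")),
    }

row = summarize(sample_990pf)
print(row["name"], row["grants_paid"])  # Example Foundation 1200000
```

Parsing a TIFF image of the same return would require OCR and manual checking; parsing XML is a few lines of code, which is the difference the open data release makes.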

The Good

The work of organizations like Foundation Center – which have built expensive infrastructure to turn years of 990 tax returns into information that can be used by nonprofits looking for funding, by researchers trying to understand the role of foundations, and by foundations themselves seeking to benchmark against peers – will be transformed.

Work will shift away from the mechanics of capturing and processing the data to higher-level analysis and visualization to stimulate the generation and sharing of new insights and knowledge. This will fuel greater collaboration between peer organizations, innovation, the merging of previously disparate bodies of data, better philanthropy, and a stronger social sector… (more)

 

Value and Vulnerability: The Internet of Things in a Connected State Government


Pressrelease: “The National Association of State Chief Information Officers (NASCIO) today released a policy brief on the Internet of Things (IoT) in state government. The paper focuses on the different ways state governments are using IoT now and in the future and the policy considerations involved.

“In NASCIO’s 2015 State CIO Survey, we asked state CIOs to what extent IoT was on their agenda. Just over half said they were in informal discussions; however, only one in five had moved to the formal discussion phase. We believe IoT needs to be a formal part of each state’s policy considerations,” explained NASCIO Executive Director Doug Robinson.

The paper encourages state CIOs to make IoT part of the enterprise architecture discussions on asset management and risk assessment and to develop an IoT roadmap.

“Cities and municipalities have been working toward the designation of ‘smart city’ for a while now,” said Darryl Ackley, cabinet secretary for the New Mexico Department of Information Technology and NASCIO president. “While states provide different services than cities, we are seeing a lot of activity around IoT to improve citizen services and we see great potential for growth. The more organized and methodical states can be about implementing IoT, the more successful and useful the outcomes.”

Read the policy brief at www.NASCIO.org/ValueAndVulnerability 

Is civic technology the killer app for democracy?


 at TechCrunch: “Smartphone apps have improved convenience for public transportation in many urban centers. In Washington, DC, riders can download apps to help them figure out where to go, when to show up and how long to wait for a bus or train. However, the problem with public transport in DC is not the lack of modern, helpful and timely information. The problem is that the Metro subway system is on fire.

Critical infrastructure refers to the vital systems that connect us. Like the water catastrophe in Flint, Michigan, and our crumbling roads, bridges and airports, the Metro system in DC is experiencing a systems failure. The Metro’s problems arise from typical public challenges like poor management and deferred maintenance.

Upgrades of physical infrastructure are not easy and nimble like a software patch or an agile design process. They are slow, expensive and subject to deliberation and scrutiny. In other words, they are the fundamental substance of democratic decision-making: big decisions with long-term implications that require thoughtful strategy, significant investment, political leadership and public buy-in.

A killer app is an application you love so much you buy into a whole new way of doing things. Email and social media are good examples of killer apps. The killer app for Metro would have to get political leaders to look beyond their narrow, short-term interests and be willing to invest in modern public transportation for our national capital region.

The same is true for fixing our critical infrastructure throughout the nation. The killer apps for the systems on which we rely daily won’t be technical, they will be human. It will be Americans working together to build a technology-enabled resilient democracy — one that is inclusive, responsive and successful in the Information Age.

In 2007, the I-35 bridge in Minneapolis collapsed into the Mississippi river. During his presidential bid, Senator John McCain used this event as an example of the failure of our leaders to make trade-offs for common national purpose. Case in point: an extravagantly expensive congressionally funded Alaskan “bridge to nowhere” that served just a handful of people on an island. But how many apps to nowhere are we building?

In DC, commuters who can afford alternatives will leave Metro. They’ll walk, drive, order a car service or locate a bikeshare. The people who suffer from the public service risk and imbalance of the current Metro system are those who have no choice.

So here’s the challenge: Modern technology needs to create an inclusive society. Our current technical approach too often means that we’re prioritizing progress or profit for the few over the many. This pattern defeats the purpose of both the technology revolution and American democracy. Government and infrastructure are supposed to serve everyone, but technology thus far has made it so that public failures affect some Americans more than others. …

For democracy to succeed in the Information Age, we’ll need some new rules of engagement with technology. The White House recently released its third report on data and its implications for society. The 2016 report pays special attention to the ethics of machine automation and algorithms. The authors stress the importance of ethical analytics and propose the principle of “equal opportunity by design.” It’s an excellent point of departure as we recalibrate old systems and build new bridges to a more resilient, inclusive and prosperous nation….(more)”

Real-Time Data Can Improve Traffic Management in Major Cities


World Bank: “Traffic management agencies and city planners will soon have access to real-time data to better manage traffic flows on the streets of Cebu City and Metro Manila.

Grab, the World Bank, and the Department of Transportation and Communications (DOTC) today launched the OpenTraffic initiative, which will help address traffic congestion and road safety challenges.

Grab is the leading ride-hailing platform in Southeast Asia and operates in 30 cities across six countries – Singapore, Indonesia, Philippines, Malaysia, Thailand, and Vietnam.

Grab and the World Bank have been developing free, open-source tools that translate Grab’s voluminous driver GPS data into traffic statistics, including speeds, flows, and intersection delays. These statistics power open-source big data tools such as OpenTraffic, for analysing traffic speeds and flows, and DRIVER, for identifying road incident blackspots and improving emergency response. Grab and the World Bank plan to make OpenTraffic available to other Southeast Asian city governments in the near future.
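The core of translating driver GPS data into traffic statistics can be sketched simply: pair consecutive pings from the same driver on the same road segment, compute distance over time, and average. This is a toy illustration with invented ping data, not OpenTraffic’s actual pipeline (which handles map-matching, outliers, and far larger volumes):

```python
from collections import defaultdict
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical pings: (driver_id, road_segment, timestamp_seconds, lat, lon)
pings = [
    ("d1", "seg_A", 0,   14.5995, 120.9842),
    ("d1", "seg_A", 60,  14.6040, 120.9842),
    ("d2", "seg_A", 0,   14.5995, 120.9842),
    ("d2", "seg_A", 120, 14.6040, 120.9842),
]

def segment_speeds(pings):
    """Mean speed (km/h) per road segment from consecutive pings of each driver."""
    by_driver = defaultdict(list)
    for driver, seg, t, lat, lon in pings:
        by_driver[(driver, seg)].append((t, lat, lon))
    speeds = defaultdict(list)
    for (_, seg), pts in by_driver.items():
        pts.sort()  # order each driver's pings by time
        for (t1, la1, lo1), (t2, la2, lo2) in zip(pts, pts[1:]):
            dt_h = (t2 - t1) / 3600.0
            if dt_h > 0:
                speeds[seg].append(haversine_km(la1, lo1, la2, lo2) / dt_h)
    return {seg: sum(v) / len(v) for seg, v in speeds.items()}

print(segment_speeds(pings))  # seg_A: the mean of roughly 30 km/h and 15 km/h
```

Aggregated per segment and time window, these means are exactly the kind of statistic that can flag chronic congestion or, fed into a tool like DRIVER, help locate incident blackspots.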

“Using big data is one of the potential solutions to the challenges faced by our transport systems. Through this we can provide accurate, real-time information for initiatives that can help alleviate traffic congestion and improve road safety,” said DOTC Secretary Joseph Emilio A. Abaya.

Last month, the World Bank and DOTC helped train more than 200 government staff from the agency, the Philippine National Police (PNP), the Metro Manila Development Authority (MMDA), the Department of Public Works and Highways (DPWH), and the Cebu City Transportation Office on the use of the OpenTraffic platform….In the near future, traffic statistics derived through OpenTraffic will be fed into another application called “DRIVER” or Data for Road Incident Visualization, Evaluation, and Reporting for road incident recording and analysis. This application, developed by the World Bank, will help engineering units to prioritize crash-prone areas for interventions and improve emergency response….(More)”

Improving patient care by bridging the divide between doctors and data scientists


 at the Conversation: “While wonderful new medical discoveries and innovations are in the news every day, doctors struggle daily with using information and techniques available right now while carefully adopting new concepts and treatments. As a practicing doctor, I deal with uncertainties and unanswered clinical questions all the time….At the moment, a report from the National Academy of Medicine tells us, most doctors base most of their everyday decisions on guidelines from (sometimes biased) expert opinions or small clinical trials. It would be better if they were from multicenter, large, randomized controlled studies, with tightly controlled conditions ensuring the results are as reliable as possible. However, those are expensive and difficult to perform, and even then often exclude a number of important patient groups on the basis of age, disease and sociological factors.

Part of the problem is that health records are traditionally kept on paper, making them hard to analyze en masse. As a result, most of what medical professionals might have learned from experiences was lost – or at least was inaccessible to another doctor meeting with a similar patient.

A digital system would collect and store as much clinical data as possible from as many patients as possible. It could then use information from the past – such as blood pressure, blood sugar levels, heart rate and other measurements of patients’ body functions – to guide future doctors to the best diagnosis and treatment of similar patients.
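One simple form such guidance could take is retrieval: find past patients whose measurements most resemble the current one, and see what worked for them. The sketch below is a toy nearest-neighbour example with invented vitals and treatment labels, not the methodology of any actual clinical system (real use would require normalized features, many more variables, and clinical validation):

```python
from math import sqrt

# Hypothetical past patients: a few vitals plus the treatment that worked.
# Feature order: (systolic blood pressure, blood glucose mg/dL, heart rate bpm)
past_patients = [
    {"vitals": (120, 90, 72),  "outcome": "treatment_A"},
    {"vitals": (160, 210, 95), "outcome": "treatment_B"},
    {"vitals": (118, 95, 70),  "outcome": "treatment_A"},
    {"vitals": (155, 190, 99), "outcome": "treatment_B"},
]

def most_similar_outcome(vitals, records, k=3):
    """Return the majority outcome among the k most similar past patients."""
    def dist(a, b):
        # Euclidean distance; real systems would normalize each feature first.
        return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(records, key=lambda r: dist(vitals, r["vitals"]))[:k]
    outcomes = [r["outcome"] for r in nearest]
    return max(set(outcomes), key=outcomes.count)

# A new patient with high blood pressure and glucose resembles the
# treatment_B cases more closely than the treatment_A cases.
print(most_similar_outcome((158, 200, 97), past_patients))  # treatment_B
```

The value of a large store of digitized records is precisely that such comparisons, impossible with paper charts, become a routine query.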

Industrial giants such as Google, IBM, SAP and Hewlett-Packard have also recognized the potential for this kind of approach, and are now working on how to leverage population data for the precise medical care of individuals.

Collaborating on data and medicine

At the Laboratory of Computational Physiology at the Massachusetts Institute of Technology, we have begun to collect large amounts of detailed patient data in the Medical Information Mart in Intensive Care (MIMIC). It is a database containing information from 60,000 patient admissions to the intensive care units of the Beth Israel Deaconess Medical Center, a Boston teaching hospital affiliated with Harvard Medical School. The data in MIMIC has been meticulously scoured so individual patients cannot be recognized, and is freely shared online with the research community.

But the database itself is not enough. We bring together front-line clinicians (such as nurses, pharmacists and doctors) to identify questions they want to investigate, and data scientists to conduct the appropriate analyses of the MIMIC records. This gives caregivers and patients the best individualized treatment options in the absence of a randomized controlled trial.

Bringing data analysis to the world

At the same time we are working to bring these data-enabled systems to assist with medical decisions to countries with limited health care resources, where research is considered an expensive luxury. Often these countries have few or no medical records – even on paper – to analyze. We can help them collect health data digitally, creating the potential to significantly improve medical care for their populations.

This task is the focus of Sana, a collection of technical, medical and community experts from across the globe that is also based in our group at MIT. Sana has designed a digital health information system specifically for use by health providers and patients in rural and underserved areas.

At its core is an open-source system that uses cellphones – common even in poor and rural nations – to collect, transmit and store all sorts of medical data. It can handle not only basic patient data such as height and weight, but also photos and X-rays, ultrasound videos, and electrical signals from a patient’s brain (EEG) and heart (ECG).

Partnering with universities and health organizations, Sana organizes training sessions (which we call “bootcamps”) and collaborative workshops (called “hackathons”) to connect nurses, doctors and community health workers at the front lines of care with technology experts in or near their communities. In 2015, we held bootcamps and hackathons in Colombia, Uganda, Greece and Mexico. The bootcamps teach students in technical fields like computer science and engineering how to design and develop health apps that can run on cellphones. Immediately following the bootcamp, the medical providers join the group and the hackathon begins…At the end of the day, though, the purpose is not the apps….(More)

How innovation agencies work


Kirsten Bound and Alex Glennie at NESTA: “This report considers how governments can get better at designing and running innovation agencies, drawing on examples from around the world.

Key findings

  • There is no single model for a ‘successful’ innovation agency. Although there is much to learn from other countries about best practice in institution and programme design, attempts to directly replicate organisational models that operate in very different contexts are likely to fail.
  • There are a variety of roles that innovation agencies can play. From our case studies, we have identified a number of different approaches that an innovation agency might take, depending on the specific nature of a country’s innovation system, the priorities of policymakers, and available resources.
  • Innovation agencies need a clear mission, but an ability to adapt and experiment. Working towards many different objectives at once or constantly changing strategic direction can make it difficult for an innovation agency to deliver impactful innovation support for businesses. However, a long-term vision of what success looks like should not prevent innovation agencies from experimenting with new approaches, and responding to new needs and opportunities.
  • Innovation agencies should be assessed both quantitatively and qualitatively. Evaluations tend to focus on the financial return they generate, but our research suggests that more effort needs to be put into assessing some of the more qualitative aspects of their role, including the quality of their management, their ability to take (and learn from) strategic risks, and the skill with which they design and implement their programmes.
  • Governments should be both ambitious and realistic about what they expect an innovation agency to achieve. An innovation agency’s role will inevitably be affected by shifts in government priorities. Understanding how innovation agencies shape (and are shaped by) the broader political environment around innovation is a necessary part of ensuring that they are able to deliver on their potential.

Governments around the world are looking for ways to nurture innovative businesses, as a way of solving some of their most urgent economic and societal challenges. Many seek to do this by setting up national innovation agencies: institutions that provide financial and other support to catalyse or drive private sector innovation. Yet we still know relatively little about the range of approaches that these agencies take, what programmes and instruments are likely to work best in a given context, and how to assess their long-term impact.

We have been investigating these questions by studying a diverse selection of innovation agencies in ten different countries. Our aim has been to improve understanding of the range of existing institutional models and to learn more about their design, evolution and effectiveness. In doing so, we have developed a broad framework to help policymakers think about the set of choices and options they face in the design and management of an innovation agency….(More)”