New traffic app and disaster prevention technology road tested


Phys.org: “A new smartphone traffic app tested by citizens in Dublin, Ireland allows users to give feedback on traffic incidents, enabling traffic management centres to respond more quickly when collisions and other incidents happen around the city. The ‘CrowdAlert’ app, which is now available for download, is one of the key components utilised in the EU-funded INSIGHT project and a good example of how smartphones and social networks can be harnessed to improve public services and safety.

‘We are witnessing an explosion in the quantity, quality, and variety of available information, fuelled in large part by advances in sensor networking, the availability of low-cost sensor-enabled devices and by the widespread adoption of powerful smartphones,’ explains project coordinator Professor Dimitrios Gunopulos from the National and Kapodistrian University of Athens. ‘These revolutionary technologies are driving the development and adoption of applications where mobile devices are used for continuous data sensing and analysis.’

The project also developed a novel citywide real-time traffic monitoring tool, the ‘INSIGHT System’, which was tested in real conditions in the Dublin City control room, along with nationwide disaster monitoring technologies. The INSIGHT system was shown to provide early warnings to experts at situation centres, enabling them to monitor situations in real time, including disasters with potentially nationwide impacts such as severe weather conditions, floods and subsequent knock-on events such as fires and power outages.

The project’s results will be of interest to public services, which have until now lacked the necessary infrastructure for handling and integrating miscellaneous data streams, including data from static and mobile sensors as well as information coming from social network sources, in real time. Providing cities with the ability to manage emergency situations with enhanced capabilities will also open up new markets for network technologies….(More)”

The Human Face of Big Data


A film by Sandy Smolan [56 minutes]: “Big Data is defined as the real-time collection, analysis, and visualization of vast amounts of information. In the hands of data scientists, this raw information is fueling a revolution which many people believe may have as big an impact on humanity going forward as the Internet has had over the past two decades. It enables us to sense, measure, and understand aspects of our existence in ways never before possible.

The Human Face of Big Data captures an extraordinary revolution sweeping, almost invisibly, through business, academia, government, healthcare, and everyday life. It’s already enabling us to provide a healthier life for our children. To provide our seniors with independence while keeping them safe. To help us conserve precious resources like water and energy. To alert us to tiny changes in our health, weeks or years before we develop a life-threatening illness. To peer into our own individual genetic makeup. To create new forms of life. And soon, as many predict, to re-engineer our own species. And we’ve barely scratched the surface…

This massive gathering and analyzing of data in real time is allowing us to address some of humanity’s biggest challenges. Yet, as Edward Snowden and the release of the NSA documents have shown, the accessibility of all this data can come at a steep price….(More)”

New Human Needs Index fills a data void to help those in need


Scott W. Allard at Brookings: “My 2009 book, “Out of Reach,” examined why it can be hard for poor families to get help from the safety net. One critical barrier is the lack of information about local program resources and nonprofit social service organizations. Good information is key to finding help, but also important if we are to target resources effectively and assess whether program investments were successful.

As I prepared data for the book in 2005, my research team struggled to compile useful information about services and programs in the three major metro areas at the center of the study. We grappled with out-of-date print directories, incomplete online listings, bad addresses, disconnected phone numbers, and inaccurate information about the availability of services. It wasn’t clear families experiencing hardship could easily find the help they needed. Nor was it clear how potential volunteers or donors could know where to direct their energies, or how communities could know whether they were deploying adequate and relevant safety net resources. In the book’s conclusion, however, I was optimistic things would get better. A mix of emerging technology, big data systems, and a generation of young entrepreneurs would certainly close these information gaps over the next several years.

Recently, I embarked upon an effort to again identify the social service organizations operating in one of the book’s original study sites. To my surprise, the work was much harder this time around. Print directories are artifacts of the past. Online referral tools provided only spotty coverage. Addresses and service information can still be quite out of date. In many local communities, it felt as if there was less information available now than a decade ago.

Lack of data about local safety net programs, particularly nonprofit organizations, has long been a problem for scholars, community advocates, nonprofit leaders, and philanthropists. Data about providers and populations served are expensive to collect, update, and disseminate. There are no easy ways to monetize data resources or find regular revenue streams to support data work. There are legal obstacles and important concerns about confidentiality. Many organizations don’t have the resources to do much analytic or learning work.

The result is striking. We spend tens of billions of dollars on social services for low-income households each year, but we have only the vaguest ideas of where those dollars go, what impact they have, and where unmet needs exist.

Into this information void steps the Salvation Army and the Lilly Family School of Philanthropy at Indiana University with a possible path forward. Working together and with an advisory board of scholars, the Salvation Army and the Lilly School have created a real-time Human Needs Index drawn from service provision tracking systems maintained by more than 7,000 Salvation Army sites nationwide. The index provides useful insight into consumption of an array of emergency services (e.g., food, shelter, clothing) at a given place and point in time across the entire country…(More)”
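Mechanically, a composite index of this kind normalizes each service-consumption series (meals served, nights of shelter, items of clothing, and so on) and combines them into one score per period. The sketch below is illustrative only, with invented numbers and a generic z-score design; it is not the actual Human Needs Index methodology, which belongs to the Salvation Army and the Lilly School:

```python
import statistics

def composite_index(series_by_service):
    """series_by_service maps a service name to a list of consumption
    counts, one per time period. Z-score each service's series, then
    average across services to get one composite value per period."""
    n_periods = len(next(iter(series_by_service.values())))
    zscores = {}
    for service, counts in series_by_service.items():
        mean = statistics.fmean(counts)
        sd = statistics.pstdev(counts) or 1.0  # guard against a constant series
        zscores[service] = [(c - mean) / sd for c in counts]
    return [statistics.fmean(z[t] for z in zscores.values())
            for t in range(n_periods)]

# Invented counts for three periods; the last period shows elevated need.
index = composite_index({
    "meals":    [100, 120, 160],
    "shelter":  [40, 44, 60],
    "clothing": [10, 9, 14],
})
```

Z-scoring keeps any one high-volume service (meals dwarf clothing orders in raw counts) from dominating the composite.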

Statistical objectivity is a cloak spun from political yarn


Angus Deaton at the Financial Times: “The word data means things that are “given”: baseline truths, not things that are manufactured, invented, tailored or spun. Especially not by politics or politicians. Yet this absolutist view can be a poor guide to using the numbers well. Statistics are far from politics-free; indeed, politics is encoded in their genes. This is ultimately a good thing.

We like to deal with facts, not factoids. We are scandalised when politicians try to censor numbers or twist them, and most statistical offices have protocols designed to prevent such abuse. Headline statistics often seem simple but typically have many moving parts. A clock has two hands and 12 numerals yet underneath there may be thousands of springs, cogs and wheels. Politics is not only about telling the time, or whether the clock is slow or fast, but also about how to design the cogs and wheels. Down in the works, even where the decisions are delegated to bureaucrats and statisticians, there is room for politics to masquerade as science. A veneer of apolitical objectivity can be an effective disguise for a political programme.

Just occasionally, however, the mask drops and the design of the cogs and wheels moves into the glare of frontline politics. Consumer price indexes are leading examples of this. Britain’s first consumer price index was based on spending patterns from 1904. Long before the second world war, these weights were grotesquely outdated. During the war, the cabinet was worried about a wage-price spiral and the Treasury committed to hold the now-irrelevant index below the astonishingly precise value of 201.5 (1914=100) through a policy of food subsidies. It would, for example, respond to an increase in the price of eggs by lowering the price of sugar. Reform of the index would have jeopardised the government’s ability to control it and was too politically risky. The index was not updated until 1947….
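The mechanism behind this episode, a fixed-weight (Laspeyres-style) index whose level the Treasury could steer by subsidising heavily weighted items, can be shown with a small sketch. All prices, quantities, and item choices here are invented for illustration:

```python
def laspeyres_index(base_prices, current_prices, base_quantities, base=100.0):
    """Fixed-weight price index: cost of the base-period basket at
    current prices, relative to its cost at base-period prices."""
    base_cost = sum(p * q for p, q in zip(base_prices, base_quantities))
    current_cost = sum(p * q for p, q in zip(current_prices, base_quantities))
    return base * current_cost / base_cost

# A basket with fixed (and, like 1904's, potentially outdated) weights.
quantities  = [10, 5, 2]        # e.g. bread, eggs, sugar
base_prices = [1.0, 2.0, 3.0]   # base-period prices
prices      = [1.2, 2.6, 3.0]   # eggs have risen sharply

before = laspeyres_index(base_prices, prices, quantities)
# Subsidising sugar offsets the egg rise and holds the headline index down,
# exactly the kind of lever the wartime Treasury pulled.
subsidised = [1.2, 2.6, 1.5]
after = laspeyres_index(base_prices, subsidised, quantities)
```

Because the weights never update, any item in the basket becomes a dial the government can turn to manage the published number.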

These examples show the role of politics needs to be understood and built into any careful interpretation of the data. We must always work from multiple sources, and look deep into the cogs and wheels. James Scott, the political scientist, noted that statistics are how the state sees. The state decides what it needs to see and how to see it. That politics infuses every part of this is a testament to the importance of the numbers; lives depend on what they show.

For global poverty or hunger statistics, there is no state and no one’s material wellbeing depends on them. Politicians are less likely to interfere with such data, but this also removes a vital source of monitoring and accountability. Politics is a danger to good data; but without politics data are unlikely to be good, or at least not for long….(More)”


Teaching Open Data for Social Movements: a Research Strategy


Alan Freihof Tygel and Maria Luiza Machado Campo at the Journal of Community Informatics: “Since the year 2009, the release of public government data in open formats has been configured as one of the main actions taken by national states in order to respond to demands for transparency and participation by the civil society. The United States and the United Kingdom were pioneers, and today over 46 countries have their own Open Government Data Portal, many of them fostered by the Open Government Partnership (OGP), an international agreement aimed at stimulating transparency.

The premise of these open data portals is that, by making data publicly available in re-usable formats, society would take care of building applications and services, and gain value from this data (Huijboom & Broek, 2011). According to the same authors, the discourse around open data policies also includes increasing democratic control and participation and strengthening law enforcement.

Several recent works argue that the impact of open data policies, especially the release of open data portals, is still difficult to assess (Davies & Bawa, 2012; Huijboom & Broek, 2011; Zuiderwijk, Janssen, Choenni, Meijer, & Alibaks, 2012). One important consideration is that “The gap between the promise and reality of OGD [Open Government Data] re-use cannot be addressed by technological solutions alone” (Davies, 2012). Therefore, sociotechnical approaches (Mumford, 1987) are mandatory.

The targeted users of open government data lie over a wide range that includes journalists, non-governmental organizations (NGO), civil society organizations (CSO), enterprises, researchers and ordinary citizens who want to audit governments’ actions. Among them, the focus of our research is on social (or grassroots) movements. These are groups of organized citizens at local, national or international level who drive some political action, normally placing themselves in opposition to the established power relations and claiming rights for oppressed groups.

One definition from the literature describes social movements as “collective social actions with a socio-political and cultural approach, which enable distinct forms of organizing the population and expressing their demands” (Gohn, 2011).

Social movements have been using data in their action repertoires with several motivations (as can be seen in Table 1 and Listing 1). From our experience, an overview of several cases where social movements use open data reveals two main motivations: gaining a better understanding of reality and building a more solid basis for their claims. Additionally, in some cases data produced by the social movements themselves was used to build a counter-hegemonic discourse based on data. An interesting example is the Citizen Public Debt Audit Movement, which takes place in Brazil. This movement, which is part of an international network, claims that “significant amounts registered as public debt do not correspond to money collected through loans to the country” (Fattorelli, 2011), and thus the origins of this debt should be proven. According to the movement, in 2014 45% of Brazil’s federal spending went to debt service.

Recently, a number of works tried to develop comparison schemes between open data strategies (Atz, Heath, & Fawcet, 2015; Caplan et al., 2014; Ubaldi, 2013; Zuiderwijk & Janssen, 2014). Huijboom & Broek (2011) listed four categories of instruments applied by the countries to implement their open data policies:

  • voluntary approaches, such as general recommendations,
  • economic instruments,
  • legislation and control, and
  • education and training.

One of the conclusions is that the latter was used to a lesser extent than the others.

Social movements, in general, are composed of people with little experience of informatics, whether for lack of opportunity or of interest. Although it is recognized that using data is important for a social movement’s objectives, this lack of training still hinders wider use of it.

In order to address this issue, an open data course for social movements was designed. Besides building a strategy on open data education, the course also aims to be a research strategy to understand three aspects:

  • the motivations of social movements for using open data;
  • the impediments that block a wider and better use; and
  • possible actions to be taken to enhance the use of open data by social movements….(More)”

Strengthening the Connective Links in Government


John M. Kamensky at the IBM Center for The Business of Government: “Over the past five years, the Obama administration has pursued a host of innovation-fostering initiatives that work to strengthen the connective links among and within federal agencies.

Many factors contribute to the rise of such efforts, including presidential support, statutory encouragement, and an ongoing evolution in the way government does its business. The challenge now is how to solidify the best of them so they remain in place beyond the upcoming 2017 presidential transition.

Increased Use of Collaborative Governance

Dr. Rosemary O’Leary, an astute observer of trends in government, describes how government has steadily increased its use of collaborative approaches in lieu of the traditional hierarchical, bureaucratic approach. According to O’Leary, there are several explanations for this shift:

  • First, “most public challenges are larger than one organization, requiring new approaches to addressing public issues” such as housing, pollution, transportation, and healthcare.
  • Second, collaboration helps to improve the effectiveness and performance of programs “by encouraging new ways of providing services.”
  • Third, technology advances in recent years have helped “organizations and their employees to share information in a way that is integrative and interoperable.”
  • Finally, “citizens are seeking additional avenues for engaging in governance, resulting in new and different forms of collaborative problem solving and decision making.”

Early in his administration, President Barack Obama publicly placed a premium on the use of collaboration. One of his first directives to federal agencies set the tone for how he envisioned his administration would govern, directing agencies to be “collaborative” and “use innovative tools, methods, and systems to cooperate among themselves, across levels of government, and with nonprofits, businesses and individuals.” To that end, the Obama administration undertook a series of supporting actions, including establishing cross-agency priority goals around issues such as reducing veteran homelessness, data sharing, and streamlining the sharing of social media licenses between agencies. Tackling many of these issues successfully involved the transformative intersection of innovation and technology.

In 2010, when Congress passed a series of amendments to the Government Performance and Results Act (GPRA), it provided the statutory basis for a broader, more consistent use of collaboration as a way of implementing policies and programs. These changes put in place a series of administrative processes:

  • The designation of agency and cross-agency priority goals
  • The naming of goal leaders
  • The convening of a set of regular progress reviews

Taken together, these legislative changes embedded the value of collaboration into the administrative fabric of the governing bureaucracy. In addition, the evolution of technology tools and the advances in the use of social media have dramatically lowered the technical and bureaucratic barriers to working in a more collaborative environment….(More)”

The Federal Advisory Committee Act: Analysis of Operations and Costs


Wendy Ginsberg at CRS: “Federal advisory committees are established to allow experts from outside the federal government to provide advice and recommendations to executive branch agencies or the President. Federal advisory committees can be created either by Congress, the President, or an executive branch agency. The Federal Advisory Committee Act (FACA) requires agencies to report on the structure, operations, and costs of qualifying federal advisory committees. The General Services Administration (GSA) is authorized to collect, retain, and verify the reported information, and does so using an online tool called the FACA Database.

This report provides an overview of the data that populates the FACA Database, which details the costs and operations of all active federal advisory committees. This report examines the data from FY2004-FY2014, with additional in-depth analysis of FY2014. Generally, the data show that the number of active FACA committees has remained relatively stable over time, hovering around 1,000 committees in any given fiscal year. The Department of Health and Human Services consistently operates the most federal advisory committees, with 264 active committees in FY2014. The Department of Agriculture had the second most active committees in FY2014 with 166. In any given year, around half of the active FACA committees were required to be established by statute. In FY2014, Congress established 10 new FACA committees by statute.

Generally, around 70,000 people serve as members on FACA committees and subcommittees in any given year. In FY2014, 68,179 members served. In FY2014, 825 federal advisory committees held 7,173 meetings and cost more than $334 million to operate. The report provides an in-depth examination of FACA committee operations, using the data collected by GSA. The report concludes by providing a list of policy options that Congress can consider when deliberating current or future legislation to amend FACA….(More)”
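Mechanically, the figures CRS reports (active committees per agency in a given fiscal year) amount to a group-and-count over FACA Database records. The records below are invented samples, and the real GSA database carries many more fields per committee:

```python
from collections import Counter

# Invented sample records: (fiscal_year, agency, committee_name)
records = [
    (2014, "HHS",  "Advisory Committee on Immunization Practices"),
    (2014, "HHS",  "National Cancer Advisory Board"),
    (2014, "USDA", "Fruit and Vegetable Industry Advisory Committee"),
    (2013, "HHS",  "National Cancer Advisory Board"),
]

def committees_per_agency(records, fiscal_year):
    """Count active committees per agency for one fiscal year."""
    return Counter(agency for fy, agency, _ in records if fy == fiscal_year)

counts = committees_per_agency(records, 2014)
```

Run against the full database, the same one-liner reproduces rankings like HHS's 264 active committees in FY2014.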

Interdisciplinary Perspectives on Trust


Book edited by Shockley, E., Neal, T.M.S., PytlikZillig, L.M., and Bornstein, B.H.:  “This timely collection explores trust research from many angles while ably demonstrating the potential of cross-discipline collaboration to deepen our understanding of institutional trust. Citing, among other things, current breakdowns of trust in prominent institutions, the book presents a multilevel model identifying universal aspects of trust as well as domain- and context-specific variations deserving further study. Contributors analyze similarities and differences in trust across public domains from politics and policing to medicine and science, and across languages and nations. Innovative strategies for measuring and assessing trust also shed new light on this essentially human behavior.

Highlights of the coverage:

  • Consensus on conceptualizations and definitions of trust: are we there yet?
  • Differentiating between trust and legitimacy in public attitudes towards legal authority.
  • Examining the relationship between interpersonal and institutional trust in political and health care contexts.
  • Trust as a multilevel phenomenon across contexts.
  • Institutional trust across cultures.
  • The “dark side” of institutional trust….(More)”

Open Data Impact: How Zillow Uses Open Data to Level the Playing Field for Consumers


Daniel Castro at US Dept of Commerce: “In the mid-2000s, several online data firms began to integrate real estate data with national maps to make the data more accessible for consumers. Of these firms, Zillow was the most effective at attracting users by rapidly growing its database, thanks in large part to open data. Zillow’s success is based, in part, on its ability to create tailored products that blend multiple data sources to answer customers’ questions about the housing market. Zillow’s platform lets customers easily compare neighborhoods and conduct thorough real estate searches through a single portal. This ensures a level playing field of information for home buyers, sellers and real estate professionals.

The system empowers consumers by providing them all the information needed to make well-informed decisions about buying or renting a home. For example, information from the Census Bureau’s American Community Survey helps answer people’s questions about what kind of housing they can afford in any U.S. market. Zillow also creates market analysis reports, which inform consumers about whether it is a good time to buy or sell, how an individual property’s value is likely to fluctuate over time, or whether it is better to rent or to own in certain markets. These reports can even show which neighborhoods are the top buyers’ or sellers’ markets in a given city. Zillow uses a wide range of government data, not just from the Census Bureau, to produce economic analyses and products it then freely provides to the public.

In addition to creating reports from synthesized data, Zillow has made a conscious effort to make raw data more usable. It has combined rental, mortgage, and other data into granular metrics on individual neighborhoods and zip codes. For example, the “Breakeven Horizon” is a metric that gives users a snapshot of how long they would need to own a home in a given area for the accrued cost of buying to be less than renting. Zillow creates this by comparing the up-front costs of buying a home versus the amount of interest that money could generate, and then analyzing how median rents and home values are likely to fluctuate, affecting both values. By creating metrics, rankings, and indices, Zillow makes raw or difficult-to-quantify data readily accessible to the public.
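A metric in the spirit of the Breakeven Horizon can be sketched as a year-by-year comparison of cumulative costs. This is a hypothetical illustration with deliberately simple, made-up assumptions (interest-only mortgage, flat growth rates); it is not Zillow's actual methodology:

```python
def breakeven_horizon_years(home_price, annual_rent,
                            down_payment_rate=0.20, closing_cost_rate=0.03,
                            mortgage_rate=0.04, maintenance_rate=0.01,
                            rent_growth=0.03, appreciation=0.02,
                            max_years=30):
    """First year in which the cumulative net cost of owning drops below
    the cumulative cost of renting. Returns None if breakeven never
    happens within max_years."""
    upfront = home_price * (down_payment_rate + closing_cost_rate)
    loan = home_price * (1 - down_payment_rate)
    cum_rent = 0.0
    cum_own = upfront          # cash out the door so far for the owner
    rent = annual_rent
    value = home_price
    for year in range(1, max_years + 1):
        cum_rent += rent
        rent *= 1 + rent_growth
        cum_own += loan * mortgage_rate + home_price * maintenance_rate
        value *= 1 + appreciation
        # the owner's outlays are offset by equity gained via appreciation
        if cum_own - (value - home_price) < cum_rent:
            return year
    return None

# e.g. a $300k home versus $15k/year rent under these assumptions
horizon = breakeven_horizon_years(300_000, 15_000)
```

The published metric folds in many more inputs (taxes, local rent and value forecasts, forgone investment returns), but the shape of the calculation is the same: find the first year the buying line crosses below the renting line.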

While real estate agents can be instrumental in the process of finding a new home or selling an old one, Zillow and other platforms add value by connecting consumers to a wealth of data, some of which may have been accessible before but was too cumbersome for the average user. Not only does this allow buyers and sellers to make more informed decisions about real estate, but it also helps to balance the share of knowledge. Buyers have more information than ever before on available properties, their valuations for specific neighborhoods, and how those valuations have changed in relation to larger markets. Sellers can use the same types of information to evaluate offers they receive, or decide whether to list their home in the first place. The success that Zillow and other companies like it have achieved in the real estate market is a testament to how effective they have been in harnessing data to address consumers’ needs and it is a marvelous example of the power of open data….(More)”

Smart Citizens, Smarter State


Book by Beth Simone Noveck (TheGovLab): “Government “of the people, by the people, for the people” expresses an ideal that resonates in all democracies. Yet poll after poll reveals deep distrust of institutions that seem to have left “the people” out of the equation. Government bureaucracies that are supposed to solve critical problems on their own are a troublesome outgrowth of the professionalization of public life in the industrial age. They are especially ill-suited to confronting today’s complex challenges. Offering a far-reaching program for innovation, Smart Citizens, Smarter State suggests that public decision-making could be more effective and legitimate if our institutions knew how to use technology to leverage citizens’ expertise.

Drawing on a wide range of disciplines and practical examples from her work as an adviser to governments on innovation, Noveck explores how to create more open and collaborative institutions. She puts forward a profound new vision for participatory democracy rooted not in the paltry act of occasional voting or the serendipity of crowdsourcing, but in people’s knowledge and know-how.”

Check out http://smarterstate.org/