The Climatologist’s Almanac


Clara Chaisson at onEarth: “Forget your weather app with its five- or even ten-day forecasts—a supercomputer at NASA has just provided us with high-resolution climate projections through the end of the century. The massive new 11-terabyte data set combines historical daily temperatures and precipitation measurements with climate simulations under two greenhouse gas emissions scenarios. The project spans from 1950 to 2100, but users can easily zero in on daily timescales for their own locales—which is precisely the point.

The projections are freely available on Amazon for all to see and plan by. The space agency hopes that developing nations and poorer communities that may not have any spare supercomputers lying around will use the info to predict and prepare for climate change. …(More)”

Architecting Transparency: Back to the Roots – and Forward to the Future?


Paper by Dieter Zinnbauer: “Where to go next in research and practice on information disclosure and institutional transparency? Where to learn and draw inspiration from? How about if we go back to the roots and embrace an original, material notion of transparency as the quality of a substance or element of being see-through? How about if we then explore how the deliberate use and assemblage of such physical transparency strategies in architecture and design connects to – or could productively connect to – the institutional, political notions of transparency that we are concerned with in our area of institutional or political transparency? Or, put more simply and zooming in on one core aspect of the conversation: what has the arrival of glass and its siblings done for democracy, and what can we still hope they will do for open, transparent governance now and in the future?

This paper embarks upon this exploratory journey in four steps. It starts out (section 2.1) by revisiting the historic relationship between architecture, design and the built environment on the one side and institutional ambitions for democracy, openness, transparency and collective governance on the other side. Quite surprisingly, it finds a very close and ancient relationship between the two. Physical and political transparency have through the centuries been joined at the hip, and this relationship – overlooked as it typically is – has persisted in very important ways in our contemporary institutions of governance. As a second step I seek to trace the major currents in the architectural debate and practice on transparency over the last century and ask three principal questions:

– How have architects as the master-designers of the built environment in theory, criticism and practice historically grappled with the concept of transparency? To what extent have they linked material notions and building strategies of transparency to political and social notions of transparency as tools for emancipation and empowerment? (section 2.2.)

– What is the status of transparency in architecture today and what is the degree of cross-fertilisation between physical and institutional/political transparency? (section 3)

– Where could a closer connection between material and political transparency lead us in terms of inspiring fresh experimentation and action in order to broaden the scope of available transparency tools and spawn fresh ideas and innovation? (section 4).

Along the way I will scan the fragmented empirical evidence base for the actual impact of physical transparency strategies and also flag interesting areas for future research. As it turns out, an obsession with material transparency in architecture and the built environment has evolved in parallel with, and in many ways predates, the rising popularity of transparency in political science and governance studies. There are surprising parallels in the hype-and-skepticism curve, common challenges, interesting learning experiences and a rich repertoire of ideas for cross-fertilisation and joint ideation that is waiting to be tapped. However, this will require finding ways to bridge the current disconnect between the physical and institutional transparency professions and moving beyond the current pessimism about the actual potential of physical transparency beyond empty gestures or deployment for surveillance, notions that seem to linger on both sides. But the analysis shows that this bridge-building could be an extremely worthwhile endeavor. Both the available empirical data and the ideas that even this first brief excursion into physical transparency has yielded bode well for embarking on this cross-disciplinary conversation about transparency. And as the essay also shows, help from three very unexpected corners might be on the way to re-ignite the spark for taking the physical dimension of transparency seriously again. Back to the roots has a bright future….(More)

Safecity: Combatting Sexual Violence Through Technology


Safecity, …. is a not-for-profit organization that provides a platform for people to share their personal stories of sexual harassment and abuse in public spaces. This data, which may be anonymous, gets aggregated as hot spots on a map indicating trends at a local level. The idea is to make this data useful for individuals, local communities and local administration for social and systemic change for safer cities. We launched on 26 Dec 2012 and since then have collected over 4000 stories from over 50 cities in India and Nepal.

How can Safecity help?
Safecity is a crowd map that converts these individual stories into data that is then plotted on a map. It is then easier to see trends at the location level (e.g. a street). The focus is taken away from the individual victim and instead we can focus on solving the problem at the local neighborhood level.

The Objectives:
• Create awareness of street harassment and abuse, and encourage people, especially women and victims of hate and LGBTQ crimes, to break their silence and report their personal experiences.
• Collate this information to showcase location-based trends.
• Make this information available and useful for individuals, local communities and local administration to solve the problem at the local level through urban planning aimed at addressing infrastructural deficits.
• Establish successful models of community engagement using crowdsourced data to solve civic and local issues.
• Reach out to women who do not have equal access to technology through our missed-dial facility, which lets them report any cases of abuse and harassment.

We wish to take this data forward to lobby for systemic change: urban planning and infrastructure, reforms in our law premised on gender equity, and social changes that loosen the shackles which otherwise do not allow us to live the way we want to, with the freedom we want, and with the rights that are fundamental to all of us. Having as many passionate, concerned and diverse genders on board will only build our momentum further.

We are trying to build a movement by collecting these reports through campaigns, workshops and awareness programs with schools, colleges, local communities and partners who share our vision. Crime against women remains rampant and largely unreported to this day. That silence needs to gain a voice, and the time is now. We are determined to highlight this serious social issue; we believe we are taking a step towards changing the way our society thinks and reacts, and we hope you are too. In time we hope it will lead to a safe and non-violent environment for all.

Safecity uses technology to document sexual harassment and abuse in public spaces in the following way. People can report incidents of sexual abuse and street harassment that they have experienced or witnessed. They can share solutions that can help avoid such situations and decide for themselves what works best for them, their geographic location or circumstances.

By allowing people to pin such incidents on a crowd-sourced map, we aim to let them highlight the “hotspots” of such activities. This accentuates the emerging trend in a particular area, enabling the citizens to acknowledge the problem, take personal precautions and devise a solution at the neighbourhood level.
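The pin-and-aggregate step described above can be illustrated with a toy grid-binning sketch. This is purely hypothetical code, not Safecity's actual implementation: it assumes each report carries a latitude/longitude pair and snaps reports to coarse grid cells whose counts become hotspot intensities.

```python
from collections import Counter

def bin_reports(reports, cell_size=0.01):
    """Aggregate (lat, lon) incident reports into grid cells.

    Each report is snapped to a cell roughly cell_size degrees wide
    (~1 km at 0.01 degrees); the count per cell is the "hotspot"
    intensity that would be shaded on the map.
    """
    counts = Counter()
    for lat, lon in reports:
        cell = (round(lat / cell_size) * cell_size,
                round(lon / cell_size) * cell_size)
        counts[cell] += 1
    return counts

# Two nearby reports (Mumbai) and one distant report (Delhi) — made-up points.
reports = [(19.0760, 72.8777), (19.0761, 72.8779), (28.6139, 77.2090)]
hotspots = bin_reports(reports)
# The two Mumbai reports fall into the same cell and form a small hotspot;
# the Delhi report stands alone.
```

Note the privacy property this buys for free: the map only ever exposes per-cell counts, never the individual reporter's exact location.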

Safecity believes in uniting millions of voices that can become a catalyst for change.

You can read the FAQs section for more information on how the data is used for public good. (More)”

Field experimenting in economics: Lessons learned for public policy


Robert Metcalfe at OUP Blog: “Do neighbourhoods matter to outcomes? Which classroom interventions improve educational attainment? How should we raise money to provide important and valued public goods? Do energy prices affect energy demand? How can we motivate people to become healthier, greener, and more cooperative? These are some of the most challenging questions policy-makers face. Academics have been trying to understand and uncover these important relationships for decades.

Many of the empirical tools available to economists to answer these questions do not allow causal relationships to be detected. Field experiments represent a relatively new methodological approach capable of measuring the causal links between variables. By overlaying carefully designed experimental treatments on real people performing tasks common to their daily lives, economists are able to answer interesting and policy-relevant questions that were previously intractable. Manipulation of market environments allows these economists to uncover the hidden motivations behind economic behaviour more generally. A central tenet of field experiments in the policy world is that governments should understand the actual behavioural responses of their citizens to changes in policies or interventions.

Field experiments represent a departure from laboratory experiments. Traditionally, laboratory experiments create experimental settings with tight control over the decision environment of undergraduate students. While these studies also allow researchers to make causal statements, policy-makers are often concerned that subjects in these experiments may behave differently in settings where they know they are being observed or when they are permitted to sort out of the market.

For example, you might expect a college student to contribute more to charity when she is scrutinized in a professor’s lab than when she can avoid the ask altogether. Field experiments allow researchers to make these causal statements in a setting that is more generalizable to the behaviour policy-makers are directly interested in.

To date, policy-makers have traditionally gathered relevant information and data by using focus groups, qualitative evidence, or observational data without a way to identify causal mechanisms. It is quite easy to elicit people’s intentions about how they would behave with respect to a new policy or intervention, but there is increasing evidence that people’s intentions are a poor guide to predicting their behaviour.

However, we are starting to see a small change in how governments seek to answer pertinent questions. For instance, the UK tax office (Her Majesty’s Revenue and Customs) now uses field experiments across some of its services to make more effective use of scarce taxpayer money. In the US, there are movements toward gathering more evidence from field experiments.

In the corporate world, experimenting is not new. Many of the current large online companies—such as Amazon, Facebook, Google, and Microsoft—are constantly using field experiments matched with big data to improve their products and deliver better services to their customers. More and more companies will use field experiments over time to help them better set prices, tailor advertising, provide a better customer journey to increase welfare, and employ more productive workers…(More).

See also Field Experiments in the Developed World: An Introduction (Oxford Review of Economic Policy)

CMS announces entrepreneurs and innovators to access Medicare data


Centers for Medicare and Medicaid Services Press Release: “…the acting Centers for Medicare & Medicaid Services (CMS) Administrator, Andy Slavitt, announced a new policy that for the first time will allow innovators and entrepreneurs to access CMS data, such as Medicare claims. As part of the Administration’s commitment to using data and information to drive transformation of the healthcare delivery system, CMS will allow innovators and entrepreneurs to conduct approved research that will ultimately improve care and provide better tools that should benefit health care consumers through a greater understanding of what the data says works best in health care. The data will not allow the patient’s identity to be determined, but will provide the identity of the providers of care. CMS will begin accepting innovator research requests in September 2015.

“Data is the essential ingredient to building a better, smarter, healthier system. Today’s announcement is aimed directly at shaking up health care innovation and setting a new standard for data transparency,” said acting CMS Administrator Andy Slavitt. “We expect a stream of new tools for beneficiaries and care providers that improve care and personalize decision-making.”

Innovators and entrepreneurs will access data via the CMS Virtual Research Data Center (VRDC) which provides access to granular CMS program data, including Medicare fee-for-service claims data, in an efficient and cost effective manner. Researchers working in the CMS VRDC have direct access to approved privacy-protected data files and are able to conduct their analysis within a secure CMS environment….

Examples of tools or products that innovators and entrepreneurs might develop include care management or predictive modeling tools, which could greatly benefit the healthcare system in the form of healthier people, better quality, or lower cost of care. Even though all data is privacy-protected, researchers will still not be allowed to remove patient-level data from the VRDC. They will only be able to download aggregated, privacy-protected reports and results to their own personal workstation. …(More)”

New ODI research shows open data reaching every sector of UK industry


ODI: “New research has been published today (1 June) by the Open Data Institute showing that open data is reaching every sector of UK industry.

In various forms, open data is being adopted by a wide variety of businesses – small and large, new and old, from right across the country. The findings from Open data means business: UK innovation across sectors and regions draw on 270 companies with a combined turnover of £92bn and over 500k employees, identified by the ODI as using, producing or investing in open data as part of their business. The project included desk research, surveys and interviews on the companies’ experiences.

Key findings from the research include:

  • Companies using open data come from many sectors; over 46% from outside the information and communication sector. These include finance & insurance, science & technology, business administration & support, arts & entertainment, health, retail, transportation, education and energy.
  • The most popular datasets for companies are geospatial/mapping data (57%), transport data (43%) and environment data (42%).
  • 39% of companies innovating with open data are over 10 years old, with some more than 25 years old, proving open data isn’t just for new digital startups.
  • ‘Micro-enterprises’ (businesses with fewer than 10 employees) represented 70% of survey respondents, demonstrating a thriving open data startup scene. These businesses are using it to create services, products and platforms. 8% of respondents were drawn from large companies of 251 or more employees….
  • The companies surveyed listed 25 different government sources for the data they use. Notably, Ordnance Survey data was cited most frequently, by 14% of the companies. The non-government source most commonly used was OpenStreetMap, an openly licenced map of the world created by volunteers….(More)

Navigating the Health Data Ecosystem


New book from O’Reilly Media, “The Six C’s: Understanding the Health Data Terrain in the Era of Precision Medicine”: “Data-driven technologies are now being adopted, developed, funded, and deployed throughout the health care market at an unprecedented scale. But, as this O’Reilly report reveals, health care innovation contains more hurdles and requires more finesse than many tech startups expect. By paying attention to the lessons from the report’s findings, innovation teams can better anticipate what they’ll face, and plan accordingly.

Simply put, teams looking to apply collective intelligence and “big data” platforms to health and health care problems often don’t appreciate the messy details of using and making sense of data in the heavily regulated hospital IT environment. Download this report today and learn how it helps prepare startups in six areas:

  1. Complexity: An enormous domain with noisy data not designed for machine consumption
  2. Computing: Lack of standard, interoperable schema for documenting human health in a digital format
  3. Context: Lack of critical contextual metadata for interpreting health data
  4. Culture: Startup difficulties in hospital ecosystems: why innovation can be a two-edged sword
  5. Contracts: Navigating the IRB, HIPAA, and EULA frameworks
  6. Commerce: The problem of how digital health startups get paid

This report represents the initial findings of a study funded by a grant from the Robert Wood Johnson Foundation. Subsequent reports will explore the results of three deep-dive projects the team pursued during the study. (More)”

Montreal plans to become a Smart City with free WiFi and open data


Ian Hardy at MobileSyrup: “Earlier this month, the Coderre Administration announced the Montreal Action Plan, which includes 70 projects that will turn Montreal into a “smart city.”

The total allocated budget of $23 million is broken down into 6 sections — listed below with the official description — and is targeted for completion by the end of 2017. Apart from ensuring a fast fiber network, “unleashing municipal data,” and the rollout of “intelligent transport systems” that will bring you real-time info on your subway/bus/car service, the city plans to deploy free WiFi.

According to the statement, Montreal will be deploying wireless access points in 750 locations to facilitate free public WiFi. The larger idea is to “enhance the experience of citizens, boost tourism and accelerate economic development of Montreal.”…

1. Public Wi-Fi: Deploy access points to extend coverage across the city, creating a harmonized experience and uniform performance across the network, to enhance the experience of citizens, boost tourism and accelerate the economic development of Montreal.

2. Very-high-speed, multiservice network: Adopt a telecommunications policy, create a one-stop shop for telecommunications, and integrate the telecommunications component into the charter of all major urban projects, so that all players in the Montreal community have access to a high-speed, multiservice fiber network that meets their current and future needs.

3. Smart-city economic niche: Create an environment facilitating the emergence of companies in the smart-city economic niche, multiply the sources of innovation for solving urban problems, and simplify doing business with the City, so that Montreal becomes a leader in smart-city innovation and accelerates its economic development.

4. Intelligent mobility: Make all mobility data available in real time, implement intelligent, intermodal and integrated transport systems, and deploy and support solutions designed to inform users and optimize mobility in real time across the entire territory.

5. Participatory democracy: Unleash municipal data, improve information management and governance, and adapt the means of citizen participation to make them accessible online, to improve access to the democratic process and consolidate a culture of transparency and accountability.

6. Digital public services: Make as many services as possible available across a multitude of digital channels, involve citizens in the development of services, and create opportunities for everyone to become familiar with their use, to provide access to municipal services 24/7 across multiple platforms….(More)”

Smart Cities, Smart Governments and Smart Citizens: A Brief Introduction


Paper by Gabriel Puron Cid et al in the International Journal of E-Planning Research (IJEPR): “Although the field of study surrounding the “smart city” is in an embryonic phase, the use of information and communication technologies (ICT) in urban settings is not new (Dameri and Rosenthal-Sabroux, 2014; Toh and Low, 1993; Tokmakoff and Billington, 1994). Since ancient times, cities and metropolitan areas have propelled social transformation and economic prosperity in many societies (Katz and Bradley, 2013). Many modern urban sites and metros have leveraged the success and competitiveness of ICTs (Caragliu, Del Bo and Nijkamp, 2011). At least in part, the recent growth of smart city initiatives can be attributed to the rapid adoption of mobile and sensor technologies, as well as the diversity of available Internet applications (Nam and Pardo, 2011; Oberti and Pavesi, 2013).

The effective use of technological innovations in urban sites has been embraced by the emergent term “smart city”, with a strong focus on improving living conditions, safeguarding the sustainability of the natural environment, and engaging with citizens more effectively and actively (Dameri and Rosenthal-Sabroux, 2014). Variously labeled smart city, digital city, or intelligent city, many of these initiatives have been introduced as strategies to improve the utilization of physical infrastructure (e.g., roads and utility grids), engage citizens in active local governance and decision making, foster sustainable growth, and help government officials learn and innovate as the environment changes….(More)”

The Hague Declaration on Knowledge Discovery in the Digital Age


The Hague Declaration: “New technologies are revolutionising the way humans can learn about the world and about themselves. These technologies are not only a means of dealing with Big Data, they are also a key to knowledge discovery in the digital age; and their power is predicated on the increasing availability of data itself. Factors such as increasing computing power, the growth of the web, and governmental commitment to open access to publicly-funded research are serving to increase the availability of facts, data and ideas.

However, current legislative frameworks in different legal jurisdictions may not be cast in a way which supports the introduction of new approaches to undertaking research, in particular content mining. Content mining is the process of deriving information from machine-readable material. It works by copying large quantities of material, extracting the data, and recombining it to identify patterns and trends.
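The copy–extract–recombine loop described here can be illustrated with a toy term-frequency sketch. This is a deliberately minimal, hypothetical example; real content-mining pipelines involve far richer extraction and statistics.

```python
import re
from collections import Counter

def mine_content(documents, top_n=3):
    """Minimal content-mining sketch: tokenize machine-readable text
    (the 'extract' step), pool tokens across documents (the 'recombine'
    step), and surface the most frequent terms as a crude pattern."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z]+", doc.lower())
        counts.update(tokens)
    return counts.most_common(top_n)

# Three invented snippets standing in for a large mined corpus.
docs = [
    "Open data drives research.",
    "Research mining reveals patterns in open data.",
    "Data mining at scale.",
]
trends = mine_content(docs)
# 'data' appears in all three documents, so it tops the list.
```

Note that even this trivial sketch requires making temporary copies of the full texts — exactly the step that copyright, patent, and database laws discussed below can restrict.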

At the same time, intellectual property laws from a time well before the advent of the web limit the power of digital content analysis techniques such as text and data mining (for text and data) or content mining (for computer analysis of content in all formats). These factors are also creating inequalities in access to knowledge discovery in the digital age. The legislation in question might be copyright law, law governing patents or database laws – all of which may restrict the ability of the user to perform detailed content analysis.

Researchers should have the freedom to analyse and pursue intellectual curiosity without fear of monitoring or repercussions. These freedoms must not be eroded in the digital environment. Likewise, ethics around the use of data and content mining continue to evolve in response to changing technology.

Computer analysis of content in all formats, that is content mining, enables access to undiscovered public knowledge and provides important insights across every aspect of our economic, social and cultural life. Content mining will also have a profound impact on understanding society and societal movements (for example, predicting political uprisings, analysing demographic changes). Use of such techniques has the potential to revolutionise the way research is performed – both academic and commercial….(More: Declaration (PDF); Infographic)”