Open Data Impact: How Zillow Uses Open Data to Level the Playing Field for Consumers


Daniel Castro at US Dept of Commerce: “In the mid-2000s, several online data firms began to integrate real estate data with national maps to make the data more accessible for consumers. Of these firms, Zillow was the most effective at attracting users by rapidly growing its database, thanks in large part to open data. Zillow’s success is based, in part, on its ability to create tailored products that blend multiple data sources to answer customers’ questions about the housing market. Zillow’s platform lets customers easily compare neighborhoods and conduct thorough real estate searches through a single portal. This ensures a level playing field of information for home buyers, sellers and real estate professionals.

The system empowers consumers by providing them all the information needed to make well-informed decisions about buying or renting a home. For example, information from the Census Bureau’s American Community Survey helps answer people’s questions about what kind of housing they can afford in any U.S. market. Zillow also creates market analysis reports, which inform consumers about whether it is a good time to buy or sell, how an individual property’s value is likely to fluctuate over time, or whether it is better to rent or to own in certain markets. These reports can even show which neighborhoods are the top buyers’ or sellers’ markets in a given city. Zillow uses a wide range of government data, not just from the Census Bureau, to produce economic analyses and products it then freely provides to the public.

In addition to creating reports from synthesized data, Zillow has made a conscious effort to make raw data more usable. It has combined rental, mortgage, and other data into granular metrics on individual neighborhoods and zip codes. For example, the “Breakeven Horizon” is a metric that gives users a snapshot of how long they would need to own a home in a given area for the accrued cost of buying to be less than renting. Zillow creates this by comparing the up-front costs of buying a home versus the amount of interest that money could generate, and then analyzing how median rents and home values are likely to fluctuate, affecting both values. By creating metrics, rankings, and indices, Zillow makes raw or difficult-to-quantify data readily accessible to the public.
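The buy-versus-rent arithmetic behind a metric like this can be sketched in a few lines. The model below is a deliberately simplified illustration (interest-only mortgage, flat growth rates, invented default figures), not Zillow’s actual methodology:

```python
def breakeven_horizon(price, monthly_rent, *, down=0.20, closing=0.04,
                      mortgage_rate=0.04, rent_growth=0.03,
                      appreciation=0.03, opportunity_rate=0.05,
                      max_years=30):
    """First year in which the cumulative net cost of buying drops below
    the cumulative cost of renting; None if it never does (toy model)."""
    loan = price * (1 - down)
    cum_rent = cum_interest = 0.0
    for year in range(1, max_years + 1):
        cum_rent += monthly_rent * 12 * (1 + rent_growth) ** (year - 1)
        cum_interest += loan * mortgage_rate  # interest-only simplification
        # What the down payment could have earned if invested instead.
        forgone = price * down * ((1 + opportunity_rate) ** year - 1)
        # Equity gained through price appreciation (selling costs ignored).
        equity_gain = price * ((1 + appreciation) ** year - 1)
        net_buy_cost = price * closing + cum_interest + forgone - equity_gain
        if net_buy_cost < cum_rent:
            return year
    return None
```

With these illustrative defaults, a $300,000 home against $1,000 monthly rent breaks even in year 2; raise the rent to $3,000 and buying wins in year 1. The real metric layers projected movements in median rents and home values onto this basic comparison.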

While real estate agents can be instrumental in the process of finding a new home or selling an old one, Zillow and other platforms add value by connecting consumers to a wealth of data, some of which may have been accessible before but was too cumbersome for the average user. Not only does this allow buyers and sellers to make more informed decisions about real estate, but it also helps to balance the share of knowledge. Buyers have more information than ever before on available properties, their valuations for specific neighborhoods, and how those valuations have changed in relation to larger markets. Sellers can use the same types of information to evaluate offers they receive, or decide whether to list their home in the first place. The success that Zillow and other companies like it have achieved in the real estate market is a testament to how effective they have been in harnessing data to address consumers’ needs and it is a marvelous example of the power of open data….(More)”

Good Governance by All Means


at Huffington Post: “Citizens today have higher expectations and demand effective solutions to everyday issues and challenges. From climate change to expedient postal services, governments are required to act with transparency and diligence. Public accountability demands that we, as public servants, act with almost no margin of error, using the most open and transparent means available to achieve our goals. The name of the game is simple: government efforts should focus on building stronger, better and healthier relationships with civil society. Nobody should be left behind when tailoring public policy. For the Mexican Government it is crystal clear that such an endeavor is no longer the State’s monopoly, and thus the pressing need for governments to use smarter and more efficient toolboxes, such as the one that the Open Government Partnership (OGP) provides. The buzzword is good governance by all means.

The High Level Segment of the 70th Session of the United Nations General Assembly was a milestone for the open government community. It allowed the 13 countries taking part in the OGP Steering Committee and several civil society organizations to endorse the Joint Declaration: Open Government for the Implementation of the 2030 Agenda for Sustainable Development. This declaration highlights the paramount importance of promoting the principles of open government (transparency, accountability, citizen participation and innovation) as key enablers of the Sustainable Development Goals. The Declaration particularly embraces Agenda 2030’s Goal 16 as a common target for all 66 OGP member countries. Our common goal is to continue building stronger institutions while weaving peaceful and inclusive societies. Our meeting in New York also allowed us to work with key players to develop the Open Data Charter that recognizes the value of having timely, comprehensive, accessible, and comparable data for the promotion of greater citizen engagement triggering development and innovation….(More)

Citizen-Generated Data and Governments: Towards a Collaborative Model


Civicus: “…we’re very happy today to launch “Citizen-Generated Data and Governments: Towards a Collaborative Model”.

This piece explores the idea that governments could host and publish citizen-generated data (CGD) themselves, and whether this could mean that data is applied more widely and in a more sustainable way. It was inspired by a recent meeting in Buenos Aires with Argentine civil society organizations and government representatives, hosted by the City of Buenos Aires Innovation and Open Government Lab (Laboratorio de innovación y Gobierno Abierto de la Ciudad de Buenos Aires).


The meeting was organized to explore how people within government think about citizen-generated data, and discuss what would be needed for them to consider it as a valid method of data generation. One of the most novel and exciting ideas that surfaced was the potential for government open data portals, such as that managed by the Buenos Aires Innovation Lab, to host and publish CGD.

We wrote this report to explore this issue further, looking at existing models of data collaboration and outlining our first thoughts on the benefits and obstacles this kind of model might face. We welcome feedback from those with deeper expertise in different aspects of citizen-generated data, and look forward to refining these thoughts in the future together with the broader community…(More)”

How open company data was used to uncover the powerful elite benefiting from Myanmar’s multi-billion dollar jade industry


OpenCorporates: “Today, we’re pleased to release a white paper on how OpenCorporates data was used to uncover the powerful elite benefiting from Myanmar’s multi-billion dollar jade industry, in a ground-breaking report from Global Witness. This investigation is an important case study on how open company data and identifiers are critical tools for uncovering corruption and the links between companies and the real people benefiting from them.

This white paper shows that it was critical not only that OpenCorporates held this information (much of it was removed from the official register during the investigation), but also that the data was machine-readable, available via an API (data service), and programmatically combinable with other data. Those qualities were essential to discovering the hidden connections between the key actors and the jade industry. Global Witness was able to analyse this data with the help of Open Knowledge.
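“Programmatically combinable” means, in practice, joining register records against other datasets in code rather than by hand. The sketch below illustrates the idea with invented records; the field names, company names, and licence numbers are illustrative assumptions, not the OpenCorporates schema or the actual investigation data:

```python
# Mock records shaped loosely like register extracts (all values invented).
officers = [
    {"company": "Alpha Gems Ltd", "officer": "U Aung Win"},
    {"company": "Beta Mining Co", "officer": "Daw Khin Myo"},
    {"company": "Gamma Trading",  "officer": "U Aung Win"},
]
licences = [
    {"licence": "JD-001", "holder": "Alpha Gems Ltd"},
    {"licence": "JD-002", "holder": "Gamma Trading"},
]

def normalise(name):
    """Crude name normalisation so records from different sources match."""
    return " ".join(name.lower().split())

# Index licence holders, then walk officer records to link people to licences.
holder_index = {normalise(l["holder"]): l["licence"] for l in licences}
links = {}
for rec in officers:
    licence = holder_index.get(normalise(rec["company"]))
    if licence:
        links.setdefault(rec["officer"], []).append(licence)

print(links)  # {'U Aung Win': ['JD-001', 'JD-002']}
```

Even this toy join surfaces the kind of connection the report describes: one individual behind two licensed companies. At investigation scale, doing the same by manually reading a paper register would be impractical, which is why machine-readable access matters.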

In this white paper, we make recommendations about the collection and publishing of statutory company information as open data to facilitate the creation of a hostile environment for corruption by providing a rigorous framework for public scrutiny and due diligence.

You can find the white paper here or read it on Medium.”

Using data to improve the environment


Sir Philip Dilley at the UK Environment Agency: “We live in a data rich world. As an engineer I know the power of data in the design and implementation of new urban spaces, iconic buildings and the infrastructure on which we all depend.

Data also is a powerful force in helping us to protect the environment and it can be mined from a variety of sources.

Since Victorian times, naturalists have collected data on the natural world. At the Environment Agency we continue to use local enthusiasts to track rainfall, which we use to feed into and support local projections of flood risk. But the advent of computing power and the Government’s move to open data means we can now all use data in a new and exciting way. The result is a more informed approach to improving the environment and protecting people.

For the last 17 years the Environment Agency has used lasers in planes to map and scan the English landscape from above to help us carry out work such as flood modelling (data now available for everyone to use). The same information has been used to track changing coastal habitats and to help us use the power of nature to adapt to a changing climate.

We’ve used our LIDAR height data together with aerial photography to inform the location and design of major coastal realignment sites. The award-winning Medmerry project, which created 183 hectares of new coastal habitat and protects 348 properties from flooding, was based on this data-led approach.

Those who live near rivers or who use them for sport and recreation know the importance of getting up to date information on river flows. We already provide online services to the public so they can see current warnings and river levels information, but opening our data means everyone can get bespoke information through one postcode or location search.
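A location search of this kind boils down to filtering live readings by distance and threshold. The station records and field names below are hypothetical stand-ins, not the Environment Agency’s actual data feeds:

```python
import math

# Hypothetical station records; real open flood-monitoring feeds expose
# similar information, but this schema is an illustrative assumption.
stations = [
    {"name": "Thames at Kingston", "lat": 51.415, "lon": -0.308,
     "level_m": 4.9, "typical_max_m": 5.2},
    {"name": "Thames at Windsor", "lat": 51.485, "lon": -0.605,
     "level_m": 5.6, "typical_max_m": 5.4},
]

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in km via the haversine formula."""
    rlat1, rlat2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = rlat2 - rlat1, math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(rlat1) * math.cos(rlat2) * math.sin(dlon / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

def nearby_alerts(lat, lon, radius_km=30):
    """Names of stations within radius whose level exceeds the typical max."""
    return [s["name"] for s in stations
            if km_between(lat, lon, s["lat"], s["lon"]) <= radius_km
            and s["level_m"] > s["typical_max_m"]]

print(nearby_alerts(51.48, -0.60))  # ['Thames at Windsor']
```

Resolving a postcode to coordinates and then running a query like this is essentially what the apps built on the opened data do on the user’s behalf.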

We are not the only ones seeing the power of environmental data. Data entrepreneurs know how to get accurate and easily accessible information to the public. And that means that we can all make informed choices. FloodAlerts provides a graphical representation of flood warnings and gives localised updates every 15 minutes, while the Flood Risk Finder app provides flood risk profiles on any property in England; both use data made available for public use by the Environment Agency.

Our bathing waters data directs those who like to swim, surf or paddle with vital information on water quality. The Safer Seas Service app alerts water users when water quality is reduced at beaches and our bathing water data is also used by the Marine Conservation Society’s Good Beach Guide….(More)”

Open data, open mind: Why you should share your company data with the world


Mark Samuels at ZDnet: “If information really is the lifeblood of modern organisations, then CIOs could create huge benefits from opening their data to new, creative pairs of eyes. Research from consultancy McKinsey suggests that seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data: that is, taking previously proprietary data (often starting with public sector data) and opening up access.

So, should your business consider giving outsiders access to insider information? ZDNet speaks to three experts.

More viewpoints can mean better results

Former Tullow Oil CIO Andrew Marks says debates about the potential openness of data in a private sector context are likely to be dominated by one major concern: information security.

“It’s a perfectly reasonable debate until people start thinking about privacy,” he says. “Putting information at risk, both in terms of customer data and competitive advantage, will be a risk too far for many senior executives.”

But what if CIOs could allay c-suite peers’ concerns and create a new opportunity? Marks points to the Goldcorp Challenge, which saw the mining specialist share its proprietary geological data to allow outside experts to pick likely spots for mining. The challenge, which included prize money of $575,000, helped identify more than 110 sites, 50 per cent of which were previously unknown to the company. The value of gold found through the competition exceeded $6bn. Marks wonders whether other firms could take similarly brave steps.
“There is a period of time when information is very sensitive,” he says. “Once the value of data starts to become finite, then it might be beneficial for businesses to open the doors and to let outsiders play with the information. That approach, in terms of gamification, might lead to the creation of new ideas and innovations.”…

Marks says these projects help prove that, when it comes to data, more is likely to mean different – and possibly better – results. “Whether using big data algorithms or the human touch, the more viewpoints you bring together, the more you increase the chances of success and reduce risk,” he says.

“There is, therefore, always likely to be value in seeking an alternative perspective. Opening access to data means your firm is going to get more ideas, but CIOs and other senior executives need to think very carefully about what such openness means for the business, and the potential benefits.”….Some leading firms are already taking steps towards openness. Take Christina Scott, chief product and information officer at the Financial Times, who says the media organisation has used data analysts to help push the benefits of information-led insight across the business.

Her team has democratised data in order to make sure that all parts of the organisation can get the information they need to complete their day-to-day jobs. Scott says the approach is best viewed as an open data strategy, but within the safe confines of the existing enterprise firewall. While the tactic is internally focused currently, Scott says the FT is keen to find ways to make the most of external talent in the future.

“We’re starting to consider how we might open data beyond the organisation, too,” she says. “Our data holds a lot of value and insight, including across the metadata we’ve created. So it would be great to think about how we could use that information in a more open way.” Part of the FT’s business includes trade-focused magazines. Scott says opening the data could provide new insight to its B2B customers across a range of sectors. In fact, the firm has already dabbled at a smaller scale.

“We’ve run hackathons, where we’ve exposed our APIs and given people the chance to come up with some new ideas,” she says. “But I don’t think we’ve done as much work on open data as we could. And I think that’s the direction in which better organisations are moving. They recognise that not all innovation is going to happen within the company.”…

CIO Omid Shiraji is another IT expert who recognises that there is a general move towards a more open society. Any executive who expects to work within a tightly defined enterprise firewall is living in cloud cuckoo land, he argues. More to the point, they will miss out on big advantages.
“If you can expose your sources to a range of developers, you can start to benefit from massive innovation,” he says. “You can get really big benefits from opening your data to external experts who can focus on areas that you don’t have the capability to develop internally.”

Many IT leaders would like to open data to outside experts, suggests Shiraji. For CIOs who are keen to expose their sources, he suggests letting small-scale developers take a close look at in-house data silos in an attempt to discover what relationships might exist and what advantages could accrue….(More)”

In post-earthquake Nepal, open data accountability


Deepa Rai at the World Bank blog: “….Following the earthquake, there was an overwhelming response from technocrats and data crunchers to use data visualizations for disaster risk assessment. The Government of Nepal made datasets available through its Disaster Data Portal and many organizations and individuals also pitched in and produced visual data platforms.
However, the use of open data has not been limited to disaster response. It was, and still is, instrumental in tracking how much funding has been received and how it’s being allocated. Through the use of open data, people can make their own analysis based on the information provided online.

Direct Relief, a not-for-profit company, has collected such information, helped gather data from the Prime Minister’s relief fund, and created infographics that have been useful for media and for immediate distribution on social platforms. MapJournal’s visual maps became vital during the Post Disaster Needs Assessment (PDNA) to assess and map areas where relief and reconstruction efforts were urgently needed.

[Image: Direct Relief medical relief partner locations, in the context of population affected and injuries by district. Photo credit: Data Relief Services]

Open data and accountability
However, the work of open data doesn’t end with relief distribution and disaster risk assessment. It is also hugely impactful in keeping track of how relief money is pledged, allocated, and spent. One such web application, openenet.net, is making this possible by aggregating post-disaster funding data from international and national sources into infographics. “The objective of the system,” reads the website, “is to ensure transparency and accountability of relief funds and resources to ensure that it reaches to targeted beneficiaries. We believe that transparency of funds in an open and accessible manner within a central platform is perhaps the first step to ensure effective mobilization of available resources.”
Four months after the earthquake, Nepali media have already started to report on aid spending — or the lack of it. This has been made possible by the use of open data from the Ministry of Home Affairs (MoHA) and illustrates how critical data is for the effective use of aid money.
Open data platforms emerging after the quakes have been crucial in questioning the accountability of aid provisions and ultimately resulting in more successful development outcomes….(More)”

Viscous Open Data: The Roles of Intermediaries in an Open Data Ecosystem


François van Schalkwyk, Michelle Willmers & Maurice McNaughton in Journal: “Information Technology for Development”: “Open data have the potential to improve the governance of universities as public institutions. In addition, open data are likely to increase the quality, efficacy and efficiency of the research and analysis of higher education systems by providing a shared empirical base for critical interrogation and reinterpretation. Drawing on research conducted by the Emerging Impacts of Open Data in Developing Countries project, and using an ecosystems approach, this research paper considers the supply, demand and use of open data as well as the roles of intermediaries in the governance of South African public higher education. It shows that government’s higher education database is a closed and isolated data source in the data ecosystem; and that the open data made available by government are inaccessible and rarely used. In contrast, government data made available by data intermediaries in the ecosystem are being used by key stakeholders. Intermediaries are found to play several important roles in the ecosystem: (i) they increase the accessibility and utility of data; (ii) they may assume the role of a “keystone species” in a data ecosystem; and (iii) they have the potential to democratize the impacts and use of open data. The article concludes that despite poor data provision by government, the public university governance open data ecosystem has evolved because intermediaries in the ecosystem have reduced the viscosity of government data. Further increasing the fluidity of government open data will improve access and ensure the sustainability of open data supply in the ecosystem….(More)”

Open Data Charter


International Open Data Charter: “Open data sits at the heart of a global movement with the potential to generate significant social and economic benefits around the world. Through the articulation and adoption of common principles in support of open data, governments can work towards enabling more just and prosperous societies.

In July 2013, G8 leaders signed the G8 Open Data Charter, which outlined a set of five core open data principles. Many nations and open government advocates welcomed the G8 Charter, but there was a general sense that the principles could be refined and improved to support broader global adoption of open data principles. In the months following, a number of multinational groups initiated their own activities to establish more inclusive and representative open data principles, including the Open Government Partnership’s (OGP) Open Data Working Group….

During 2015, open data experts from governments, multilateral organizations, civil society, and the private sector worked together to develop an international Open Data Charter, with six principles for the release of data:

  1. Open by Default;
  2. Timely and Comprehensive;
  3. Accessible and Usable;
  4. Comparable and Interoperable;
  5. For Improved Governance and Citizen Engagement; and
  6. For Inclusive Development and Innovation….

Next Steps

  1. Promote adoption of the Charter.
  2. Continue to bring together a diverse, inclusive group of stakeholders to engage in the process of adoption of the international Open Data Charter.
  3. Develop a governance model for the ongoing management of the Charter, setting out the roles and responsibilities of a Charter partnership, and its working groups in the process of developing supporting resources, consultations, promotion, adoption, and oversight.
  4. Continue development of and consultation on supporting Charter guides, documents and tools, which will be brought together in a searchable, online Resource Centre…(More)”

 

Datafication and empowerment: How the open data movement re-articulates notions of democracy, participation, and journalism


Stefan Baack at Big Data and Society: “This article shows how activists in the open data movement re-articulate notions of democracy, participation, and journalism by applying practices and values from open source culture to the creation and use of data. Focusing on the Open Knowledge Foundation Germany and drawing from a combination of interviews and content analysis, it argues that this process leads activists to develop new rationalities around datafication that can support the agency of datafied publics. Three modulations of open source are identified: First, by regarding data as a prerequisite for generating knowledge, activists transform the sharing of source code to include the sharing of raw data. Sharing raw data should break the interpretative monopoly of governments and would allow people to make their own interpretation of data about public issues. Second, activists connect this idea to an open and flexible form of representative democracy by applying the open source model of participation to political participation. Third, activists acknowledge that intermediaries are necessary to make raw data accessible to the public. This leads them to an interest in transforming journalism to become an intermediary in this sense. At the same time, they try to act as intermediaries themselves and develop civic technologies to put their ideas into practice. The article concludes with suggesting that the practices and ideas of open data activists are relevant because they illustrate the connection between datafication and open source culture and help to understand how datafication might support the agency of publics and actors outside big government and big business….(More)”