Crowdsourcing Mobile App Takes the Globe’s Economic Pulse


Tom Simonite in MIT Technology Review: “In early September, news outlets reported that the price of onions in India had suddenly spiked nearly 300 percent over prices a year before. Analysts warned that the jump in price for this food staple could signal an impending economic crisis, and the Reserve Bank of India quickly raised interest rates.
A startup company called Premise might have helped make the response to India’s onion crisis timelier. As part of a novel approach to tracking the global economy from the bottom up, the company has a daily feed of onion prices from stores around India. More than 700 people in cities around the globe use a mobile app to log the prices of key products in local stores each day.

Premise’s cofounder David Soloff says it’s a valuable way to take the pulse of economies around the world, especially since stores frequently update their prices in response to economic pressures such as wholesale costs and consumer confidence. “All this information is hiding in plain sight on store shelves,” he says, “but there’s no way of capturing and aggregating it in any meaningful way.”
That information could provide a quick way to track and even predict inflation measures such as the U.S. Consumer Price Index. Inflation figures influence the financial industry and are used to set governments’ monetary and fiscal policy, but they are typically updated only once a month. Soloff says Premise’s analyses have shown that for some economies, the data the company collects can reliably predict monthly inflation figures four to six weeks in advance. “You don’t look at the weather forecast once a month,” he says….
Premise’s data may have other uses outside the financial industry. As part of a United Nations program called Global Pulse, MIT economist Alberto Cavallo and PriceStats, a company founded after financial professionals began relying on data from the Billion Prices Project, an ongoing academic price-indexing effort that Cavallo cofounded, devised bread price indexes for several Latin American countries. Such indexes typically predict street prices and help governments and NGOs spot emerging food crises. Premise’s data could be used in the same way. The information could also be used to monitor areas of the world, such as Africa, where tracking online prices is unreliable, he says.”
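Premise has not published its methodology, but the basic mechanics of turning crowd-logged shelf prices into a daily index are easy to sketch. The snippet below is a minimal illustration with made-up observations and a simple median-based index; it is not Premise’s actual pipeline.

```python
from collections import defaultdict
from statistics import median

# Hypothetical contributor logs: (date, city, product, price).
observations = [
    ("2013-09-01", "Mumbai", "onions_1kg", 55.0),
    ("2013-09-01", "Delhi",  "onions_1kg", 60.0),
    ("2013-09-02", "Mumbai", "onions_1kg", 66.0),
    ("2013-09-02", "Delhi",  "onions_1kg", 72.0),
]

def daily_index(observations, product, base_date):
    """Median price per day for one product, rescaled so that base_date = 100."""
    by_date = defaultdict(list)
    for date, _city, prod, price in observations:
        if prod == product:
            by_date[date].append(price)
    medians = {date: median(prices) for date, prices in by_date.items()}
    base = medians[base_date]
    return {date: round(100.0 * m / base, 1) for date, m in sorted(medians.items())}

print(daily_index(observations, "onions_1kg", base_date="2013-09-01"))
# {'2013-09-01': 100.0, '2013-09-02': 120.0}
```

A median rather than a mean is used here so that a single mistyped price from one contributor does not move the whole index.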

Free Software Ties the Internet of Things Together


Rachel Metz in MIT Technology Review: “OpenRemote is an open-source Internet of Things platform that could help spur smarter homes and cities.
If you buy several Internet-connected home gadgets—say, a “smart” thermostat, “smart” door lock, and “smart” window blinds—you’ll likely have to control each one with a separate app, meaning each exists in its own little silo.
That’s not how Elier Ramirez does it. In his home, an iPad app controls his lights, ceiling fans, and TV and stereo. Pressing a single button within the app can shut off all his lights and gadgets when he leaves.
Ramirez can tap a lamp in an image to turn an actual lamp off and on in his apartment, and at the same time he’ll see the picture on the tablet’s screen go dark or become illuminated. Ramirez also set up a presence-sensing feature that uses his cell phone to determine if he’s home (it checks whether or not he has connected to his home Wi-Fi network). This can automatically turn on the lights if he’s there. Ramirez runs the whole setup from a small computer in his home.
The software behind all this interconnection comes from a company called OpenRemote, which is plugging away on an open-source software platform for linking Internet-connected gadgets, making it easier to control all kinds of smart home devices, regardless of who made them. And it makes it easy to automate actions like lowering your connected window blinds if the temperature sensed in your living room goes above 75 degrees….
OpenRemote also sees a moneymaking opportunity beyond the home in providing its software to cities, which are becoming increasingly interested in using technology for everything from communicating with citizens to monitoring traffic. Last year, OpenRemote conducted a small test in Eindhoven, in hopes of using automation and crowdsourcing to monitor a city. This included people-tracking with cameras, sound-level tracking, social-media monitoring, and an app that people in the area could use to rate what the atmosphere was like. The company is currently working on a larger-scale project in Eindhoven, Kil says. “If you put four walls around a city, it’s a big room, if you know what I mean,” he says.”
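The blinds-and-thermostat rule quoted above maps naturally onto a few lines of code. The sketch below is illustrative only: OpenRemote configures rules through its own designer tooling and device bindings, and the classes here are hypothetical stand-ins.

```python
# Hypothetical device bindings; a real integration would wrap actual hardware.
class Thermostat:
    def __init__(self, temperature_f: float):
        self.temperature_f = temperature_f

class Blinds:
    def __init__(self):
        self.lowered = False

    def lower(self):
        self.lowered = True
        print("Lowering the blinds")

def apply_rule(thermostat: Thermostat, blinds: Blinds, threshold_f: float = 75.0):
    """Rule: if the sensed living-room temperature exceeds the threshold, lower the blinds."""
    if thermostat.temperature_f > threshold_f and not blinds.lowered:
        blinds.lower()

apply_rule(Thermostat(temperature_f=78.0), Blinds())  # prints "Lowering the blinds"
```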

Seven Principles for Big Data and Resilience Projects


PopTech & Rockefeller Bellagio Fellows: “The following is a draft “Code of Conduct” that seeks to provide guidance on best practices for resilience building projects that leverage Big Data and Advanced Computing. These seven core principles serve to guide data projects to ensure they are socially just, encourage local wealth and skill creation, require informed consent, and remain maintainable over long time frames. This document is a work in progress, so we very much welcome feedback. Our aim is not to enforce these principles on others but rather to hold ourselves accountable and in the process encourage others to do the same. Initial versions of this draft were written during the 2013 PopTech & Rockefeller Foundation workshop in Bellagio, August 2013.
Open Source Data Tools – Wherever possible, data analytics and manipulation tools should be open source, architecture independent and broadly prevalent (R, Python, etc.). Open source, hackable tools are generative, and building generative capacity is an important element of resilience….
Transparent Data Infrastructure – Infrastructure for data collection and storage should operate based on transparent standards to maximize the number of users that can interact with the infrastructure. Data infrastructure should strive for extensive built-in documentation and easy access. Data is only as useful as the data scientist’s understanding of how it was collected…
Develop and Maintain Local Skills – Make “Data Literacy” more widespread. Leverage local data labor and build on existing skills. The key and most constrained ingredient in effective data solutions remains human skill and knowledge, which needs to be retained locally. In doing so, consider cultural issues and language. Catalyze the next generation of data scientists and generate new required skills in the cities where the data is being collected…
Local Data Ownership – Use Creative Commons and licenses that state that data is not to be used for commercial purposes. The community directly owns the data it generates, along with the learning algorithms (machine learning classifiers) and derivatives. Strong data protection protocols need to be in place to protect identities and personally identifying information…
Ethical Data Sharing – Adopt existing data sharing protocols like the ICRC’s (2013). Permission for sharing is essential. How the data will be used should be clearly articulated. An opt-in approach should be the preference wherever possible, and the ability for individuals to remove themselves from a data set after it has been collected must always be an option. Projects should always explicitly state which third parties will get access to data, if any, so that it is clear who will be able to access and use the data…
Right Not To Be Sensed – Local communities have a right not to be sensed. Large-scale city sensing projects must have a clear framework for how people are able to be involved or choose not to participate. All too often, sensing projects are established without any ethical framework or any commitment to informed consent. It is essential that the collection of any sensitive data, from social and mobile data to video and photographic records of houses, streets and individuals, is done with full public knowledge, community discussion, and the ability to opt out…
Learning from Mistakes – Big Data and Resilience projects need to be open to face, report, and discuss failures. Big Data technology is still very much in a learning phase. Failure, and the learning and insights that result from it, should be accepted and appreciated. Without admitting what does not work, we are not learning effectively as a community. Quality control and assessment for data-driven solutions is notably harder than comparable efforts in other technology fields. The uncertainty about the quality of a solution is created by the uncertainty inherent in data…”
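Two of these principles, ethical sharing with opt-out and the protection of personally identifying information, translate directly into code. Below is a hedged sketch of a pre-release step: records of people who have withdrawn consent are dropped, and the direct identifier is replaced with a salted hash. The field names are hypothetical, and salted hashing is pseudonymization rather than full anonymization.

```python
import hashlib

def prepare_for_sharing(records, opted_out_ids, salt):
    """Drop opted-out records, then pseudonymize direct identifiers.

    `records` is a list of dicts keyed by a hypothetical 'person_id' field;
    this illustrates the opt-out principle, not any specific ICRC protocol.
    """
    shared = []
    for record in records:
        if record["person_id"] in opted_out_ids:
            continue  # honor the right to be removed after collection
        cleaned = dict(record)
        digest = hashlib.sha256((salt + str(record["person_id"])).encode()).hexdigest()
        cleaned["person_id"] = digest[:12]  # salted one-way token, not the raw identifier
        shared.append(cleaned)
    return shared

records = [
    {"person_id": "A17", "bread_price": 1.20},
    {"person_id": "B42", "bread_price": 1.40},
]
print(prepare_for_sharing(records, opted_out_ids={"B42"}, salt="project-specific-salt"))
# [{'person_id': '<12-char token>', 'bread_price': 1.2}]  -- only the consenting record survives
```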

Five Ways to Make Government Procurement Better


Mark Headd at Civic Innovations: “Nothing in recent memory has focused attention on the need for wholesale reform of the government IT procurement system more than the troubled launch of healthcare.gov.
There have been myriad blog posts, stories, and articles written in the last few weeks detailing all of the problems that led to the ignominious launch of the website meant to allow people to sign up for health care coverage.
Though the details of this high-profile flop are in the latest headlines, the underlying cause has been talked about many times before – the process by which governments contract with outside parties to obtain IT services is broken…
With all of this in mind, here are – in no particular order – five suggested changes that can be adopted to improve the government procurement process.
Raise the threshold on simplified / streamlined procurement
Many governments use a separate, more streamlined process for smaller projects that do not require a full RFP (in the City of Philadelphia, professional services projects that do not exceed $32,000 annually go through this more streamlined bidding process). In Philadelphia, we’ve had great success in using these smaller projects to test new ideas and strategies for partnering with IT vendors. There is much we can learn from these experiments, and a modest increase to enable more experimentation would allow governments to gain valuable new insights.
Narrowing the focus of any enhanced thresholds for streamlined bidding to web-based projects would help mitigate risk and foster a quicker process for testing new ideas.
Identify clear standards for projects
Having a clear set of vendor-agnostic IT standards to use when developing RFPs and performing work can make a huge difference in how a project turns out. Clearly articulating standards for:

  • The various components that a system will use.
  • The environment in which it will be housed.
  • The testing it must undergo prior to final acceptance.

…can go a long way to reduce the risk and uncertainty inherent in IT projects.
It’s worth noting that most governments probably already have a set of IT standards that are usually made part of any IT solicitation. But these standards documents can quickly become out of date – they must undergo constant review and refinement. In addition, many of the people writing these standards may confuse a specific vendor product or platform with a true standard.
Require open source
Requiring that IT projects be open source during development or after completion can be an effective way to reduce risk on an IT project and enhance transparency. This is particularly true of web-based projects.
In addition, government RFPs should encourage the use of existing open source tools – leveraging existing software components that are in use in similar projects and maintained by an active community – to foster external participation by vendors and volunteers alike. When governments make the code behind their projects open source, they enable anyone who understands software development to help make them better.
Develop a more robust internal capacity for IT project management and implementation
Governments must find ways to develop the internal capacity for developing, implementing and managing technology projects.
Part of the reason that governments make use of a variety of different risk mitigation provisions in public bidding is that there is a lack of people in government with hands-on experience building or maintaining technology. There is a dearth of makers in government, and there is a direct relationship between the perceived risk that governments take on with new technology projects and the lack of experienced technologists working in government.
Governments need to find ways to develop a maker culture within their workforces and should prioritize recruitment from the local technology and civic hacking communities.
Make contracting, lobbying and campaign contribution data public as open data
One of the more disheartening revelations to come out of the analysis of the healthcare.gov implementation is that some of the firms that were awarded work as part of the project also spent non-trivial amounts of money on lobbying. It’s a good bet that this kind of thing happens at the state and local levels as well.
This can seriously undermine confidence in the bidding process, and may cause many smaller firms – who lack funds or interest in lobbying elected officials – to simply throw up their hands and walk away.
In the absence of statutory or regulatory changes to prevent this from happening, governments can enhance the transparency around the bidding process by working to ensure that all contracting data, as well as data listing publicly registered lobbyists and contributions to political campaigns, is open.
Ensuring that all prospective participants in the public bidding process have confidence that the process will be fair and transparent is essential to getting as many firms to participate as possible – including small firms more adept at agile software development methodologies. More bids typically equate to higher-quality proposals and lower prices.
None of the changes listed above will be easy, and governments are positioned differently in how well they may achieve any one of them. Nor do they represent the entire universe of things we can do to improve the system in the near term – these are items that I personally think are important and very achievable.
One thing that could help speed the adoption of these and other changes is the development of a robust communication framework between government contracting and IT professionals in different cities and different states. I think a “Municipal Procurement Academy” could go a long way toward achieving this.”

Democratic Reason: Politics, Collective Intelligence, and the Rule of the Many


New book by Hélène Landemore: “Individual decision making can often be wrong due to misinformation, impulses, or biases. Collective decision making, on the other hand, can be surprisingly accurate. In Democratic Reason, Hélène Landemore demonstrates that the very factors behind the superiority of collective decision making add up to a strong case for democracy. She shows that the processes and procedures of democratic decision making form a cognitive system that ensures that decisions taken by the many are more likely to be right than decisions taken by the few. Democracy as a form of government is therefore valuable not only because it is legitimate and just, but also because it is smart.
Landemore considers how the argument plays out with respect to two main mechanisms of democratic politics: inclusive deliberation and majority rule. In deliberative settings, the truth-tracking properties of deliberation are enhanced more by inclusiveness than by individual competence. Landemore explores this idea in the contexts of representative democracy and the selection of representatives. She also discusses several models for the “wisdom of crowds” channeled by majority rule, examining the trade-offs between inclusiveness and individual competence in voting. When inclusive deliberation and majority rule are combined, they beat less inclusive methods, in which one person or a small group decide. Democratic Reason thus establishes the superiority of democracy as a way of making decisions for the common good.”
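The best-known formal result behind this argument is the Condorcet Jury Theorem: if voters are independent and each is right with probability just over one half, the chance that a majority is right climbs toward certainty as the group grows. The small simulation below makes the inclusiveness-versus-competence trade-off concrete; it illustrates the standard theorem, not a model taken from the book.

```python
import random

def majority_correct_rate(n_voters, p_correct, trials=10_000, seed=1):
    """Estimate how often a majority of independent voters picks the correct option."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        correct_votes = sum(rng.random() < p_correct for _ in range(n_voters))
        if correct_votes > n_voters / 2:
            wins += 1
    return wins / trials

# A large group of modestly competent voters beats a small group of experts:
print(majority_correct_rate(n_voters=1001, p_correct=0.52))  # roughly 0.90
print(majority_correct_rate(n_voters=3, p_correct=0.70))     # roughly 0.78
```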

Connecting Grassroots and Government for Disaster Response


New Report by John Crowley for the Wilson Center: “Leaders in disaster response are finding it necessary to adapt to a new reality. Although community actions have always been the core of the recovery process, collective action from the grassroots has changed response operations in ways that few would have predicted. Using new tools that interconnect over expanding mobile networks, citizens can exchange information via maps and social media, then mobilize thousands of people to collect, analyze, and act on that information. Sometimes, community-sourced intelligence may be fresher and more accurate than the information given to the responders who provide aid…
Also see the companion report from our September 2012 workshop, written by Ryan Burns and Lea Shanley, as well as a series of videos from the workshop and podcasts with workshop participants.”

Democracy and Political Ignorance


Essay by Ilya Somin in the Cato Unbound special issue Is Smaller Government Smarter Government?: “Democracy is supposed to be rule of the people, by the people, and for the people. But in order to rule effectively, the people need political knowledge. If they know little or nothing about government, it becomes difficult to hold political leaders accountable for their performance. Unfortunately, public knowledge about politics is disturbingly low. In addition, the public also often does a poor job of evaluating the political information it does know. This state of affairs has persisted despite rising education levels, increased availability of information thanks to modern technology, and even rising IQ scores. It is mostly the result of rational behavior, not stupidity. Such widespread and persistent political ignorance and irrationality strengthens the case for limiting and decentralizing the power of government….
Political ignorance in America is deep and widespread. The current government shutdown fight provides some good examples. Although Obamacare is at the center of that fight and much other recent political controversy, 44 percent of the public do not even realize it is still the law. Some 80 percent, according to a recent Kaiser survey, say they have heard “nothing at all” or “only a little” about the controversial insurance exchanges that are a major part of the law….
Some people react to data like the above by thinking that the voters must be stupid. But political ignorance is actually rational for most of the public, including most smart people. If your only reason to follow politics is to be a better voter, that turns out not to be much of a reason at all. That is because there is very little chance that your vote will actually make a difference to the outcome of an election (about 1 in 60 million in a presidential race, for example).2 For most of us, it is rational to devote very little time to learning about politics, and instead focus on other activities that are more interesting or more likely to be useful. As former British Prime Minister Tony Blair puts it, “[t]he single hardest thing for a practising politician to understand is that most people, most of the time, don’t give politics a first thought all day long. Or if they do, it is with a sigh…. before going back to worrying about the kids, the parents, the mortgage, the boss, their friends, their weight, their health, sex and rock ‘n’ roll.”3 Most people don’t precisely calculate the odds that their vote will make a difference. But they probably have an intuitive sense that the chances are very small, and act accordingly.
In the book, I also consider why many rationally ignorant people often still bother to vote.4 The key factor is that voting is a lot cheaper and less time-consuming than studying political issues. For many, it is rational to take the time to vote, but without learning much about the issues at stake….
Political ignorance is far from the only factor that must be considered in deciding the appropriate size, scope, and centralization of government. For example, some large-scale issues, such as global warming, are simply too big to be effectively addressed by lower-level governments or private organizations. Democracy and Political Ignorance is not a complete theory of the proper role of government in society. But it does suggest that the problem of political ignorance should lead us to limit and decentralize government more than we would otherwise.”
See also: Ilya Somin, Democracy and Political Ignorance: Why Smaller Government Is Smarter (Stanford: Stanford University Press, 2013)
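The expected-value arithmetic behind “rational ignorance” is worth spelling out. In the sketch below, only the 1-in-60-million odds come from the essay; the personal stake and study time are hypothetical placeholders.

```python
# Back-of-the-envelope expected payoff of casting a better-informed vote.
p_decisive = 1 / 60_000_000        # essay's odds that one vote swings a presidential race
personal_stake = 10_000.0          # hypothetical dollar value to you of the better outcome
hours_of_study = 100.0             # hypothetical time needed to become well informed

expected_benefit = p_decisive * personal_stake
print(f"Expected benefit of an informed vote: ${expected_benefit:.6f}")   # about $0.000167
print(f"Expected benefit per study hour:      ${expected_benefit / hours_of_study:.9f}")
```

At those odds, even a large personal stake yields an expected return of a fraction of a cent, which is the essay’s point about why studying politics rarely pays off as an investment in better voting.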

Collaborative Internet Governance: Terms and Conditions of Analysis


New paper by Mathieu O’Neil in the Revue française d’études américaines special issue on Contested Internet Governance: “Online projects are communities of practice which attempt to bypass the hierarchies of everyday life and to create autonomous institutions and forms of organisation. A wealth of theoretical frameworks have been put forward to account for these networked actors’ capacity to communicate and self-organise. This article reviews terminology used in Internet research and assesses what it implies for the understanding of regulatory-oriented collective action. In terms of the environment in which interpersonal communication occurs, what differences does it make to speak of “public spheres” or of “public spaces”? In terms of social formations, of “organisations” or “networks”? And in terms of the diffusion of information over the global network, of “contagion” or “trajectories”? Selecting theoretical frames is a momentous decision for researchers, as it authorises or forbids the analysis of different types of behaviour and practices.”
Other papers on Internet Governance in the Revue:
Divina Frau-Meigs (Ed.), Conducting Research on the Internet and its Governance
The Internet and its Governance: A General Bibliography
Glossary of Key Terms and Notions about Internet Governance
Julia Pohle and Luciano Morganti, The Internet Corporation for Assigned Names and Numbers (ICANN): Origins, Stakes and Tensions
Francesca Musiani et al., Net Neutrality as an Internet Governance Issue: The Globalization of an American-Born Debate
Jeanette Hofmann, Narratives of Copyright Enforcement: The Upward Ratchet and the Sleeping Giant
Elizabeth Dubois and William H. Dutton, The Fifth Estate in Internet Governance: Collective Accountability of a Canadian Policy Initiative
Mathieu O’Neil, Collaborative Internet Governance: Terms and Conditions of Analysis
Peng Hwa Ang and Natalie Pang, Globalization of the Internet, Sovereignty or Democracy: The Trilemma of the Internet Governance Forum

Sir Tim Berners-Lee: The many meanings of Open


Sir Tim Berners-Lee: “I was recently asked to talk about the idea of “open”, and I realized the term is used in at least eight different ways. The distinct interpretations are all important in different but interlocking ways. Getting them confused leads to a lot of misunderstanding, so it’s good to review them all.
When we tease apart their meanings, we can understand more clearly which aspects of each are the most important. The first, one of the most important forms of openness for the Web, is its universality.
Universality – When I designed the Web protocols, I had already seen many networked information systems fail because they made some assumptions about the users – that they were using a particular type of computer for instance – or constrained the way they worked, such as forcing them to organize their data in a particular way, or to use a particular data format. The Web had to avoid these issues. The goal was that anyone should be able to publish anything on the Web and so it had to be universal in that it was independent of all these technical constraints, as well as language, character sets, and culture….
Open Standards
The actual design of the Web involved the creation of open standards – and getting people to agree to use them globally. The World Wide Web Consortium (W3C), of which I am the Director, helps create interoperable standards for Web technology, including HTML5, mobile Web, graphics, the Semantic Web of linked data, and Web accessibility. Any company can join and anyone can review and help create the specifications for the Web….
Open Web Platform (OWP)
W3C’s Open Web Platform is the name for a particular set of open standards which enable an exciting stage of Web computing. Standards such as HTML5, SVG, CSS, video, JavaScript, and others are advancing together so that programmes that once worked only on desktops, tablets, or phones can now work from within the browser itself. It has all the power of HTML5, like easily inserted video and, in the future, easily inserted conferences. It also features the APIs for accessing hardware and other capabilities on the device, such as a smartphone’s accelerometer, camera, and local storage. While native apps are limited to one platform, Web Apps can work on any platform….
Open Government through Open Data
In 2009, I resolved to encourage more use of data on the Web. Too many websites could generate nice reports as documents, but offered no way to access the underlying data to check and build on the results. In February that year I stood up in front of a TED audience and asked them for their data; I even got them to chant: “raw data now”. In April that year, I met with Gordon Brown, then Prime Minister of the UK, and with him began the UK Government’s ground-breaking work on Open Data. That same year President Barack Obama announced his commitment to the US Open Government Initiative. In 2010 I went back to TED and showed the audience some of what had been achieved, including OpenStreetMap’s role in relief efforts in Haiti….
Open Platform
While it’s not really a feature of the Web, a concern for a lot of people is whether they can choose which apps run on their own phone or computer. An Open Platform means having the right to install and write software on your computer or device. One motivation to close off a computing platform comes from a manufacturer wanting to allow you to experience their content on your machine without being able to store it or pass it on. Some systems are very closed, in that the user can only watch a movie or play a game, with no chance to copy anything or back it up. Some systems are very open, allowing users to take copies of files and run any application they like. Many systems fall in between, letting users pay for additional material or an experience…
Open Source
“Open Source” is another way “open” is used on the Web, one which has been and remains very important to the Web’s growth. It’s important to me that I can get at the source code of any software I’m using. If I can get at the source code, can I modify it? Can I distribute the modified code and run it on my machine? As Free Software Foundation founder Richard Stallman puts it, “free as in freedom rather than free as in beer”.
Open Access
Open Access is a Web-based movement specifically about free (as in beer) access to the body of academic learning. Governments, and therefore taxpayers, pay for research via grants but often the results of the research are kept in closed-access academic journals. The results are only available to those at big universities. The poor and those in remote rural areas cannot participate…
Open Internet and Net Neutrality
When we talk about keeping the internet free and open, we are often worried about blocking and spying. One of the ways in which we protect the Web is by ensuring Net Neutrality. Net Neutrality is about non-discrimination. Its principle is that if I pay to connect to the Net with a certain quality of service, and you pay to connect with that or a greater quality of service, then we can both communicate at the same level. This is important because it allows an open, fair market. It’s essential to an open, fair democracy. The alternative is a Web in which governments or large companies, or frequently a close association of the two, try to control the internet, with packets of information delivered in a way that discriminates for commercial or political reasons. Regimes of every sort spy on their citizens, deriving hugely accurate and detailed profiles of them and their intimate lives. Today, the battle is building.  The rights of individual people on the Web are being attacked, and at the moment only a few people really understand and realize what is going on.”

NEW Publication: “Reimagining Governance in Practice: Benchmarking British Columbia’s Citizen Engagement Efforts”


Over the last few years, the Government of British Columbia (BC), Canada has initiated a variety of practices and policies aimed at providing more legitimate and effective governance. Leveraging advances in technology, the BC Government has focused on changing how it engages with its citizens with the goal of optimizing the way it seeks input and develops and implements policy. The efforts are part of a broader trend among a wide variety of democratic governments to re-imagine public service and governance.
At the beginning of 2013, BC’s Ministry of Citizens’ Services and Open Government, now the Ministry of Technology, Innovation and Citizens’ Services, partnered with the GovLab to produce “Reimagining Governance in Practice: Benchmarking British Columbia’s Citizen Engagement Efforts.” The GovLab’s May 2013 report, made public today, makes clear that BC’s current practices to create a more open government, leverage citizen engagement to inform policy decisions, create new innovations, and provide improved public monitoring—though in many cases relatively new—are consistently among the strongest examples at either the provincial or national level.
According to Stefaan Verhulst, Chief of Research at the GovLab: “Our benchmarking study found that British Columbia’s various initiatives and experiments to create a more open and participatory governance culture have made it a leader in how to re-imagine governance. Leadership, along with the elimination of imperatives that may limit further experimentation, will be critical moving forward. And perhaps even more important, as with all initiatives to re-imagine governance worldwide, much more evaluation of what works, and why, will be needed to keep strengthening the value proposition behind the new practices and policies and provide proof-of-concept.”
See also our TheGovLab Blog.