Five Ways to Make Government Procurement Better


Mark Headd at Civic Innovations: “Nothing in recent memory has focused attention on the need for wholesale reform of the government IT procurement system more than the troubled launch of healthcare.gov.
There have been myriad blog posts, stories, and articles written in the last few weeks detailing all of the problems that led to the ignominious launch of the website meant to allow people to sign up for health care coverage.
Though the details of this high profile flop are in the latest headlines, the underlying cause has been talked about many times before – the process by which governments contract with outside parties to obtain IT services is broken…
With all of this in mind, here are – in no particular order – five suggested changes that can be adopted to improve the government procurement process.
Raise the threshold on simplified / streamlined procurement
Many governments use a separate, more streamlined process for smaller projects that do not require a full RFP (in the City of Philadelphia, professional services projects that do not exceed $32,000 annually go through this more streamlined bidding process). In Philadelphia, we’ve had great success in using these smaller projects to test new ideas and strategies for partnering with IT vendors. There is much we can learn from these experiments, and a modest increase to enable more experimentation would allow governments to gain valuable new insights.
Narrowing the focus of any enhanced thresholds for streamlined bidding to web-based projects would help mitigate risk and foster a quicker process for testing new ideas.
Identify clear standards for projects
Having a clear set of vendor-agnostic IT standards to use when developing RFPs and in performing work can make a huge difference in how a project turns out. Clearly articulating standards for:

  • The various components that a system will use.
  • The environment in which it will be housed.
  • The testing it must undergo prior to final acceptance.

…can go a long way toward reducing the risk and uncertainty inherent in IT projects.
It’s worth noting that most governments probably already have a set of IT standards that are usually made part of any IT solicitation. But these standards documents can quickly become out of date – they must undergo constant review and refinement. In addition, many of the people writing these standards may confuse a specific vendor product or platform with a true standard.
Require open source
Requiring that IT projects be open source during development or after completion can be an effective way to reduce risk on an IT project and enhance transparency. This is particularly true of web-based projects.
In addition, government RFPs should encourage the use of existing open source tools – leveraging existing software components that are in use in similar projects and maintained by an active community – to foster external participation by vendors and volunteers alike. When governments make the code behind their projects open source, they enable anyone who understands software development to help make them better.
Develop a more robust internal capacity for IT project management and implementation
Governments must find ways to develop the internal capacity for developing, implementing and managing technology projects.
Part of the reason that governments make use of a variety of different risk mitigation provisions in public bidding is that there is a lack of people in government with hands-on experience building or maintaining technology. There is a dearth of makers in government, and there is a direct relationship between the perceived risk that governments take on with new technology projects and the lack of experienced technologists working in government.
Governments need to find ways to develop a maker culture within their workforces and should prioritize recruitment from the local technology and civic hacking communities.
Make contracting, lobbying and campaign contribution data public as open data
One of the more disheartening revelations to come out of the analysis of the healthcare.gov implementation is that some of the firms that were awarded work as part of the project also spent non-trivial amounts of money on lobbying. It’s a good bet that this kind of thing happens at the state and local level as well.
This can seriously undermine confidence in the bidding process, and may cause many smaller firms – who lack funds or interest in lobbying elected officials – to simply throw up their hands and walk away.
In the absence of statutory or regulatory changes to prevent this from happening, governments can enhance the transparency around the bidding process by working to ensure that all contracting data as well as data listing publicly registered lobbyists and contributions to political campaigns is open.
Ensuring that all prospective participants in the public bidding process have confidence that the process will be fair and transparent is essential to getting as many firms to participate as possible – including small firms more adept at agile software development methodologies. More bids typically equates to higher quality proposals and lower prices.
None of the changes listed above will be easy, and governments are positioned differently in how well they may achieve any one of them. Nor do they represent the entire universe of things we can do to improve the system in the near term – these are items that I personally think are important and very achievable.
One thing that could help speed the adoption of these and other changes is the development of a robust communication framework between government contracting and IT professionals in different cities and different states. I think a “Municipal Procurement Academy” could go a long way toward achieving this.”

Democracy and Political Ignorance


Essay by Ilya Somin in Special issue on Is Smaller Government Smarter Government? of Cato Unbound: “Democracy is supposed to be rule of the people, by the people, and for the people. But in order to rule effectively, the people need political knowledge. If they know little or nothing about government, it becomes difficult to hold political leaders accountable for their performance. Unfortunately, public knowledge about politics is disturbingly low. In addition, the public also often does a poor job of evaluating the political information they do know. This state of affairs has persisted despite rising education levels, increased availability of information thanks to modern technology, and even rising IQ scores. It is mostly the result of rational behavior, not stupidity. Such widespread and persistent political ignorance and irrationality strengthens the case for limiting and decentralizing the power of government….
Political ignorance in America is deep and widespread. The current government shutdown fight provides some good examples. Although Obamacare is at the center of that fight and much other recent political controversy, 44 percent of the public do not even realize it is still the law. Some 80 percent, according to a recent Kaiser survey, say they have heard “nothing at all” or “only a little” about the controversial insurance exchanges that are a major part of the law….
Some people react to data like the above by thinking that the voters must be stupid. But political ignorance is actually rational for most of the public, including most smart people. If your only reason to follow politics is to be a better voter, that turns out not to be much of a reason at all. That is because there is very little chance that your vote will actually make a difference to the outcome of an election (about 1 in 60 million in a presidential race, for example).2 For most of us, it is rational to devote very little time to learning about politics, and instead focus on other activities that are more interesting or more likely to be useful. As former British Prime Minister Tony Blair puts it, “[t]he single hardest thing for a practising politician to understand is that most people, most of the time, don’t give politics a first thought all day long. Or if they do, it is with a sigh…. before going back to worrying about the kids, the parents, the mortgage, the boss, their friends, their weight, their health, sex and rock ‘n’ roll.”3 Most people don’t precisely calculate the odds that their vote will make a difference. But they probably have an intuitive sense that the chances are very small, and act accordingly.
In the book, I also consider why many rationally ignorant people often still bother to vote.4 The key factor is that voting is a lot cheaper and less time-consuming than studying political issues. For many, it is rational to take the time to vote, but without learning much about the issues at stake….
Political ignorance is far from the only factor that must be considered in deciding the appropriate size, scope, and centralization of government. For example, some large-scale issues, such as global warming, are simply too big to be effectively addressed by lower-level governments or private organizations. Democracy and Political Ignorance is not a complete theory of the proper role of government in society. But it does suggest that the problem of political ignorance should lead us to limit and decentralize government more than we would otherwise.”
See also:  Ilya Somin, Democracy and Political Ignorance: Why Smaller Government is Smarter, (Stanford: Stanford University Press, 2013)

Sir Tim Berners-Lee: The many meanings of Open


Sir Tim Berners-Lee: “I was recently asked to talk about the idea of “open”, and I realized the term is used in at least eight different ways. The distinct interpretations are all important in different but interlocking ways. Getting them confused leads to a lot of misunderstanding, so it’s good to review them all.
When we tease apart their meanings, we can understand more clearly which aspects of each are the most important. The first, one of the most important forms of openness for the Web, is its universality.
Universality – When I designed the Web protocols, I had already seen many networked information systems fail because they made some assumptions about the users – that they were using a particular type of computer for instance – or constrained the way they worked, such as forcing them to organize their data in a particular way, or to use a particular data format. The Web had to avoid these issues. The goal was that anyone should be able to publish anything on the Web and so it had to be universal in that it was independent of all these technical constraints, as well as language, character sets, and culture….
Open Standards
The actual design of the Web involved the creation of open standards – and getting people to agree to use them globally. The World Wide Web Consortium (W3C), of which I am the Director, helps create interoperable standards for Web technology, including HTML5, mobile Web, graphics, the Semantic Web of linked data, and Web accessibility. Any company can join and anyone can review and help create the specifications for the Web….
Open Web Platform (OWP)
W3C’s Open Web Platform is the name for a particular set of open standards which enable an exciting stage of Web computing. Standards such as HTML5, SVG, CSS, video, JavaScript, and others are advancing together so that programmes that once worked only on desktops, tablets, or phones can now work from within the browser itself. It has all the power of HTML5, like easily-inserted video and, in the future, easily-inserted conferences. It also features the APIs for accessing hardware and other capabilities on the device, such as a smartphone’s accelerometer, camera, and local storage. While native apps are limited, Web Apps can work on any platform….
Open Government through Open Data
In 2009, I resolved to encourage more use of data on the Web. Too many websites could generate nice reports as documents, but had no way to access the data behind them to check and build on the results. In February that year I stood up in front of a TED audience and asked them for their data; I even got them to chant: “raw data now”. In April that year, I met with Gordon Brown, then Prime Minister of the UK, and with him began the UK Government’s ground-breaking work on Open Data. That same year President Barack Obama announced his commitment to the US Open Government Initiative. In 2010 I went back to TED and showed the audience some of what had been achieved, including OpenStreetMap’s role in relief efforts in Haiti….
Open Platform
While it’s not really a feature of the Web, a concern for a lot of people is whether they can choose which apps run on their own phone or computer. An Open Platform means having the right to install and write software on your computer or device. One motivation to close off a computing platform comes from a manufacturer wanting to allow you to experience their content on your machine without being able to store it or pass it on. Some systems are very closed, in that the user can only watch a movie or play a game, with no chance to copy anything or back it up. Some systems are very open, allowing users to take copies of files and run any application they like. Many systems fall in between, letting users pay for additional material or an experience…
Open Source
“Open Source” is another way “open” is used on the web, one which has been and is very important to the Web’s growth. It’s important to me that I can get at the source code of any software I’m using. If I can get at the source code, can I modify it? Can I distribute the modified code and run it on my machine?  As Free Software Foundation lead Richard Stallman puts it, “free as in freedom rather than free as in beer”.
Open Access
Open Access is a Web-based movement specifically about free (as in beer) access to the body of academic learning. Governments, and therefore taxpayers, pay for research via grants but often the results of the research are kept in closed-access academic journals. The results are only available to those at big universities. The poor and those in remote rural areas cannot participate…
Open Internet and Net Neutrality
When we talk about keeping the internet free and open, we are often worried about blocking and spying. One of the ways in which we protect the Web is by ensuring Net Neutrality. Net Neutrality is about non-discrimination. Its principle is that if I pay to connect to the Net with a certain quality of service, and you pay to connect with that or a greater quality of service, then we can both communicate at the same level. This is important because it allows an open, fair market. It’s essential to an open, fair democracy. The alternative is a Web in which governments or large companies, or frequently a close association of the two, try to control the internet, with packets of information delivered in a way that discriminates for commercial or political reasons. Regimes of every sort spy on their citizens, deriving hugely accurate and detailed profiles of them and their intimate lives. Today, the battle is building.  The rights of individual people on the Web are being attacked, and at the moment only a few people really understand and realize what is going on.”

Special issue of FirstMonday: "Making data — Big data and beyond"


Introduction by Rasmus Helles and Klaus Bruhn Jensen: “Data are widely understood as minimal units of information about the world, waiting to be found and collected by scholars and other analysts. With the recent prominence of ‘big data’ (Mayer–Schönberger and Cukier, 2013), the assumption that data are simply available and plentiful has become more pronounced in research as well as public debate. Challenging and reflecting on this assumption, the present special issue considers how data are made. The contributors take big data and other characteristic features of the digital media environment as an opportunity to revisit classic issues concerning data — big and small, fast and slow, experimental and naturalistic, quantitative and qualitative, found and made.
Data are made in a process involving multiple social agents — communicators, service providers, communication researchers, commercial stakeholders, government authorities, international regulators, and more. Data are made for a variety of scholarly and applied purposes, oriented by knowledge interests (Habermas, 1971). And data are processed and employed in a whole range of everyday and institutional contexts with political, economic, and cultural implications. Unfortunately, the process of generating the materials that come to function as data often remains opaque and certainly under–documented in the published research.
The following eight articles seek to open up some of the black boxes from which data can be seen to emerge. While diverse in their theoretical and topical focus, the articles generally approach the making of data as a process that is extended in time and across spatial and institutional settings. In the common culinary metaphor, data are repeatedly processed, rather than raw. Another shared point of attention is meta–data — the type of data that bear witness to when, where, and how other data such as Web searches, e–mail messages, and phone conversations are exchanged, and which have taken on new, strategic importance in digital media. Last but not least, several of the articles underline the extent to which the making of data as well as meta–data is conditioned — facilitated and constrained — by technological and institutional structures that are inherent in the very domain of analysis. Researchers increasingly depend on the practices and procedures of commercial entities such as Google and Facebook for their research materials, as illustrated by the pivotal role of application programming interfaces (API). Research on the Internet and other digital media also requires specialized tools of data management and analysis, calling, once again, for interdisciplinary competences and dialogues about ‘what the data show.’”
See Table of Contents

The move toward 'crowdsourcing' public safety


PhysOrg: “Earlier this year, Martin Dias, assistant professor in the D’Amore-McKim School of Business, presented research for the National Law Enforcement Telecommunications System in which he examined Nlets’ network and how its governance and technology helped enable inter-agency information sharing. This work builds on his research aimed at understanding design principles for these public safety “social networks” and other collaborative networks. We asked Dias to discuss how information sharing around public safety has evolved in recent years and the benefits and challenges of what he describes as “crowdsourcing public safety.” …

What is “crowdsourcing public safety” and why are public safety agencies moving toward this trend?
Crowdsourcing—the term coined by our own assistant professor of journalism Jeff Howe—involves taking a task or job traditionally performed by a distinct agent, or employee, and having that activity be executed by an “undefined, generally large group of people in an open call.” Crowdsourcing public safety involves engaging and enabling private citizens to assist public safety professionals in addressing natural disasters, terror attacks, organized crime incidents, and large-scale industrial accidents.
Public safety agencies have long recognized the need for citizen involvement. Tip lines and missing persons bulletins have been used to engage citizens for years, but with advances in mobile applications and big data analytics, the ability to receive, process, and make use of high-volume tips and leads makes crowdsourcing searches and investigations more feasible. You saw this in the FBI Boston Marathon Bombing web-based Tip Line. You see it in the “See Something Say Something” initiatives throughout the country. You see it in AMBER alerts or even remote search and rescue efforts. You even see it in more routine instances like Washington State’s HERO program to reduce traffic violations.
Have these efforts been successful, and what challenges remain?
There are a number of issues to overcome with regard to crowdsourcing public safety – such as maintaining privacy rights, ensuring data quality, and improving trust between citizens and officers. Controversies over the National Security Agency’s surveillance program and neighborhood watch programs – particularly the shooting death of teenager Trayvon Martin by neighborhood watch captain George Zimmerman – reflect some of these challenges. Research has not yet established a precise set of success criteria, but those efforts that appear successful at the moment have tended to be centered around a particular crisis incident – such as a specific attack or missing person. But as more crowdsourcing public safety mobile applications are developed, adoption and use is likely to increase. One trend to watch is whether national public safety programs are able to tap into the existing social networks of community-based responders like American Red Cross volunteers, Community Emergency Response Teams, and United Way mentors.
The move toward crowdsourcing is part of an overall trend toward improving community resilience, which refers to a system’s ability to bounce back after a crisis or disturbance. Stephen Flynn and his colleagues at Northeastern’s George J. Kostas Research Institute for Homeland Security are playing a key role in driving a national conversation in this area. Community resilience is inherently multi-disciplinary, so you see research being done regarding transportation infrastructure, social media use after a crisis event, and designing sustainable urban environments. Northeastern is a place where use-inspired research is addressing real-world problems. It will take a village to improve community resilience capabilities, and our institution is a vital part of thought leadership for that village.”
 

Twitter Datastream Used to Predict Flu Outbreaks


arXivBlog: “The rate at which people post flu-related tweets could become a powerful tool in the battle to spot epidemics earlier, say computer scientists.

Back in 2008, Google launched its now famous flu trends website. It works on the hypothesis that people make more flu-related search queries when they are suffering from the illness than when they are healthy. So counting the number of flu-related search queries in a given country gives a good indication of how the virus is spreading.
The predictions are pretty good. The data generally closely matches that produced by government organisations such as the Centers for Disease Control and Prevention (CDC) in the US. Indeed, in some cases, it has been able to spot an incipient epidemic more than a week before the CDC.
That’s been hugely important. An early indication that the disease is spreading in a population gives governments a welcome headstart in planning its response.
So an interesting question is whether other online services, in particular social media, can make similar or even better predictions. Today, we have an answer thanks to the work of Jiwei Li at Carnegie Mellon University in Pittsburgh, and Claire Cardie at Cornell University in New York State, who have been able to detect the early stages of an influenza outbreak using Twitter.
Their approach is in many ways similar to Google’s. They simply filter the Twitter datastream for flu-related tweets that are also geotagged. That allows them to create a map showing the distribution of these tweets and how it varies over time.
They also model the dynamics of the disease with some interesting subtleties. In the new model, a flu epidemic can be in one of four phases: non-epidemic phase, a rising phase where numbers are increasing, a stationary phase and a declining phase where numbers are falling.
The new approach uses an algorithm that attempts to spot the switch from one phase to another as early as possible. Indeed, Li and Cardie test the effectiveness of their approach using a Twitter dataset of 3.6 million flu-related tweets from about 1 million people in the US between June 2008 and June 2010…
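The pipeline described above – filter the tweet stream for geotagged flu-related messages, count them per day, and flag the switch into a rising phase as early as possible – can be illustrated with a toy sketch. Note that this is only a simplified illustration: Li and Cardie’s actual model is more sophisticated than a threshold check, and the keyword list, window size, and ratio below are assumptions for demonstration, not parameters from their paper.

```python
# Toy sketch of the two steps: count geotagged flu-related tweets per
# day, then flag a crude switch from the "non-epidemic" to the
# "rising" phase when a day's count jumps above a trailing baseline.
from collections import Counter

FLU_KEYWORDS = ("flu", "influenza", "fever")  # illustrative terms only

def count_flu_tweets(tweets):
    """tweets: iterable of (day, text, geotag) tuples; geotag may be None."""
    counts = Counter()
    for day, text, geotag in tweets:
        if geotag is None:
            continue  # keep only geotagged tweets, as the article describes
        lowered = text.lower()
        if any(keyword in lowered for keyword in FLU_KEYWORDS):
            counts[day] += 1
    return counts

def detect_rising_phase(daily_counts, window=3, ratio=1.5):
    """Return the first day whose count exceeds `ratio` times the mean
    of the previous `window` days -- a crude change-point proxy for the
    non-epidemic -> rising switch, not Li and Cardie's actual model."""
    days = sorted(daily_counts)
    for i in range(window, len(days)):
        baseline = sum(daily_counts[d] for d in days[i - window:i]) / window
        if baseline > 0 and daily_counts[days[i]] > ratio * baseline:
            return days[i]
    return None

# Small synthetic stream: (day, text, (lat, lon) or None)
tweets = [
    (1, "nice weather today", (40.4, -80.0)),
    (1, "got the flu, staying home", (40.4, -80.0)),
    (2, "flu season again", (40.4, -80.0)),
    (3, "fever and chills", (40.4, -80.0)),
    (4, "flu", (40.4, -80.0)),
    (4, "flu shot queue", (40.4, -80.0)),
    (5, "influenza everywhere", None),  # not geotagged: ignored
    (5, "flu", (40.4, -80.0)),
    (5, "the flu is awful", (40.4, -80.0)),
    (5, "flu again", (40.4, -80.0)),
]

counts = count_flu_tweets(tweets)
print(counts)                        # per-day geotagged flu-tweet counts
print(detect_rising_phase(counts))   # -> 4 (first day above the baseline)
```

On this synthetic data, days 1–3 form a flat baseline of one flu tweet per day, so day 4’s jump to two tweets trips the detector. A real system would of course need language handling, spam filtering, and normalization by overall tweet volume.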
Ref: arxiv.org/abs/1309.7340: Early Stage Influenza Detection from Twitter”

Data Discrimination Means the Poor May Experience a Different Internet


MIT Technology Review: “Data analytics are being used to implement a subtle form of discrimination, while anonymous data sets can be mined to reveal health data and other private information, a Microsoft researcher warned this morning at MIT Technology Review’s EmTech conference.
Kate Crawford, principal researcher at Microsoft Research, argued that these problems could be addressed with new legal approaches to the use of personal data.
In a new paper, she and a colleague propose a system of “due process” that would give people more legal rights to understand how data analytics are used in determinations made against them, such as denial of health insurance or a job. “It’s the very start of a conversation about how to do this better,” Crawford, who is also a visiting professor at the MIT Center for Civic Media, said in an interview before the event. “People think ‘big data’ avoids the problem of discrimination, because you are dealing with big data sets, but in fact big data is being used for more and more precise forms of discrimination—a form of data redlining.”
During her talk this morning, Crawford added that with big data, “you will never know what those discriminations are, and I think that’s where the concern begins.”

The Best American Infographics 2013


New book by Gareth Cook: “The rise of infographics across virtually all print and electronic media—from a striking breakdown of classic cocktails to a graphic tracking 200 influential moments that changed the world to visually arresting depictions of Twitter traffic—reveals patterns in our lives and our world in fresh and surprising ways. In the era of big data, where information moves faster than ever, infographics provide us with quick, often influential bursts of art and knowledge—on the environment, politics, social issues, health, sports, arts and culture, and more—to digest, to tweet, to share, to go viral.
The Best American Infographics captures the finest examples from the past year, including the ten best interactive infographics, of this mesmerizing new way of seeing and understanding our world.”
See also selection of some in Wired.
 

Crowdfunding in the EU – exploring the added value of potential EU action


Press Release: “Following the Workshop on Crowdfunding organised on 3 June 2013 in Brussels, the European Commission has today launched a consultation inviting stakeholders to share their views about crowdfunding: its potential benefits, risks, and the design of an optimal policy framework to unlock the potential of this new form of financing…
Whereas many crowdfunding campaigns are local in nature, others would benefit from easier access to financing within a single European market. But to make sure crowdfunding is not just a momentary trend that fades away, but rather a sustainable source of financing for new European projects, certain safeguards are needed, in particular to ensure people’s trust. The ultimate objective of this consultation is to gather data about the needs of market participants and to identify the areas in which there is a potential added value in EU action to encourage the growth of this new industry, either through facilitative, soft-law measures or legislative action.
The consultation covers all forms of crowdfunding, ranging from donations and rewards to financial investments. Everyone is invited to share their opinion and fill in the on-line questionnaire, including citizens who might contribute to crowdfunding campaigns and entrepreneurs who might launch such campaigns. National authorities and crowdfunding platforms are also particularly encouraged to reply. The consultation will run until 31 December 2013.
See also MEMO/13/847
The consultation is available at:
http://ec.europa.eu/internal_market/consultations/2013/crowdfunding/index_en.htm
Further information:
Workshop on Crowdfunding – 3 June 2013
http://ec.europa.eu/internal_market/conferences/2013/0603-crowdfunding-workshop/
Commissioner Barnier’s speech at the Workshop on Crowdfunding
SPEECH/13/492″

If big data is an atomic bomb, disarmament begins in Silicon Valley


at GigaOM: “Big data is like atomic energy, according to scientist Albert-László Barabási in a Monday column on Politico. It’s very beneficial when used ethically, and downright destructive when turned into a weapon. He argues scientists can help resolve the damage done by government spying by embracing the principles of nuclear nonproliferation that helped bring an end to Cold War fears and distrust.
Barabási’s analogy is rather poetic:

“Powered by the right type of Big Data, data mining is a weapon. It can be just as harmful, with long-term toxicity, as an atomic bomb. It poisons trust, straining everything from human relations to political alliances and free trade. It may target combatants, but it cannot succeed without sifting through billions of data points scraped from innocent civilians. And when it is a weapon, it should be treated like a weapon.”

I think he’s right, but I think the fight to disarm the big data bomb begins in places like Silicon Valley and Madison Avenue. And it’s not just scientists; all citizens should have a role…
I write about big data and data mining for a living, and I think the underlying technologies and techniques are incredibly valuable, even if the applications aren’t always ideal. On the one hand, advances in machine learning from companies such as Google and Microsoft are fantastic. On the other hand, Facebook’s newly expanded Graph Search makes Europe’s proposed right-to-be-forgotten laws seem a lot more sensible.
But it’s all within the bounds of our user agreements, and beauty is in the eye of the beholder.
Perhaps the reason we don’t vote with our feet by moving to web platforms that embrace privacy, even though we suspect it’s being violated, is that we really don’t know what privacy means. Instead of regulating what companies can and can’t do, perhaps lawmakers can mandate a degree of transparency that actually lets users understand how data is being used, not just what data is being collected. Great, some company knows my age, race, ZIP code and web history: What I really need to know is how it’s using that information to target, discriminate against or otherwise serve me.
An intelligent national discussion about the role of the NSA is probably in order. For all anyone knows, it could even turn out we’re willing to put up with more snooping than the government might expect. But until we get a handle on privacy from the companies we choose to do business with, I don’t think most Americans have the stomach for such a difficult fight.”