Five Ways to Make Government Procurement Better


Mark Headd at Civic Innovations:  “Nothing in recent memory has focused attention on the need for wholesale reform of the government IT procurement system more than the troubled launch of healthcare.gov.
There has been a myriad of blog posts, stories and articles written in the last few weeks detailing all of the problems that led to the ignominious launch of the website meant to allow people to sign up for health care coverage.
Though the details of this high profile flop are in the latest headlines, the underlying cause has been talked about many times before – the process by which governments contract with outside parties to obtain IT services is broken…
With all of this in mind, here are – in no particular order – five suggested changes that can be adopted to improve the government procurement process.
Raise the threshold on simplified / streamlined procurement
Many governments use a separate, more streamlined process for smaller projects that do not require a full RFP (in the City of Philadelphia, professional services projects that do not exceed $32,000 annually go through this more streamlined bidding process). In Philadelphia, we’ve had great success in using these smaller projects to test new ideas and strategies for partnering with IT vendors. There is much we can learn from these experiments, and a modest increase to enable more experimentation would allow governments to gain valuable new insights.
Narrowing the focus of any enhanced thresholds for streamlined bidding to web-based projects would help mitigate risk and foster a quicker process for testing new ideas.
Identify clear standards for projects
Having a clear set of vendor-agnostic IT standards to use when developing RFPs and in performing work can make a huge difference in how a project turns out. Clearly articulating standards for:

  • The various components that a system will use.
  • The environment in which it will be housed.
  • The testing it must undergo prior to final acceptance.

…can go a long way toward reducing the risk and uncertainty inherent in IT projects.
It’s worth noting that most governments probably already have a set of IT standards that are usually made part of any IT solicitation. But these standards documents can quickly become out of date – they must undergo constant review and refinement. In addition, many of the people writing these standards may confuse a specific vendor product or platform with a true standard.
Require open source
Requiring that IT projects be open source during development or after completion can be an effective way to reduce risk on an IT project and enhance transparency. This is particularly true of web-based projects.
In addition, government RFPs should encourage the use of existing open source tools – leveraging software components that are already in use in similar projects and maintained by an active community – to foster external participation by vendors and volunteers alike. When governments make the code behind their projects open source, they enable anyone who understands software development to help make those projects better.
Develop a more robust internal capacity for IT project management and implementation
Governments must find ways to develop the internal capacity for developing, implementing and managing technology projects.
Part of the reason that governments make use of a variety of risk mitigation provisions in public bidding is that there is a lack of people in government with hands-on experience building or maintaining technology. There is a dearth of makers in government, and there is a direct relationship between the lack of experienced technologists working in government and the perceived risk that governments take on with new technology projects.
Governments need to find ways to develop a maker culture within their workforces and should prioritize recruitment from the local technology and civic hacking communities.
Make contracting, lobbying and campaign contribution data public as open data
One of the more disheartening revelations to come out of the analysis of the healthcare.gov implementation is that some of the firms awarded work as part of the project also spent non-trivial amounts of money on lobbying. It’s a good bet that this kind of thing happens at the state and local level as well.
This can seriously undermine confidence in the bidding process, and may cause many smaller firms – who lack funds or interest in lobbying elected officials – to simply throw up their hands and walk away.
In the absence of statutory or regulatory changes to prevent this from happening, governments can enhance the transparency of the bidding process by ensuring that contracting data, data on publicly registered lobbyists, and campaign contribution data are all published as open data.
Ensuring that all prospective participants in the public bidding process have confidence that the process will be fair and transparent is essential to getting as many firms as possible to participate – including small firms more adept at agile software development methodologies. More bids typically equate to higher-quality proposals and lower prices.
None of the changes listed above will be easy, and governments differ in how well positioned they are to achieve any one of them. Nor do they represent the entire universe of things we can do to improve the system in the near term – these are simply items that I personally think are important and very achievable.
One thing that could help speed the adoption of these and other changes is the development of a robust communication framework between government contracting and IT professionals in different cities and states. I think a “Municipal Procurement Academy” could go a long way toward achieving this.”

Collaborative Internet Governance: Terms and Conditions of Analysis


New paper by Mathieu O’Neil in the special issue on Contested Internet Governance of the Revue française d’études américaines: “Online projects are communities of practice which attempt to bypass the hierarchies of everyday life and to create autonomous institutions and forms of organisation. A wealth of theoretical frameworks have been put forward to account for these networked actors’ capacity to communicate and self-organise. This article reviews terminology used in Internet research and assesses what it implies for the understanding of regulatory-oriented collective action. In terms of the environment in which interpersonal communication occurs, what differences does it make to speak of “public spheres” or of “public spaces”? In terms of social formations, of “organisations” or “networks”? And in terms of the diffusion of information over the global network, of “contagion” or “trajectories”? Selecting theoretical frames is a momentous decision for researchers, as it authorises or forbids the analysis of different types of behaviour and practices.”
Other papers on Internet Governance in the Revue:
  • Divina Frau-Meigs (Ed.), Conducting Research on the Internet and its Governance
  • The Internet and its Governance: A General Bibliography
  • Glossary of Key Terms and Notions about Internet Governance
  • Julia Pohle and Luciano Morganti, The Internet Corporation for Assigned Names and Numbers (ICANN): Origins, Stakes and Tensions
  • Francesca Musiani et al., Net Neutrality as an Internet Governance Issue: The Globalization of an American-Born Debate
  • Jeanette Hofmann, Narratives of Copyright Enforcement: The Upward Ratchet and the Sleeping Giant
  • Elizabeth Dubois and William H. Dutton, The Fifth Estate in Internet Governance: Collective Accountability of a Canadian Policy Initiative
  • Mathieu O’Neil, Collaborative Internet Governance: Terms and Conditions of Analysis
  • Peng Hwa Ang and Natalie Pang, Globalization of the Internet, Sovereignty or Democracy: The Trilemma of the Internet Governance Forum

Special issue of First Monday: "Making data — Big data and beyond"


Introduction by Rasmus Helles and Klaus Bruhn Jensen: “Data are widely understood as minimal units of information about the world, waiting to be found and collected by scholars and other analysts. With the recent prominence of ‘big data’ (Mayer–Schönberger and Cukier, 2013), the assumption that data are simply available and plentiful has become more pronounced in research as well as public debate. Challenging and reflecting on this assumption, the present special issue considers how data are made. The contributors take big data and other characteristic features of the digital media environment as an opportunity to revisit classic issues concerning data — big and small, fast and slow, experimental and naturalistic, quantitative and qualitative, found and made.
Data are made in a process involving multiple social agents — communicators, service providers, communication researchers, commercial stakeholders, government authorities, international regulators, and more. Data are made for a variety of scholarly and applied purposes, oriented by knowledge interests (Habermas, 1971). And data are processed and employed in a whole range of everyday and institutional contexts with political, economic, and cultural implications. Unfortunately, the process of generating the materials that come to function as data often remains opaque and certainly under–documented in the published research.
The following eight articles seek to open up some of the black boxes from which data can be seen to emerge. While diverse in their theoretical and topical focus, the articles generally approach the making of data as a process that is extended in time and across spatial and institutional settings. In the common culinary metaphor, data are repeatedly processed, rather than raw. Another shared point of attention is meta–data — the type of data that bear witness to when, where, and how other data such as Web searches, e–mail messages, and phone conversations are exchanged, and which have taken on new, strategic importance in digital media. Last but not least, several of the articles underline the extent to which the making of data as well as meta–data is conditioned — facilitated and constrained — by technological and institutional structures that are inherent in the very domain of analysis. Researchers increasingly depend on the practices and procedures of commercial entities such as Google and Facebook for their research materials, as illustrated by the pivotal role of application programming interfaces (API). Research on the Internet and other digital media also requires specialized tools of data management and analysis, calling, once again, for interdisciplinary competences and dialogues about ‘what the data show.’”
See Table of Contents

The Best American Infographics 2013


New book by Gareth Cook: “The rise of infographics across virtually all print and electronic media—from a striking breakdown of classic cocktails to a graphic tracking 200 influential moments that changed the world to visually arresting depictions of Twitter traffic—reveals patterns in our lives and our world in fresh and surprising ways. In the era of big data, where information moves faster than ever, infographics provide us with quick, often influential bursts of art and knowledge—on the environment, politics, social issues, health, sports, arts and culture, and more—to digest, to tweet, to share, to go viral.
The Best American Infographics captures the finest examples from the past year, including the ten best interactive infographics, of this mesmerizing new way of seeing and understanding our world.”
See also a selection of them in Wired.

The transition towards transparency


Roland Harwood at the Open Data Institute Blog: “It’s a very exciting time for the field of open data, especially in the UK public sector which is arguably leading the world in this emerging discipline right now, in no small part thanks to the efforts of the Open Data Institute. There is a strong push to release public data and to explore new innovations that can be created as a result.
For instance, the Ordnance Survey have been leading the way with opening up half of their data for others to use, complemented by their GeoVation programme which provides support and incentive for external innovators to develop new products and services.
More recently the Technology Strategy Board have been working with the likes of NERC, Met Office, Environment Agency and other public agencies to help solve business problems using environmental data.
It goes without saying that data won’t leap up and create any value by itself, any more than a pile of discarded parts outside a factory will assemble itself into a car. We’ve found that the secret of successful open data innovation is to work with people who are trying to solve a specific problem. Simply releasing the data is not enough. See below a summary of our Do’s and Don’ts of opening up data.
Do…

  • Make sure data quality is high (ODI Certificates can help!)
  • Promote innovation using data sets. Transparency is only a means to an end
  • Enhance communication with external innovators
  • Make sure your co-creators are incentivised
  • Get organised, create a community around an issue
  • Pass on learnings to other similar organisations
  • Experiment – open data requires new mindsets and business models
  • Create safe spaces – Innovation Airlocks – to share and prototype with trusted partners
  • Be brave – people may do things with the data that you don’t like
  • Set out to create commercial or social value with data

Don’t…

  • Just release data and expect people to understand or create with it. Publication is not the same as communication
  • Wait for data requests, put the data out first informally
  • Avoid challenges to current income streams
  • Go straight for the finished article, use rapid prototyping
  • Be put off by the tensions between confidentiality, data protection and publishing
  • Wait for the big budget or formal process but start big things with small amounts now
  • Be technology led, be business led instead
  • Expect the community to entirely self-manage
  • Restrict open data to the IT literate – create interdisciplinary partnerships
  • Get caught in the false dichotomy that is commercial vs. social

In summary we believe we need to assume openness as the default (for organisations that is, not individuals) and secrecy as the exception – the exact opposite to how most commercial organisations currently operate. …”
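The first item on the Do list above – making sure data quality is high – is easy to state but often skipped in practice. As a purely illustrative sketch (not part of the ODI guidance; the file name and column schema below are hypothetical placeholders), the following Python snippet shows one way a publisher might run a basic pre-publication check on a CSV release, counting rows and flagging missing columns and empty values:

    import csv
    from collections import Counter

    # Hypothetical column names for an open contracts dataset; replace with your own schema.
    EXPECTED_COLUMNS = ["contract_id", "supplier", "award_date", "value"]

    def quality_report(path):
        """Print row count, missing expected columns, and per-column empty-value counts."""
        with open(path, newline="", encoding="utf-8") as handle:
            reader = csv.DictReader(handle)
            empty = Counter()
            rows = 0
            for row in reader:
                rows += 1
                for column, value in row.items():
                    # Count cells that are absent or contain only whitespace.
                    if value is None or (isinstance(value, str) and not value.strip()):
                        empty[column] += 1
            header = reader.fieldnames or []

        print(f"{rows} rows, {len(header)} columns")
        for column in EXPECTED_COLUMNS:
            if column not in header:
                print(f"Expected column missing: {column}")
        for column in header:
            if empty[column]:
                print(f"Column '{column}' has {empty[column]} empty values")

    if __name__ == "__main__":
        quality_report("contracts.csv")  # hypothetical file name

A check like this is no substitute for an ODI Open Data Certificate, but it catches the most common problems – absent columns and blank fields – before a dataset goes out.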

Using Participatory Crowdsourcing in South Africa to Create a Safer Living Environment


New Paper by Bhaveer Bhana, Stephen Flowerday, and Aharon Satt in the International Journal of Distributed Sensor Networks: “The increase in urbanisation is making the management of city resources a difficult task. Data collected through observations (utilising humans as sensors) of the city surroundings can be used to improve decision making in terms of managing these resources. However, the data collected must be of a certain quality in order to ensure that effective and efficient decisions are made. This study is focused on the improvement of emergency and non-emergency services (city resources) through the use of participatory crowdsourcing (humans as sensors) as a data collection method (collect public safety data), utilising voice technology in the form of an interactive voice response (IVR) system.
The study illustrates how participatory crowdsourcing (specifically humans as sensors) can be used as a Smart City initiative focusing on public safety by illustrating what is required to contribute to the Smart City, and developing a roadmap in the form of a model to assist decision making when selecting an optimal crowdsourcing initiative. Public safety data quality criteria were developed to assess and identify the problems affecting data quality.
This study is guided by design science methodology and applies three driving theories: the Data Information Knowledge Action Result (DIKAR) model, the characteristics of a Smart City, and a credible Data Quality Framework. Four critical success factors were developed to ensure high quality public safety data is collected through participatory crowdsourcing utilising voice technologies.”

New book: "Crowdsourcing"


New book by Jean-Fabrice Lebraty and Katia Lobre-Lebraty on Crowdsourcing: “Crowdsourcing is a relatively recent phenomenon that only appeared in 2006, but it continues to grow and diversify (crowdfunding, crowdcontrol, etc.). This book aims to review this concept and show how it leads to the creation of value and new business opportunities.
Chapter 1 is based on four examples: the online-banking sector, an informative television channel, the postal sector and the higher education sector. It shows that in the current context, for a company facing challenges, the crowd remains an untapped resource. The next chapter presents crowdsourcing as a new form of externalization and offers definitions of crowdsourcing. In Chapter 3, the authors attempt to explain how a company can create value by means of a crowdsourcing operation. To do this, the authors use a model linking types of value, types of crowd, and the means by which these crowds are accessed.
Chapter 4 examines in detail various forms that crowdsourcing may take, by presenting and discussing ten types of crowdsourcing operation. In Chapter 5, the authors imagine and explore the ways in which the dark side of crowdsourcing might be manifested and Chapter 6 offers some insight into the future of crowdsourcing.
Contents
1. A Turbulent and Paradoxical Environment.
2. Crowdsourcing: A New Form of Externalization.
3. Crowdsourcing and Value Creation.
4. Forms of Crowdsourcing.
5. The Dangers of Crowdsourcing.
6. The Future of Crowdsourcing.”

Swarm-Based Medicine


Paul Martin Putora and Jan Oldenburg in the Journal of Medical Internet Research: “Humans, armed with Internet technology, exercise crowd intelligence in various spheres of social interaction ranging from predicting elections to company management. Internet-based interaction may result in different outcomes, such as improved response capability and decision-making quality.
The direct comparison of swarm-based medicine with evidence- or eminence-based medicine is interesting, but these concepts should be perceived as complementing each other and working independently of each other. Optimal decision making depends on a balance of personal knowledge and swarm intelligence, taking into account the quality of each, with their weight in decisions being adapted accordingly. The possibility of balancing controversial standpoints and achieving acceptable conclusions for the majority of participants has been an important task of scientific and medical conferences since the Age of Enlightenment in the 17th and 18th centuries. Our swarm continues with this interconnecting synchronization at an unprecedented speed and is, thanks to eVotes, Internet forums, and the like, more reactive than ever. Faster changes in our direction of movement, like a school of fish, are becoming possible. Information spreads from one individual to another. It is unconscious, but with our own dance we influence the rest of the beehive.
Within an environment, individual behavior determines the behavior of the collective and vice versa. Internet technology has dramatically changed the environment we behave in. Traditionally, medical information was provided to patients as well as to physicians by experts. This intermediation was characterized by an expert standing between sources of information and the user. Currently, and probably even more so in the future, Web 2.0 and appropriate algorithms enable users to rely on the guidance or behavior of their peers in selecting and consuming information. This is one of many processes facilitated by medicine 2.0 and is described as “apomediation”. Apomediation, whether implicit or explicit, increases the influence of individuals on others. For an individual to adapt its behavior within a swarm, other individuals need to be perceived and their actions reacted upon. Through apomediation, more individuals take part in the swarm.
Our patients are better informed; second opinions can be sought via the Internet within hours. Our individual behavior is influenced by online resources as well as digital communication with our colleagues. This change in individual behavior influences the way we find, understand, and adopt guidelines. Societies representing larger groups within the swarms use this technology to create recommendations. This process is influenced by individuals and previous actions of the community; these then in return influence individual behavior. Information technology has a major impact on the lifecycle of guidelines and recommendations. There is no entry and exit point for IT in this regard. With increasing influence on individual behavior, its influence on collective behavior increases, influencing the other direction to the same extent.
Dynamic changes in movement of the swarm and within the swarm may lead to individuals leaving the herd. These may influence the herd to move in the direction of the outliers. At the same time, an individual leaving a flock or swarm is exposed. Physicians as well as clinical centers expose themselves when they leave the group for the sake of innovation. Negative results and failure might lead to legal exposure should treatments fail.
The perception of swarm behavior itself changes the way we approach guidelines. When several guidelines are published, being aware of them as a result of interaction increases our awareness of bias. Major deviations from other recommendations warrant scrutiny. The perception of swarm behavior and embracing the knowledge of the swarm may lead to an optimized use of resources. Information that has already been obtained may be incorporated directly by agents, enabling them to build on this and establish new knowledge—as social learning agents.”

A Manifesto for Smart Citizens


Frank Kresin from the Waag Society: “We, citizens of all cities, take the fate of the places we live in into our own hands. We care about the familiar buildings and the parks, the shops, the schools, the roads and the trees, but far more about the quality of the life we live in them. About the casual interactions, uncalled for encounters, the craze and the booze and the love we lost and found. We know that our lives are interconnected, and what we do here will impact the outcomes over there. While we can never predict the eventual effect of our actions, we take full responsibility to make this world a better place.
Therefore, we will refuse to be consumers, clients and informants only, and reclaim agency over the processes, algorithms and systems that shape our world. We need to know how decisions are made, we need to have the information that is at hand; we need to have direct access to the people in power, and be involved in the crafting of laws and procedures that we grapple with every day.
Fortunately, we are not alone. We are well educated and have appropriated the tools to connect at the touch of a button, organize ourselves, make our voices heard. We have the tools to measure ourselves and our environment, to visualize and analyse the data, to come to conclusions and take action. We have continuous access to the best of learning in the world, to powerful phones and laptops and software, and to home-grown labs that help us make the things that others won’t. Furthermore we were inspired by such diverse examples as the 1% club, Avaaz, Kickstarter, Couchsurfing, Change by Us, and many, many more.
We are ready. But government is not. It was shaped in the 18th century, but increasingly struggles with 21st century problems it cannot solve. It lost touch with its citizens and is less and less equipped to provide the services and security it had pledged to offer. While it tries to build ‘smart cities’ that reinforce or strengthen the status quo – that was responsible for the problems in the first place – it loses sight of the most valuable resource it can tap into: the smart citizen.
Smart Citizens:

  • Will take responsibility for the place they live, work and love in;
  • Value access over ownership, contribution over power;
  • Will ask forgiveness, not permission;
  • Know where they can get the tools, knowledge and support they need;
  • Value empathy, dialogue and trust;
  • Appropriate technology, rather than accept it as is;
  • Will help the people that struggle with smart stuff;
  • Ask questions, then more questions, before they come up with answers;
  • Actively take part in design efforts to come up with better solutions;
  • Work agile, prototype early, test quickly and know when to start over;
  • Will not stop in the face of seemingly huge barriers;
  • Unremittingly share their knowledge and their learning, because they know this is where true value comes from.

All over the world, smart citizens take action. We self-organise, form cooperatives, share resources and take back full responsibility for the care of our children and elderly. We pop up restaurants, harvest renewable energy, maintain urban gardens, build temporary structures and nurture compassion and trust. We kickstart the products and services we care about, repair and upcycle, or learn how to manufacture things ourselves. We even coined new currencies in response to events that recently shook our comfortable world, but were never solved by the powers that be.
Until now, we have mostly worked next to governments, sometimes against them, but hardly ever with them. As a result, many of the initiatives so far have been one-offs, inspiring but not game changing. We have put lots of energy into small-scale interventions that briefly flared and then returned to business as usual. Just imagine what will happen if our energy, passion and knowledge are teamed up by governments that know how to implement and scale up. Governments that take full responsibility for participating in the open dialogue that is needed to radically rethink the systems that were built decades ago.
One day we will wake up and realise WE ARE OUR GOVERNMENT. Without us, there is nobody there. As it takes a village to raise a child, it takes a people to craft a society. We know it can be done; it was done before. And with the help of new technologies it is easier than ever. So let’s actively set out to build truly smart cities, with smart citizens at their helms, and together become the change that we want to see in this world.”

Riding the Waves or Caught in the Tide? Navigating the Evolving Information Environment


IFLA Trend Report: “In the global information environment, time moves quickly and there’s an abundance of commentators trying to keep up. With each new technological development, a new report emerges assessing its impact on different sectors of society. The IFLA Trend Report takes a broader approach and identifies five high level trends shaping the information society, spanning access to education, privacy, civic engagement and transformation. Its findings reflect a year’s consultation with a range of experts and stakeholders from different disciplines to map broader societal changes occurring, or likely to occur in the information environment.
The IFLA Trend Report is more than a single document – it is a selection of resources to help you understand where libraries fit into a changing society.
From Five Key Trends Which Will Change Our Information Environment:
Trend 1:
New Technologies Will Both Expand and Limit Who Has Access to Information…
Trend 2:
Online Education Will Democratise and Disrupt Global Learning…
Trend 3:
The Boundaries of Privacy and Data Protection Will Be Redefined…
Trend 4:
Hyper-Connected Societies Will Listen to and Empower New Voices and Groups… In hyper-connected societies, more opportunities for collective action are being realised – enabling the rise of new voices and promoting the growth of single-issue movements at the expense of traditional political parties. Open government initiatives and access to public sector data are leading to more transparency and citizen-focused public services.
Trend 5:
The Global Information Economy Will Be Transformed by New Technologies…”