Setting Government Procurement Data Free


Colin Wood in GovTech: “A new website may help drive down government procurement costs and make it easier for startups to sell their goods and services.

The website, called Open Procure, launched earlier this month and is the latest side project of Alan Mond, CEO and co-founder of Munirent, the inter-jurisdictional equipment sharing service. Mond says the website is an experiment that he hopes will start conversations about procurement and ultimately prove beneficial for government and startups alike.

The website is simply a list of procurement thresholds for local and state government agencies nationwide. As of two weeks after launch, the website features thresholds for 59 agencies, many of which provide links to the original data sources. Users can see that in Boston, for instance, the city’s discretionary procurement threshold is $5,000 and the formal threshold is $25,000. So any startup wanting to sell goods or services to Boston — but avoid a public competitive bid process — can see that they need to keep their cost under $25,000. If they want to avoid competition altogether, they need to keep it under $5,000.
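The threshold rules described above amount to a simple lookup-and-compare. As a minimal sketch (the data structure and function names are hypothetical, not Open Procure's actual data or API), classifying a quoted price against an agency's thresholds might look like this:

```python
# Illustrative only: Boston's figures come from the article; the structure
# and the Philadelphia formal threshold are assumptions for the example.
THRESHOLDS = {
    # agency: (discretionary_threshold, formal_threshold)
    "Boston": (5_000, 25_000),
    "Philadelphia": (32_000, None),  # formal threshold omitted here
}

def procurement_path(agency: str, quote: float) -> str:
    """Classify a quoted price against an agency's procurement thresholds."""
    discretionary, formal = THRESHOLDS[agency]
    if quote < discretionary:
        return "discretionary"   # no competitive bids required
    if formal is None or quote < formal:
        return "informal"        # multiple quotes, but no formal public bid
    return "formal"              # public competitive bid process

print(procurement_path("Boston", 4_500))   # discretionary
print(procurement_path("Boston", 20_000))  # informal
print(procurement_path("Boston", 30_000))  # formal
```

A startup quoting $20,000 to Boston would avoid the formal bid process but still face informal competition; the same quote to Philadelphia would fall under the discretionary threshold entirely.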

The website also creates a broader discussion around threshold inconsistency. In Philadelphia, for instance, the discretionary threshold is $32,000, compared to Boston’s $5,000, which means Philadelphia can procure without taking multiple bids on considerably larger projects. This is useful information for businesses, Mond pointed out, but also a conversation starter for the public sector. Do these disparities between different states, cities and counties exist for a good reason, or are they decided somewhat arbitrarily and left in the municipal code to rot?…(More)”

Demystifying the hackathon


Ferry Grijpink, Alan Lau, and Javier Vara at McKinsey: “The “hackathon” has become one of the latest vogue terms in business. Typically used in reference to innovation jams like those seen at Rails Rumble or TechCrunch Disrupt, it describes an event that pools eager entrepreneurs and software developers into a confined space for a day or two and challenges them to create a cool killer app. Yet hackathons aren’t just for the start-up tech crowd. Businesses are employing the same principles to break through organizational inertia and instill more innovation-driven cultures. That’s because they offer a baptism by fire: a short, intense plunge that assaults the senses and allows employees to experience creative disruption in a visceral way.

For large organizations in particular, hackathons can be adapted to greatly accelerate the process of digital transformation. They are less about designing new products and more about “hacking” away at old processes and ways of working. By giving management and others the ability to kick the tires of collaborative design practices, 24-hour hackathons can show that big organizations are capable of delivering breakthrough innovation at start-up speed. And that’s never been more critical: speed and agility are today central to driving business value, making hackathons a valuable tool for accelerating organizational change and fostering a quick-march, customer-centric, can-do culture.

What it takes to do a good 24-hour hackathon

A 24-hour hackathon differs from more established brainstorming sessions in that it is all about results and jump-starting a way of working, not just idea generation. However, done well, it can help shave 25 to 50 percent from the time it takes to bring a service or product to market. The best 24-hour hackathons share several characteristics. They are:

  • Centered on the customer. A hackathon is focused on a single customer process or journey and supports a clear business target—for example, speed, revenue growth, or a breakthrough customer experience. It goes from the front to the back, starting with the customer experience and moving through various organizational and process steps that come into play to deliver on that interaction and the complete customer journey.
  • Deeply cross-functional. This is not just for the IT crowd. Hackathons bring together people from across the business to force different ways of working a problem. In addition to IT and top management, whose involvement as participants or as sponsors is critical, hackathon participants can include frontline personnel, brand leaders, user-experience specialists, customer service, sales, graphic designers, and coders. That assortment forces a range of perspectives to keep groupthink at bay while intense deadlines dispense with small talk and force quick, deep collaboration.
  • Starting from scratch. Successful hackathons deliberately challenge participants to reimagine an idealized method for addressing a given customer need, such as taking a paper-based, offline account-opening procedure and turning it into a simple, single-step, self-service online process. There’s an intentional irreverence in this disruption, too. Participants go in knowing that everything can and should be challenged. That’s liberating. The goal is to toss aside traditional notions of how things are done and reimagine the richest, most efficient way to improve the customer experience.
  • Concrete and focused on output. Sessions start with ideas but end with a working prototype that people can see and touch, such as clickable apps or a 3-D printed product (exhibit). Output also includes a clear development path that highlights all the steps needed, including regulatory, IT, and other considerations, to accelerate production and implementation. After an intense design workshop, which includes sketching a minimum viable product and overnight coding and development of the prototype, a 24-hour hackathon typically concludes with an experiential presentation to senior leaders. This management showcase includes a real-life demonstration of the new prototype and a roadmap of IT and other capabilities needed to bring the final version to market in under 12 weeks.
  • Iterative and continuous. Once teams agree on a basic experience, designers and coders go to work creating a virtual model that the group vets, refines and re-releases in continual cycles until the new process or app meets the desired experience criteria. When hackathons end, there is usually a surge of enthusiasm and energy. But that energy can dissipate unless management puts in place new processes to sustain the momentum. That includes creating mechanisms for frontline employees to report back on progress and rewards for adopting new behaviors….(More)”

Statactivism: Forms of Action between Disclosure and Affirmation


Paper by Bruno Isabelle, Didier Emmanuel and Vitale Tommaso: “This article introduces the special issue on statactivism, a particular form of action within the repertoire used by contemporary social movements: the mobilization of statistics. Traditionally, statistics was used by the worker movement within class conflicts. But in the current configuration of state restructuring, new accumulation regimes, and changes in work organization in capitalist societies, the activist use of statistics is shifting. This first article seeks to show the use of statistics and quantification in contentious performances connected with state restructuring, major transformations in the varieties of capitalism, and changes in work organization regimes. The double role of statistics in representing as well as criticizing reality is considered. After showing how important statistical tools are in producing a shared reading of reality, we discuss the two main dimensions of statactivism – disclosure and affirmation. In other words, we examine the role of stat-activists in denouncing a certain state of reality, and then the efforts to use statistics to create equivalency among disparate conditions and to cement emerging social categories. Finally, we present the main contributions of the various research papers in this special issue regarding the use of statistics as a form of action within a larger repertoire of contentious action. Six empirical papers focus on statactivism against the penal machinery in the early 1970s (Grégory Salle), on the mobilisation over the price index in Guadeloupe in 2009 (Boris Samuel) and in Argentina in 2007 (Celia Lury and Ana Gross), on the mobilisation of experts to consolidate a link between working conditions and health issues (Marion Gilles), on the production of activity data for disability policy in France (Pierre-Yves Baudot), and on the use of statistics in social mobilizations for gender equality (Eugenia De Rosa).
The last paper, by Alain Desrosières, deals with mobilizations proposing innovations in the way of measuring inflation, unemployment, poverty, GDP, and climate change. This special issue is dedicated to him, in order to honor his everlasting intellectual legacy….(More)”

 

Lawyer’s crowdsourcing site aims to help people have their day in court


In The Guardian: “With warnings coming thick and fast about the stark ramifications of the government’s sweeping cuts to legal aid, it was probably inevitable that someone would come up with a new way to plug some gaps in access to justice. Enter the legal crowdfunder, CrowdJustice, an online platform where people who might not otherwise get their case heard can raise cash to pay for legal representation and court costs.

The brainchild of 33-year-old lawyer Julia Salasky, and the first of its kind in the UK, CrowdJustice provides people who have a public interest case but lack adequate financial resources with a forum where they can publicise their case and, if all goes to plan, generate funding for legal action by attracting public support and donations.

“We are trying to increase access to justice – that’s the baseline,” says Salasky. “I think it’s a social good.”

The platform was launched just a few months ago, but has already attracted a range of cases both large and small, including some that could set important legal precedents.

CrowdJustice has helped the campaign group JENGbA (Joint Enterprise: Not Guilty by Association) to raise funds to intervene in a supreme court case considering reform of the law of joint enterprise, which can find people guilty of a crime, including murder, committed by someone else. The group amassed £10,000 in donations for legal assistance as part of its ongoing challenge to the legal doctrine of “joint enterprise”, which disproportionately prosecutes people from black and minority ethnic backgrounds for violent crimes where it is alleged they have acted together for a common purpose.

In another case, a Northern Irish woman who discovered she wasn’t entitled to her partner’s occupational pension after he died because of a bureaucratic requirement that did not apply to married couples, used CrowdJustice to help raise money to take her case all the way to the supreme court. “If she wins, it will have an enormous precedent-setting value for the legal rights of all couples who cohabit,” Salasky says….(The Guardian)”

Partnership Governance in Public Management


A Public Solutions Handbook by Seth A. Grossman and Marc Holzer: “The ability to create and sustain partnerships is a skill and a strategic capacity that utilizes the strengths and offsets the weaknesses of each actor. Partnerships between the public and private sectors allow each to enjoy the benefits of the other: the public sector benefits from increased entrepreneurship and the private sector utilizes public authority and processes to achieve economic and community revitalization. Partnership Governance in Public Management describes what partnership is in the public sector, as well as how it is managed, measured, and evaluated. Both a theoretical and practical text, this book is a what, why, and how examination of a key function of public management. Examining governing capacity, community building, downtown revitalization, and partnership governance through the lens of formalized public-private partnerships – specifically, how these partnerships are understood and sustained in our society – this book is essential reading for students and practitioners with an interest in partnership governance and public administration and management more broadly. Chapters explore partnering technologies as a way to bridge sectors, to produce results and a new sense of public purpose, and to form a stable foundation for governance to flourish….(More)”

Privacy Bridges: EU and US Privacy Experts in Search of Transatlantic Privacy Solutions


IVIR and MIT: “The EU and US share a common commitment to privacy protection as a cornerstone of democracy. Following the Treaty of Lisbon, data privacy is a fundamental right that the European Union must proactively guarantee. In the United States, data privacy derives from constitutional protections in the First, Fourth and Fifth Amendments, as well as from federal and state statutes, consumer protection law and common law. The ultimate goal of effective privacy protection is shared. However, current friction between the two legal systems poses challenges to realizing privacy and the free flow of information across the Atlantic. The recent expansion of online surveillance practices underlines these challenges.

Over nine months, the group prepared a consensus report outlining a menu of privacy “bridges” that can be built to bring the European Union and the United States closer together. The efforts are aimed at providing a framework of practical options that advance strong, globally-accepted privacy values in a manner that respects the substantive and procedural differences between the two jurisdictions….

(More)”

Room for a View: Democracy as a Deliberative System


Involve: “Democratic reform comes in waves, propelled by technological, economic, political and social developments. There are periods of rapid change, followed by relative quiet.

We are currently in a period of significant political pressure for change to our institutions of democracy and government. With so many changes under discussion it is critically important that those proposing and carrying out reforms understand the impact that different reforms might have.

Most discussions of democratic reform focus on electoral democracy. However, for all their importance in the democratic system, elections rarely reveal what voters think clearly enough for elected representatives to act on them. Changing the electoral system will not alone significantly increase the level of democratic control held by citizens.

Room for a View, by Involve’s director Simon Burall, looks at democratic reform from a broader perspective than that of elections. Drawing on the work of democratic theorists, it uses a deliberative systems approach to examine the state of UK democracy. Rather than focusing exclusively on the extent to which individuals and communities are represented within institutions, it is equally concerned with the range of views present and how they interact.

Adapting the work of the democratic theorist John Dryzek, the report identifies seven components of the UK’s democratic system, describing and analysing the condition of each in turn. Assessing the UK’s democracy through this lens reveals it to be in fragile health. The representation of alternative views and narratives in all of the UK system’s seven components is poor, the components are weakly connected and, despite some positive signs, deliberative capacity is decreasing.

Room for a View suggests that a focus on the key institutions isn’t enough. If the health of UK democracy is to be improved, we need to move away from thinking about the representation of individual voters to thinking about the representation of views, perspectives and narratives. Doing this will fundamentally change the way we approach democratic reform….(More)”

Open data, open mind: Why you should share your company data with the world


Mark Samuels at ZDnet: “If information really is the lifeblood of modern organisations, then CIOs could create huge benefits from opening their data to new, creative pairs of eyes. Research from consultant McKinsey suggests that seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data: that is, taking previously proprietary data (often starting with public sector data) and opening up access.

So, should your business consider giving outsiders access to insider information? ZDNet speaks to three experts.

More viewpoints can mean better results

Former Tullow Oil CIO Andrew Marks says debates about the potential openness of data in a private sector context are likely to be dominated by one major concern: information security.

“It’s a perfectly reasonable debate until people start thinking about privacy,” he says. “Putting information at risk, both in terms of customer data and competitive advantage, will be a risk too far for many senior executives.”

But what if CIOs could allay c-suite peers’ concerns and create a new opportunity? Marks points to the Goldcorp Challenge, which saw the mining specialist share its proprietary geological data to allow outside experts to pick likely spots for mining. The challenge, which included prize money of $575,000, helped identify more than 110 sites, 50 per cent of which were previously unknown to the company. The value of gold found through the competition exceeded $6bn. Marks wonders whether other firms could take similarly brave steps.

“There is a period of time when information is very sensitive,” he says. “Once the value of data starts to become finite, then it might be beneficial for businesses to open the doors and to let outsiders play with the information. That approach, in terms of gamification, might lead to the creation of new ideas and innovations.”…

Marks says these projects help prove that, when it comes to data, more is likely to mean different – and possibly better – results. “Whether using big data algorithms or the human touch, the more viewpoints you bring together, the more you increase your chances of success and reduce risk,” he says.

“There is, therefore, always likely to be value in seeking an alternative perspective. Opening access to data means your firm is going to get more ideas, but CIOs and other senior executives need to think very carefully about what such openness means for the business, and the potential benefits.”….Some leading firms are already taking steps towards openness. Take Christina Scott, chief product and information officer at the Financial Times, who says the media organisation has used data analysts to help push the benefits of information-led insight across the business.

Her team has democratised data in order to make sure that all parts of the organisation can get the information they need to complete their day-to-day jobs. Scott says the approach is best viewed as an open data strategy, but within the safe confines of the existing enterprise firewall. While the tactic is internally focused currently, Scott says the FT is keen to find ways to make the most of external talent in the future.

“We’re starting to consider how we might open data beyond the organisation, too,” she says. “Our data holds a lot of value and insight, including across the metadata we’ve created. So it would be great to think about how we could use that information in a more open way.” Part of the FT’s business includes trade-focused magazines. Scott says opening the data could provide new insight to its B2B customers across a range of sectors. In fact, the firm has already dabbled at a smaller scale.

“We’ve run hackathons, where we’ve exposed our APIs and given people the chance to come up with some new ideas,” she says. “But I don’t think we’ve done as much work on open data as we could. And I think that’s the direction in which better organisations are moving. They recognise that not all innovation is going to happen within the company.”…

CIO Omid Shiraji is another IT expert who recognises that there is a general move towards a more open society. Any executive who expects to work within a tightly defined enterprise firewall is living in cloud cuckoo land, he argues. More to the point, they will miss out on big advantages.

“If you can expose your sources to a range of developers, you can start to benefit from massive innovation,” he says. “You can get really big benefits from opening your data to external experts who can focus on areas that you don’t have the capability to develop internally.”

Many IT leaders would like to open data to outside experts, suggests Shiraji. For CIOs who are keen to expose their sources, he suggests letting small-scale developers take a close look at in-house data silos in an attempt to discover what relationships might exist and what advantages could accrue….(More)”

Introducing Government as a Platform


Peter Williams, Jan Gravesen and Trinette Brownhill in Government Executive: “Governments around the world are facing competitive pressures and expectations from their constituents that are prompting them to innovate and dissolve age-old structures. Many governments have introduced a digital strategy in which at least one of the goals is aimed at bringing their organizations closer to citizens and businesses.

To achieve this, ideally IT and data in government would not be constrained by the different functional towers that make up the organization, as is often the case. They would not be constrained by complex, monolithic application design philosophies and lengthy implementation cycles, nor would development be constrained by the assumption that all activity has to be executed by the government itself.

Instead, applications would be created rapidly and cheaply, and modules would be shared as reusable blocks of code and integrated data. It would be relatively straightforward to integrate data from multiple departments to enable a focus on the complex needs of, say, a single parent who is diabetic and a student. Delivery would be facilitated in the manner best required, or preferred, by the citizen. Third parties would also be able to access these modules of code and data to build higher value government services that multiple agencies would then buy into. The code would run on a cloud infrastructure that maximizes the efficiency in which processing resources are used.

GaaP is an organized set of ideas and principles that allows organizations to approach these ideals. It allows governments to institute more efficient sharing of IT resources, as well as to unlock data and functionality via application programming interfaces so that third parties can build higher-value citizen services. In doing so, security plays a crucial role in protecting the privacy of constituents and enterprise assets.

We see increasingly well-established examples of GaaP services in many parts of the world. The notion has significantly influenced strategic thinking in the UK, Australia, Denmark, Canada and Singapore. In particular, it has evolved in a deliberate way in the UK’s Government Digital Service, building on the Blairite notion of “joined-up government”; in Australia’s e-government strategy and its myGov program; and as a significant influencer in Singapore’s entire approach to building its “smarter nation” infrastructure.

Collaborative Government

GaaP assumes a transformational shift in efficiency, effectiveness and transparency, in which agencies move toward a collaborative government and away from today’s siloed approach. That collaboration may be among agencies, but also with other entities (nongovernmental organizations, the private sector, citizens, etc.).

GaaP’s focus on collaboration enables public agencies to move away from their traditional towered approach to IT and increasingly make use of shared and composable services offered by a common – usually a virtualized, cloud-enabled – platform. This leads to more efficient use of development resources, platforms and IT support. We are seeing examples of this already with a group of townships in New York state and also with two large Spanish cities that are embarking on this approach.

While efficient resource and service sharing is central to the idea of GaaP, it is not sufficient. The idea is that GaaP must allow app developers, irrespective of whether they are citizens, private organizations or other public agencies, to develop new value-added services using published government data and APIs. In this sense, the platform becomes a connecting layer between public agencies’ systems and data on the one hand, and private citizens, organizations and other public agencies on the other.

In its most fundamental form, GaaP is able to:

  • Consume data and government services from existing departmental systems.
  • Consume syndicated services from platform-as-a-service or software-as-a-service providers in the public marketplace.
  • Securely unlock these data and services and allow third parties – citizens, private organizations or other agencies – to combine services and data into higher-order services or more citizen-centric or business-centric services.
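
The three capabilities above can be sketched in a few lines of code. This is a hypothetical illustration only: the endpoints, field names, and data are inventions for the example, not any government's real API, and the departmental calls are stubbed out where live HTTP requests would go.

```python
import json

def fetch_benefits(citizen_id):
    # Stand-in for GET /api/benefits/{id} on a social-services system
    # (capability 1: consume data from an existing departmental system).
    return {"citizen_id": citizen_id, "programs": ["diabetes-support"]}

def fetch_enrolment(citizen_id):
    # Stand-in for GET /api/education/{id} on an education system.
    return {"citizen_id": citizen_id, "status": "enrolled-part-time"}

def composite_profile(citizen_id):
    """Join data from two agencies into one view -- the 'higher-order
    service' that securely unlocked APIs would let a third party build."""
    return {
        "citizen_id": citizen_id,
        "health_programs": fetch_benefits(citizen_id)["programs"],
        "student_status": fetch_enrolment(citizen_id)["status"],
    }

print(json.dumps(composite_profile("c-123"), indent=2))
```

The composed view echoes the article's earlier example of serving a single parent who is diabetic and a student: neither department alone holds the whole picture, but a platform exposing both APIs lets a third party assemble it.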

It is the openness, the secure interoperability, and the ability to compose new services on the basis of existing services and data that define the nature of the platform.

The Challenges

At one time, the challenge of creating a GaaP structure would have been technology. Today, it is governance….(More)”

Big data problems we face today can be traced to the social ordering practices of the 19th century.


Hamish Robertson and Joanne Travaglia in LSE’s The Impact Blog: “This is not the first ‘big data’ era but the second. The first was the explosion in data collection that occurred from the early 19th century – Hacking’s ‘avalanche of numbers’, precisely situated between 1820 and 1840. This was an analogue big data era, different to our current digital one but characterized by some very similar problems and concerns. Contemporary problems of data analysis and control are deemed ‘big’ for a variety of accepted reasons, generally including size, complexity and technology issues. We also suggest that digitisation is a central process in this second big data era, one that seems obvious but which also appears to have reached a new threshold. Until a decade or so ago ‘big data’ looked just like a digital version of conventional analogue records and systems, ones whose management had become normalised through statistical and mathematical analysis. Now, however, we see a level of concern and anxiety similar to that faced in the first big data era.

This situation brings with it a socio-political dimension of interest to us, one in which our understanding of people and our actions on individuals, groups and populations are deeply implicated. The collection of social data had a purpose – understanding and controlling the population in a time of significant social change. To achieve this, new kinds of information and new methods for generating knowledge were required. Many ideas, concepts and categories developed during that first data revolution remain intact today, some uncritically accepted more now than when they were first developed. In this piece we draw out some connections between these two data ‘revolutions’ and the implications for the politics of information in contemporary society. It is clear that many of the problems in this first big data age and, more specifically, their solutions persist down to the present big data era….Our question then is how do we go about re-writing the ideological inheritance of that first data revolution? Can we or will we unpack the ideological sequelae of that past revolution during this present one? The initial indicators are not good in that there is a pervasive assumption in this broad interdisciplinary field that reductive categories are both necessary and natural. Our social ordering practices have influenced our social epistemology. We run the risk in the social sciences of perpetuating the ideological victories of the first data revolution as we progress through the second. The need for critical analysis grows apace not just with the production of each new technique or technology but with the uncritical acceptance of the concepts, categories and assumptions that emerged from that first data revolution. That first data revolution proved to be a successful anti-revolutionary response to the numerous threats to social order posed by the incredible changes of the nineteenth century, rather than the Enlightenment emancipation that was promised. (More)”

This is part of a wider series on the Politics of Data. For more on this topic, also see Mark Carrigan’s Philosophy of Data Science interview series and the Discover Society special issue on the Politics of Data (Science).