Privacy in the 21st Century: From the “Dark Ages” to “Enlightenment”?


Paper by P. Kitsos and A. Yannoukakou in the International Journal of E-Politics (IJEP): “The events of 9/11 along with the bombarding in Madrid and London forced governments to resort to new structures of privacy safeguarding and electronic surveillance under the common denominator of terrorism and transnational crime fighting. Legislation as US PATRIOT Act and EU Data Retention Directive altered fundamentally the collection, processing and sharing methods of personal data, while it granted increased powers to police and law enforcement authorities concerning their jurisdiction in obtaining and processing personal information to an excessive degree. As an aftermath of the resulted opacity and the public outcry, a shift is recorded during the last years towards a more open governance by the implementation of open data and cloud computing practices in order to enhance transparency and accountability from the side of governments, restore the trust between the State and the citizens, and amplify the citizens’ participation to the decision-making procedures. However, privacy and personal data protection are major issues in all occasions and, thus, must be safeguarded without sacrificing national security and public interest on one hand, but without crossing the thin line between protection and infringement on the other. Where this delicate balance stands, is the focal point of this paper trying to demonstrate that it is better to be cautious with open practices than hostage of clandestine practices.”

Selected Readings on Crowdsourcing Opinions and Ideas


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing was originally published in 2013.

As technological advances give individuals greater ability to share their opinions and ideas with the world, citizens are increasingly expecting government to consult with them and factor their input into the policy-making process. Moving away from the representative democracy system created in a less connected time, e-petitions; participatory budgeting (PB), a collaborative, community-based system for budget allocation; open innovation initiatives; and Liquid Democracy, a hybrid of direct and indirect democracy, are allowing citizens to make their voices heard between trips to the ballot box.


Annotated Selected Reading List (in alphabetical order)

Bergmann, Eirikur. “Reconstituting Iceland – Constitutional Reform Caught in a New Critical Order in the Wake of Crisis.” in Academia.edu, (presented at the Political Legitimacy and the Paradox of Regulation, Leiden University, 2013). http://bit.ly/1aaTVYP.
  •  This paper explores the tumultuous history of Iceland’s “Crowdsourced Constitution.” The since-abandoned document was built upon three principles: distribution of power, transparency and responsibility.
  •  Even prior to the draft being dismantled through political processes, Bergmann argues that an overenthusiastic public viewed the constitution as a stronger example of citizen participation than it really was: “Perhaps with the delusion of distance the international media was branding the production as the world’s first ‘crowdsourced’ constitution, drafted by the interested public in clear view for the world to follow…This was however never a realistic description of the drafting. Despite this extraordinary open access, the Council was not able to systematically plough through all the extensive input as [it] only had four months to complete the task.”
  • Bergmann’s paper illustrates the transition Iceland’s constitution has undertaken in recent years: moving from a paradigmatic example of crowdsourcing opinions to a demonstration of the challenges inherent in bringing more voices into a realm dominated by bureaucracy and political concerns.
Gassmann, Oliver, Ellen Enkel, and Henry Chesbrough. “The Future of Open Innovation.” R&D Management 40, no. 3 (2010): 213– 221. http://bit.ly/1bk4YeN.
  • In this paper – an introduction to a special issue on the topic – Gassmann, Enkel and Chesbrough discuss the evolving trends in open innovation. They define the concept, referencing previous work by Chesbrough et al., as “…the purposive inflows and outflows of knowledge to accelerate internal innovation, and expand the markets for external use of innovation, respectively.”
  • In addition to examining the existing literature for the field, the authors identify nine trends that they believe will define the future of open innovation for businesses, many of which can also be applied to governing institutions:
    • Industry penetration: from pioneers to mainstream
    • R&D intensity: from high to low tech
    • Size: from large firms to SMEs
    • Processes: from stage gate to probe-and-learn
    • Structure: from standalone to alliances
    • Universities: from ivory towers to knowledge brokers
    • Processes: from amateurs to professionals
    • Content: from products to services
    • Intellectual property: from protection to a tradable good
Gilman, Hollie Russon. “The Participatory Turn: Participatory Budgeting Comes to America.” Harvard University, 2012. https://bit.ly/2BhaeVv.
  •  In this dissertation, Gilman argues that participatory budgeting (PB) produces better outcomes than the status quo budget process in New York, while also transforming how those who participate understand themselves as citizens, constituents, Council members, civil society leaders and community stakeholders.
  • The dissertation also highlights challenges to participation, drawing from experience and lessons learned since PB’s inception in Porto Alegre, Brazil in 1989. While recognizing a diversity of challenges, Gilman ultimately argues that, “PB provides a viable and informative democratic innovation for strengthening civic engagement within the United States that can be streamlined and adopted to scale.”
Kasdan, Alexa, and Lindsay Cattell. “New Report on NYC Participatory Budgeting.” Practical Visionaries. Accessed October 21, 2013. https://bit.ly/2Ek8bTu.
  • This research and evaluation report is the result of surveys, in-depth interviews and observations collected at key points during the 2011 participatory budgeting (PB) process in New York City, in which “[o]ver 2,000 community members were the ones to propose capital project ideas in neighborhood assemblies and town hall meetings.”
  • The PBNYC project progressed through six main steps:
    •  First Round of Neighborhood Assemblies
    • Delegate Orientations
    • Delegate Meetings
    • Second Round of Neighborhood Assemblies
    • Voting
    • Evaluation, Implementation & Monitoring
  • The authors also discuss the varied roles and responsibilities of the diverse stakeholders involved in the process:
    • Community Stakeholders
    • Budget Delegates
    • District Committees
    • City-wide Steering Committee
    • Council Member Offices
Masser, Kai. “Participatory Budgeting as Its Critics See It.” Burgerhaushalt, April 30, 2013. http://bit.ly/1dppSxW.
  • This report is a critique of the participatory budgeting (PB) process, focusing on lessons learned from the outcomes of a pilot initiative in Germany.
  • The report focuses on three main criticisms leveled against PB:
    • Participatory budgeting can be a time-consuming process that is barely comprehensible to the people it seeks to engage; as a result, participants need both information about the budget and a strong willingness to take part in preparing it.
    • Differences in the social structure of the participants inevitably affect the outcome – the process must be designed to avoid low participation or over-representation of one group.
    • PB cannot be sustained over a prolonged period and should therefore focus on one aspect of the budgeting process. The article points to outcomes that show that citizens may find it considerably more attractive to make proposals on how to spend money than on how to save it, which may not always result in the best outcomes.
OECD. “Citizens as Partners: Information, Consultation and Public Participation in Policy-making.” The IT Law Wiki. http://bit.ly/1aIGquc.
  • This OECD policy report discusses the concept of crowdsourcing as a new form of representation and public participation in OECD countries, with the understanding that it creates avenues for citizens to participate in public policy-making within the overall framework of representative democracy.
  • The report provides a wealth of comparative information on measures adopted in OECD countries to strengthen citizens’ access to information, to enhance consultation and encourage their active participation in policy-making.

Tchorbadjiiski, Angel. “Liquid Democracy.” Rheinisch-Westfälische Technische Hochschule Aachen Informatik 4 ComSys, 2012. http://bit.ly/1eOsbIH.

  • This thesis discusses how Liquid Democracy (LD) makes it possible for citizens participating in an election to “either take part directly or delegate [their] own voting rights to a representative/expert. This way the voters are not limited to taking one decision for legislative period as opposed to indirect (representative) democracy, but are able to actively and continuously take part in the decision-making process.”
  • Tchorbadjiiski argues that, “LD provides great flexibility. You do not have to decide yourself on the program of a political party, which only suits some aspects of your opinion.” Through LD, “all voters can choose between direct and indirect democracy creating a hybrid government form suiting their own views.”
  • In addition to describing the potential benefits of Liquid Democracy, Tchorbadjiiski focuses on the challenge of maintaining privacy and security in such a system. He proposes a platform that “allows for secure and anonymous voting in such a way that it is not possible, even for the system operator, to find out the identity of a voter or to prevent certain voters (for example minority groups) from casting a ballot.”
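The delegation mechanism at the heart of these readings can be sketched in a few lines of code. The model below is a minimal illustration, not Tchorbadjiiski's design: voter names and the cycle-handling rule are invented for the example, and a real platform would add the anonymity and security guarantees his thesis focuses on.

```python
# Minimal sketch of Liquid Democracy vote resolution (illustrative only).
# Each voter either casts a ballot directly or delegates to another voter;
# delegations are followed transitively, and delegation cycles with no
# direct vote leave those ballots uncounted.

from collections import Counter

def tally(direct_votes, delegations):
    """direct_votes: voter -> option; delegations: voter -> delegate."""
    results = Counter()
    for voter in set(direct_votes) | set(delegations):
        current, seen = voter, set()
        # Follow the delegation chain until reaching someone who voted directly.
        while current in delegations and current not in direct_votes:
            if current in seen:          # cycle: this vote is lost
                current = None
                break
            seen.add(current)
            current = delegations[current]
        if current is not None and current in direct_votes:
            results[direct_votes[current]] += 1
    return results

votes = {"alice": "yes", "bob": "no"}
delegs = {"carol": "alice", "dave": "carol", "erin": "bob"}
print(tally(votes, delegs))  # Counter({'yes': 3, 'no': 2})
```

Here carol and dave's votes flow transitively to alice's "yes", illustrating the hybrid of direct and indirect participation the thesis describes.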

Index: Trust in Institutions


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on trust in institutions and was originally published in 2013.

Trust in Government

  • How many of the global public feel that their governments listen to them: 17%
  • How much of the global population trusts in institutions: almost half 
  • The number of Americans who trust institutions: less than half
  • How many people globally believe that business leaders and government officials will tell the truth when confronted with a difficult issue: Less than one-fifth
  • The average level of confidence amongst citizens in 25 OECD countries:
    • In national government: 40%, down from 45% in 2007
    • In financial institutions: 43%
    • In public services such as local police and healthcare: 72% and 71% respectively

Executive Government

  • How many Americans trust the government in Washington to do what is right “just about always or most of the time” in September 2013: 19%
  • Those who trust the “men and women … who either hold or are running for public office”: 46%
  • Number of Americans who express a great deal or fair amount of trust in:
    • Local government: 71%
    • State government: 62%
    • Federal government: 52%
  • How many Americans trust in the ability of “the American people” to make judgments about political issues facing the country:  61%, declining every year since 2009
  • Those who have trust and confidence in the federal government’s ability to handle international problems: 49%
  • Number of Americans who feel “angry” at the federal government: 3 in 10, all-time high since first surveyed in 1997

Congress

  • Percentage of Americans who say “the political system can work fine, it’s the members of Congress that are the problem” in October 2013: 58%
  • Following the government shutdown, number of Americans who stated that Congress would work better if nearly every member was replaced next year: nearly half
  • Those who think that even an entire overhaul of Congress would not make much difference: 4 in 10 
  • Those who think that “most members of Congress have good intentions, it’s the political system that is broken” in October 2013: 32%

Trust in Media

  • Global trust in media (traditional, social, hybrid, owned, online search): 57% and rising
  • The percentage of Americans who say they have “a great deal or fair amount of trust and confidence in the mass media”: 44% – the lowest level since first surveyed in 1997
  • How many Americans see the mass media as too liberal: 46%
    • As too conservative: 13%
    • As “just about right”: 37%
  • The number of Americans who see the press as fulfilling the role of political watchdog and believe press criticism of political leaders keeps them from doing things that should not be done: 68%
  • The proportion of Americans who have “only a little/not at all” level of trust in Facebook to protect privacy and personal information: three in four
    • In Google: 68%
    • In their cell phone provider: 63%

Trust in Industry

  • Global trust in business: 58%
  • How much of the global public trusts financial institutions: 50%
  • Proportion of the global public who consider themselves informed about the banking scandals: more than half
  • Of those, how many Americans report they now trust banks less: almost half
  • Number of respondents globally who say they trust tech companies to do what’s right: 77%, most trusted industry
  • Number of consumers across eight markets who were “confident” or “somewhat confident” that the tech sector can provide long-term solutions to meet the world’s toughest challenges: 76%

Sources

New Report: Federal Ideation Program: Challenges and Best Practices


New Report by Professor Gwanhoo Lee for the IBM Center for The Business of Government: “Ideation is the process of generating new ideas or solutions using crowdsourcing technologies, and it is changing the way federal government agencies innovate and solve problems. Ideation tools use online brainstorming or social voting platforms to submit new ideas, search previously submitted ideas, post questions and challenges, discuss and expand on ideas, vote them up or down and flag them.
This report examines the current status, challenges, and best practices of federal internal ide­ation programs made available exclusively to employees. Initial experiences from a variety of agencies show that these ideation tools hold great promise in engaging employees and stake­holders in problem-solving.
While ideation programs offer promising benefits, making innovation an aspect of everyone’s job is very hard to achieve. Given that these ideation tools and programs are still relatively new, agencies have not yet figured out the best practices and often do not know what to expect during the implementation process. This report seeks to fill this gap.
Based on field research and a literature review, the report describes four federal internal ideation programs, including IdeaHub (Department of Transportation), the Sounding Board (the Department of State), IdeaFactory (Department of Homeland Security), and CDC IdeaLab (Centers for Disease Control and Prevention, Department of Health and Human Services).
Four important challenges are associated with the adoption and implementation of federal internal ideation programs. These are: managing the ideation process and technology; managing cultural change; managing privacy, security and transparency; and managing use of the ideation tool.
Federal government agencies have been moving in the right direction by embracing these tools and launching ideation programs in boosting employee-driven innovation. However, many daunting challenges and issues remain to be addressed. For a federal agency to sustain its internal ideation program, it should note the following:
Recommendation One: Treat the ideation program not as a management fad but as a vehicle to reinvent the agency.
Recommendation Two: Institutionalize the ideation program.
Recommendation Three: Make the ideation team a permanent organizational unit.
Recommendation Four: Document ideas that are implemented. Quantify their impact and demonstrate the return on investment. Share the return with the employees through meaningful rewards.
Recommendation Five: Assimilate and integrate the ideation program into the mission-critical administrative processes.
Recommendation Six: Develop an easy-to-use mobile app for the ideation system.
Recommendation Seven: Keep learning from other agencies and even from commercial organizations.”
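The core mechanics the report attributes to ideation tools – submitting ideas, searching previous submissions, and voting them up or down – can be illustrated with a toy sketch. All field and function names below are assumptions for the example, not any agency's actual platform.

```python
# Toy sketch of ideation-tool mechanics (submit, search, up/down vote).
# Field names ("title", "score", "flagged") are illustrative assumptions.

ideas = []

def submit(title, author):
    ideas.append({"title": title, "author": author, "score": 0, "flagged": False})

def vote(title, delta):
    """delta is +1 (vote up) or -1 (vote down)."""
    for idea in ideas:
        if idea["title"] == title:
            idea["score"] += delta

def search(keyword):
    """Case-insensitive search over previously submitted ideas."""
    return [i["title"] for i in ideas if keyword.lower() in i["title"].lower()]

submit("Paperless travel vouchers", "employee-1")
submit("Shared telework calendar", "employee-2")
vote("Paperless travel vouchers", +1)
vote("Paperless travel vouchers", +1)
vote("Shared telework calendar", -1)
print(search("travel"))   # ['Paperless travel vouchers']
print(ideas[0]["score"])  # 2
```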

Findings from the emerging field of Transparency Research


Tiago Peixoto: “HEC Paris has just hosted the 3rd Global Conference on Transparency Research, and they have made the list of accepted papers available. …
As one goes through the papers,  it is clear that unlike most of the open government space, when it comes to research, transparency is treated less as a matter of technology and formats and more as a matter of social and political institutions.  And that is a good thing.”
This year’s papers are listed below:

Open government and conflicts with public trust and privacy: Recent research ideas


Article by John Wihbey:  “Since the Progressive Era, ideas about the benefits of government openness — crystallized by Justice Brandeis’s famous phrase about the disinfectant qualities of “sunlight” — have steadily grown more popular and prevalent. Post-Watergate reforms further embodied these ideas. Now, notions of “open government” and dramatically heightened levels of transparency have taken hold as zero-cost digital dissemination has become a reality. Many have advocated switching the “default” of government institutions so information and data are no longer available just “on demand” but rather are publicized as a matter of course in usable digital form.
As academic researchers point out, we don’t yet have a great deal of long-term, valid data for many of the experiments in this area to weigh civic outcomes and the overall advance of democracy. Anecdotally, though, it seems that more problems — from potholes to corruption — are being surfaced, enabling greater accountability. This “new fuel” of data also creates opportunities for businesses and organizations; and so-called “Big Data” projects frequently rely on large government datasets, as do “news apps.”
But are there other logical limits to open government in the digital age? If so, what are the rationales for these limits? And what are the latest academic insights in this area?
Most open-records laws, including the federal Freedom of Information Act, still provide exceptions that allow public institutions to guard information that might interfere with pending legal proceedings or jeopardize national security. In addition, the internal decision-making and deliberation processes of government agencies as well as documents related to personnel matters are frequently off limits. These exceptions remain largely untouched in the digital age (notwithstanding extralegal actions by WikiLeaks and Edward Snowden, or confidential sources who disclose things to the press). At a practical level, experts say that the functioning of FOIA laws is still uneven, and some states continue to threaten rollbacks.
Limits of transparency?
A key moment in the rethinking of openness came in 2009, when Harvard University legal scholar Lawrence Lessig published an essay in The New Republic titled “Against Transparency.” In it, Lessig — a well-known advocate for greater access to information and knowledge of many kinds — warned that transparency in and of itself could lead to diminished trust in government and must be tied to policies that can also rebuild public confidence in democratic institutions.
In recent years, more political groups have begun leveraging open records laws as a kind of tool to go after opponents, a phenomenon that has even touched the public university community, which is typically subject to disclosure laws….

Privacy and openness
If there is a tension between transparency and public trust, there is also an uneasy balance between government accountability and privacy. A 2013 paper in the American Review of Public Administration, “Public Pay Disclosure in State Government: An Ethical Analysis,” examines a standard question of disclosure faced in every state: How much should even low-level public servants be subject to personal scrutiny about their salaries? The researchers, James S. Bowman and Kelly A. Stevens of Florida State University, evaluate issues of transparency based on three competing values: rules (justice or fairness), results (what does the greatest good), and virtue (promoting integrity)…”

Mozilla Location Service: crowdsourcing data to help devices find your location without GPS


“The Mozilla Location Service is an experimental pilot project to provide geolocation lookups based on publicly observable cell tower and WiFi access point information. Currently in its early stages, it already provides basic service coverage of select locations thanks to our early adopters and contributors.
A world map showing areas with location data. Map data provided by mapbox / OpenStreetMap.
While many commercial services exist in this space, there’s currently no large public service to provide this crucial part of any mobile ecosystem. Mobile phones with a weak GPS signal and laptops without GPS hardware can use this service to quickly identify their approximate location. Even though the underlying data is based on publicly accessible signals, geolocation data is by its very nature personal and privacy sensitive. Mozilla is committed to improving the privacy aspects for all participants of this service offering.
If you want to help us build our service, you can install our dedicated Android MozStumbler and enjoy competing against others on our leaderboard or choose to contribute anonymously. The service is evolving rapidly, so expect to see a more full featured experience soon. For an overview of the current experience, you can head over to the blog of Soledad Penadés, who wrote a far better introduction than we did.
We welcome any ideas or concerns about this project and would love to hear any feedback or experience you might have. Please contact us either on our dedicated mailing list or come talk to us in our IRC room #geo on Mozilla’s IRC server.
For more information please follow the links on our project page.”
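How can a device estimate its position from publicly observable signals at all? The sketch below is not Mozilla's algorithm – it is a common baseline for this class of service: a weighted centroid over the known locations of observed Wi-Fi access points, with stronger signals weighted more. The BSSID database and the weighting rule are invented for the example.

```python
# Illustrative weighted-centroid geolocation from Wi-Fi scan results.
# NOT the Mozilla Location Service's actual method; a generic baseline.

# Hypothetical database mapping access-point BSSID -> (lat, lon)
known_aps = {
    "01:23:45:67:89:ab": (52.5200, 13.4050),
    "01:23:45:67:89:ac": (52.5205, 13.4060),
    "01:23:45:67:89:ad": (52.5195, 13.4045),
}

def estimate_position(observations):
    """observations: list of (bssid, signal_dbm) pairs from a scan."""
    total_w = lat = lon = 0.0
    for bssid, dbm in observations:
        if bssid not in known_aps:
            continue                   # unknown AP contributes nothing
        w = 1.0 / max(1.0, -dbm)       # crude weight: stronger signal, larger w
        a_lat, a_lon = known_aps[bssid]
        lat += w * a_lat
        lon += w * a_lon
        total_w += w
    if total_w == 0:
        return None                    # no usable observations
    return (lat / total_w, lon / total_w)

scan = [("01:23:45:67:89:ab", -40), ("01:23:45:67:89:ac", -70)]
print(estimate_position(scan))
```

The estimate lands between the two observed access points, pulled toward the one with the stronger (-40 dBm) signal – which is also why such databases are privacy-sensitive: whoever holds them can locate any device that reports a scan.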

When Nudges Fail: Slippery Defaults


New paper by Lauren E. Willis “Inspired by the success of “automatic enrollment” in increasing participation in defined contribution retirement savings plans, policymakers have put similar policy defaults in place in a variety of other contexts, from checking account overdraft coverage to home-mortgage escrows. Internet privacy appears poised to be the next arena. But how broadly applicable are the results obtained in the retirement savings context? Evidence from other contexts indicates two problems with this approach: the defaults put in place by the law are not always sticky, and the people who opt out may be those who would benefit the most from the default. Examining the new default for consumer checking account overdraft coverage reveals that firms can systematically undermine each of the mechanisms that might otherwise operate to make defaults sticky. Comparing the retirement-savings default to the overdraft default, four boundary conditions on the use of defaults as a policy tool are apparent: policy defaults will not be sticky when (1) motivated firms oppose them, (2) these firms have access to the consumer, (3) consumers find the decision environment confusing, and (4) consumer preferences are uncertain. Due to constitutional and institutional constraints, government regulation of the libertarian-paternalism variety is unlikely to be capable of overcoming these bounds. Therefore, policy defaults intended to protect individuals when firms have the motivation and means to move consumers out of the default are unlikely to be effective unless accompanied by substantive regulation. Moreover, the same is likely to be true of “nudges” more generally, when motivated firms oppose them.”

Selected Readings on Linked Data and the Semantic Web


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of linked data and the semantic web was originally published in 2013.

Linked Data and the Semantic Web movement seek to make our growing body of digital knowledge and information more interconnected, searchable, machine-readable and useful. First introduced by Sir Tim Berners-Lee of the W3C, Linked Data is defined by Christian Bizer, Tom Heath and Berners-Lee as “data published to the Web in such a way that it is machine-readable, its meaning is explicitly defined, it is linked to other external data sets, and can in turn be linked to from external datasets.” In other words, Linked Data and the Semantic Web seek to do for data what the Web did for documents. Additionally, the evolving capability of linking together different forms of data is fueling the potentially transformative rise of social machines – “processes in which the people do the creative work and the machine does the administration.”
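The definition above can be made concrete with a toy example: facts expressed as subject-predicate-object triples, where an object can itself be a URI pointing into another dataset. The URIs and vocabulary terms below are illustrative assumptions, not real dataset identifiers.

```python
# Minimal sketch of the Linked Data idea: a graph of triples whose objects
# can be URIs, letting software follow links across "datasets".
# (URIs and predicates here are invented for the example.)

triples = [
    ("http://data.example.gov/agency/42", "rdfs:label", "Dept. of Transport"),
    ("http://data.example.gov/agency/42", "org:basedIn",
     "http://other.example.org/city/berlin"),
    ("http://other.example.org/city/berlin", "rdfs:label", "Berlin"),
]

def objects(subject, predicate):
    """Query the tiny graph: all objects for a (subject, predicate) pair."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Because the object of org:basedIn is itself a URI, a program can follow
# the link into the "external" dataset -- the machine-readable linking
# that distinguishes a Web of data from a Web of documents.
city = objects("http://data.example.gov/agency/42", "org:basedIn")[0]
print(objects(city, "rdfs:label"))  # ['Berlin']
```

Real deployments use RDF serializations and SPARQL rather than Python lists, but the link-following step is the same in spirit.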


Annotated Selected Reading List (in alphabetical order)

Alani, Harith, David Dupplaw, John Sheridan, Kieron O’Hara, John Darlington, Nigel Shadbolt, and Carol Tullo. “Unlocking the Potential of Public Sector Information with Semantic Web Technology,” 2007. http://bit.ly/17fMbCt.

  • This paper explores the potential of using Semantic Web technology to increase the value of public sector information already in existence.
  • The authors note that, while “[g]overnments often hold very rich data and whilst much of this information is published and available for re-use by others, it is often trapped by poor data structures, locked up in legacy data formats or in fragmented databases. One of the great benefits that Semantic Web (SW) technology offers is facilitating the large scale integration and sharing of distributed data sources.”
  • They also argue that Linked Data and the Semantic Web are growing in use and visibility in other sectors, but government has been slower to adapt: “The adoption of Semantic Web technology to allow for more efficient use of data in order to add value is becoming more common where efficiency and value-added are important parameters, for example in business and science. However, in the field of government there are other parameters to be taken into account (e.g. confidentiality), and the cost-benefit analysis is more complex.” In spite of that complexity, the authors’ work “was intended to show that SW technology could be valuable in the governmental context.”

Berners-Lee, Tim, James Hendler, and Ora Lassila. “The Semantic Web.” Scientific American 284, no. 5 (2001): 28–37. http://bit.ly/Hhp9AZ.

  • In this article, Sir Tim Berners-Lee, James Hendler and Ora Lassila introduce the Semantic Web, “a new form of Web content that is meaningful to computers [and] will unleash a revolution of new possibilities.”
  • The authors argue that the evolution of linked data and the Semantic Web “lets anyone express new concepts that they invent with minimal effort. Its unifying logical language will enable these concepts to be progressively linked into a universal Web. This structure will open up the knowledge and workings of humankind to meaningful analysis by software agents, providing a new class of tools by which we can live, work and learn together.”

Bizer, Christian, Tom Heath, and Tim Berners-Lee. “Linked Data – The Story So Far.” International Journal on Semantic Web and Information Systems (IJSWIS) 5, no. 3 (2009): 1–22. http://bit.ly/HedpPO.

  • In this paper, the authors take stock of Linked Data’s challenges, potential and successes close to a decade after its introduction. They build their argument for increasingly linked data by referring to the incredible value creation of the Web: “Despite the inarguable benefits the Web provides, until recently the same principles that enabled the Web of documents to flourish have not been applied to data.”
  • The authors expect that “Linked Data will enable a significant evolutionary step in leading the Web to its full potential” if a number of research challenges can be adequately addressed, both technical, like interaction paradigms and data fusion; and non-technical, like licensing, quality and privacy.

Ding, Li, Dominic Difranzo, Sarah Magidson, Deborah L. Mcguinness, and Jim Hendler. Data-Gov Wiki: Towards Linked Government Data, n.d. http://bit.ly/1h3ATHz.

  • In this paper, the authors “investigate the role of Semantic Web technologies in converting, enhancing and using linked government data” in the context of Data-gov Wiki, a project that attempts to integrate datasets found at Data.gov into the Linking Open Data (LOD) cloud.
  • The paper features discussion and “practical strategies” based on four key issue areas: Making Government Data Linkable, Linking Government Data, Supporting the Use of Linked Government Data and Preserving Knowledge Provenance.

Kalampokis, Evangelos, Michael Hausenblas, and Konstantinos Tarabanis. “Combining Social and Government Open Data for Participatory Decision-Making.” In Electronic Participation, edited by Efthimios Tambouris, Ann Macintosh, and Hans de Bruijn, 36–47. Lecture Notes in Computer Science 6847. Springer Berlin Heidelberg, 2011. http://bit.ly/17hsj4a.

  • This paper presents a proposed data architecture for “supporting participatory decision-making based on the integration and analysis of social and government data.” The authors believe that their approach will “(i) allow decision makers to understand and predict public opinion and reaction about specific decisions; and (ii) enable citizens to inadvertently contribute in decision-making.”
  • The proposed approach, “based on the use of the linked data paradigm,” draws on subjective social data and objective government data in two phases: Data Collection and Filtering and Data Analysis. “The aim of the former phase is to narrow social data based on criteria such as the topic of the decision and the target group that is affected by the decision. The aim of the latter phase is to predict public opinion and reactions using independent variables related to both subjective social and objective government data.”
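The two-phase structure the authors describe – narrow the social data to the decision at hand, then derive an opinion signal – can be sketched as a toy pipeline. The data, the topic matching and the "analysis" below are stand-in assumptions, far simpler than the authors' linked-data architecture.

```python
# Hedged sketch of a filter-then-analyze pipeline over social data.
# Toy data and a crude mean-stance "analysis"; not the authors' system.

posts = [
    {"text": "the new bike lane is great", "topic": "bike-lane", "stance": +1},
    {"text": "remove the bike lane now",   "topic": "bike-lane", "stance": -1},
    {"text": "school lunches too costly",  "topic": "schools",   "stance": -1},
]

def collect_and_filter(posts, topic):
    """Phase 1: narrow social data to the decision under discussion."""
    return [p for p in posts if p["topic"] == topic]

def analyze(filtered):
    """Phase 2: a crude public-opinion signal -- mean stance in [-1, 1]."""
    if not filtered:
        return 0.0
    return sum(p["stance"] for p in filtered) / len(filtered)

relevant = collect_and_filter(posts, "bike-lane")
print(analyze(relevant))  # 0.0  (one supportive post, one opposed)
```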

Rady, Kaiser. Publishing the Public Sector Legal Information in the Era of the Semantic Web. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network, 2012. http://bit.ly/17fMiOp.

  • Following an EU directive calling for the release of public sector information by member states, this study examines the “uniqueness” of creating and publishing primary legal source documents on the web and highlights “the most recent technological strategy used to structure, link and publish data online (the Semantic Web).”
  • Rady argues for public sector legal information to be published as “open-linked-data in line with the new approach for the web.” He believes that if data is created and published in this form, “the data will be more independent from devices and applications and could be considered as a component of [a] big information system. That because, it will be well-structured, classified and has the ability to be used and utilized in various combinations to satisfy specific user requirements.”

Shadbolt, Nigel, Kieron O’Hara, Tim Berners-Lee, Nicholas Gibbins, Hugh Glaser, Wendy Hall, and m.c. schraefel. “Linked Open Government Data: Lessons from Data.gov.uk.” IEEE Intelligent Systems 27, no. 3 (May 2012): 16–24. http://bit.ly/1cgdH6R.

  • In this paper, the authors view Open Government Data (OGD) as an “opportunity and a challenge for the LDW [Linked Data Web]. The opportunity is to grow by linking with PSI [Public Sector Information] – real-world, useful information with good provenance. The challenge is to manage the sudden influx of heterogeneous data, often with minimal semantics and structure, tailored to highly specific task contexts.”
  • As the linking of OGD continues, the authors argue that, “Releasing OGD is not solely a technical problem, although it presents technical challenges. OGD is not a rigid government IT specification, but it demands productive dialogue between data providers, users, and developers. We should expect a ‘perpetual beta,’ in which best practice, technical development, innovative use of data, and citizen-centric politics combine to drive data-release programs.”
  • Despite challenges, the authors believe that, “Integrating OGD onto the LDW will vastly increase the scope and richness of the LDW. A reciprocal benefit is that the LDW will provide additional resources and context to enrich OGD. Here, we see the network effect in action, with resources mutually adding value to one another.”

Vitale, Michael, Anni Rowland-Campbell, Valentina Cardo, and Peter Thompson. “The Implications of Government as a ‘Social Machine’ for Making and Implementing Market-based Policy.” Intersticia, September 2013. http://bit.ly/HhMzqD.

  • This report from the Australia and New Zealand School of Government (ANZSOG) explores the concept of government as a social machine. The authors draw on the definition of a social machine proposed by Sir Nigel Shadbolt et al. – a system where “human and computational intelligence coalesce in order to achieve a given purpose” – to describe a “new approach to the relationship between citizens and government, facilitated by technological systems which are increasingly becoming intuitive, intelligent and ‘social.’”
  • The authors argue that, beyond providing more and varied data to government, the evolving concept of government as a social machine has the potential to alter power dynamics, address the growing lack of trust in public institutions, and facilitate greater public involvement in policy-making.

Big Data


Special Report on Big Data by Volta – A newsletter on Science, Technology and Society in Europe: “Locating crime spots, or the next outbreak of a contagious disease, Big Data promises benefits for society as well as business. But more means messier. Do policy-makers know how to use this scale of data-driven decision-making in an effective way for their citizens and ensure their privacy?
90% of the world’s data have been created in the last two years. Every minute, more than 100 million new emails are created, 72 hours of new video are uploaded to YouTube and Google processes more than 2 million searches. Nowadays, almost everyone walks around with a small computer in their pocket, uses the internet on a daily basis and shares photos and information with their friends, family and networks. The digital exhaust we leave behind every day contributes to an enormous amount of data produced, and at the same time leaves electronic traces that contain a great deal of personal information….
Until recently, traditional technology and analysis techniques have not been able to handle this quantity and type of data. But recent technological developments have enabled us to collect, store and process data in new ways. There seem to be no limitations, either to the volume of data or to the technology for storing and analyzing them. Big Data can map a driver’s sitting position to identify a car thief, it can use Google searches to predict outbreaks of the H1N1 flu virus, it can data-mine Twitter to predict the price of rice or use mobile phone top-ups to describe unemployment in Asia.
The word ‘data’ means ‘given’ in Latin. It commonly refers to a description of something that can be recorded and analyzed. While there is no clear definition of the concept of ‘Big Data’, it usually refers to the processing of huge amounts and new types of data that have not been possible with traditional tools.

‘The new development is not necessarily that there are so much more data. It’s rather that data is available to us in a new way.’

The notion of Big Data is kind of misleading, argues Robindra Prabhu, a project manager at the Norwegian Board of Technology. “The new development is not necessarily that there are so much more data. It’s rather that data is available to us in a new way. The digitalization of society gives us access to both ‘traditional’, structured data – like the content of a database or register – and unstructured data, for example the content in a text, pictures and videos. Information designed to be read by humans is now also readable by machines. And this development makes a whole new world of data gathering and analysis available. Big Data is exciting not just because of the amount and variety of data out there, but that we can process data about so much more than before.”