Paper by Mara Maretti, Vanessa Russo & Emiliano del Gobbo: “The expression ‘open data’ refers to a system of informative, freely accessible databases that public administrations make available online in order to develop an information network between institutions, enterprises and citizens. On this topic, using the semantic network analysis method, the research investigates the communication structure and the governance of open data in the Twitter conversational environment. In particular, the research questions are: (1) Who are the main actors in the Italian open data infrastructure? (2) What are the main conversation topics online? (3) What are the pros and cons of the development and use (reuse) of open data in Italy? To answer these questions, we went through three research phases: (1) by analysing the communication network, we identified the main influencers; (2) having identified the main actors, we analysed the online content in the Twittersphere to detect the semantic areas; (3) then, through an online focus group with the main open data influencers, we explored the characteristics of Italian open data governance. The research shows that: (1) there is an Italian open data governance strategy; (2) the Italian civic hacker community plays an important role as an influencer; but (3) there are weaknesses in governance and in practical reuse….(More)”.
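As a purely illustrative aside, the first research phase, identifying influencers in a Twitter conversation network, is typically approached by building a mention or retweet graph and ranking accounts by centrality. The sketch below is not the authors' actual pipeline; the account names and mention records are hypothetical.

```python
# Illustrative sketch: rank accounts in a Twitter mention network by centrality,
# the kind of step used to identify influencers. All data below is hypothetical.
import networkx as nx

mentions = [  # hypothetical (author, mentioned_account) pairs extracted from tweets
    ("civic_hacker_it", "dati_gov_it"),
    ("open_data_fan", "dati_gov_it"),
    ("dati_gov_it", "civic_hacker_it"),
    ("journalist_x", "civic_hacker_it"),
    ("open_data_fan", "civic_hacker_it"),
]

G = nx.DiGraph()
for author, mentioned in mentions:
    # An edge author -> mentioned means "author mentions (or retweets) mentioned".
    if G.has_edge(author, mentioned):
        G[author][mentioned]["weight"] += 1
    else:
        G.add_edge(author, mentioned, weight=1)

# Accounts that are mentioned often score high on in-degree centrality;
# PageRank (weighted by mention counts) is another common influencer ranking.
in_degree = nx.in_degree_centrality(G)
pagerank = nx.pagerank(G, weight="weight")

for account in sorted(pagerank, key=pagerank.get, reverse=True):
    print(f"{account}: in-degree={in_degree[account]:.2f}, pagerank={pagerank[account]:.2f}")
```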
Introducing the Institute of Impossible Ideas
Blog by Dominic Campbell: “…We have an opportunity ahead of us to set up a new model which seeds and keeps innovation firmly in the public realm. Using entrepreneurial approaches, we can work together not only to deliver better outcomes for citizens at lower cost, but also to ideate, create and build technology-driven, sustainable services that remain in public hands.
Rebooting public services for the 21st century
Conventional wisdom is that the private sector is best placed to drive radical change with its ecosystem of funders, appetite for risk and perceived ability to attract the best and brightest minds. In the private sector, digital companies have disrupted whole industries. Tech startups are usurping the incumbents, improving experiences and reducing costs before expanding and completely transforming the landscape around them.
We’re talking about the likes of Netflix, which started with a new model for movie rentals, turned streaming platform for TV, and is now one of the world’s largest producers of media. Or Airbnb, which got its start renting out a spare room and an air mattress, became one of the largest travel booking platforms, and is now moving into building physical hotels and housing. These are two organisations that saw an opportunity in a market and have gone on to reinvent a full-stack service.
The entrepreneurial approach has driven rapid innovation in some fields, but private sector outsourcing for the public realm has rarely led to truly radical innovation. That doesn’t stop the practice, and profits remain in private hands. Old models of innovation, either internal and incremental or left to the private sector, aren’t working.
The public sector can, and does, drive innovation. And yet, we continue to see private profits take off from the runway of publicly funded innovation, the state receiving little of the financial reward for the private sector’s increased role in public service delivery….(More)…Find out more about the Institute of Impossible Ideas.
Demystifying the Role of Data Interoperability in the Access and Sharing Debate
Paper by Jörg Hoffmann and Begoña Gonzalez Otero: “In the current data access and sharing debate, data interoperability is widely proclaimed as being key for efficiently reaping the economic welfare-enhancing effects of further data re-use. Although we agree, we found that the current law and policy framework pertaining to data interoperability was missing a groundwork analysis. Without a clear understanding of the notions of interoperability, of the role of data standards and application programming interfaces (APIs) in achieving it, and of the IP and trade secrets protection potentially hindering it, any regulatory analysis within the data access discussion will be incomplete. Any attempt at untangling the role of data interoperability in the access and sharing regimes requires a thorough understanding of the underlying technology and a common understanding of the different notions of data interoperability.
The paper first explains the technical complexity of interoperability and its enablers, namely data standards and application programming interfaces (APIs). It elaborates on why data interoperability comes in different degrees, and emphasises that data interoperability is indirectly tied to the data access right. Since data interoperability may be part of the legal obligations correlating to the access right, the scope of interoperability is, and has already been, subject to interpretation by the courts. While this may leave some room for manoeuvre for balanced decision-making, it may not guarantee the ambition of efficient re-usability of data. This is why data governance market regulation under a public law approach is becoming more favourable. Yet, and this is elaborated in a second step, the paper builds on the assumption that interoperability should not become another policy goal on its own. This is followed by a competition economics assessment, which takes into account that data interoperability is always a matter of degree and that a lack of data interoperability does not necessarily foreclose competitors from the market or harm consumer welfare. Additionally, parts of APIs may be protected under IP rights and trade secrets, which might conflict with data access rights. Instead of trying to resolve these conflicts within the respective exclusive-rights regimes, the paper concludes by suggesting that (sector-specific) data governance solutions should deal with this issue and align the different interests involved. This may provide better, practical and well-balanced solutions instead of impractical and dysfunctional exceptions and limitations within the IP and trade secrets regimes….(More)”.
The Expertise Curse: How Policy Expertise Can Hinder Responsiveness
Report by Miguel Pereira and Patrik Öhberg: “We argue that policy expertise may constrain the ability of politicians to be responsive. Legislators with more knowledge and experience in a given policy area have more confidence in their own issue-specific positions. Enhanced confidence, in turn, may lead legislators to discount opinions they disagree with. Two experiments with Swedish politicians support our argument. First, we find that officials with more expertise in a given domain are more likely to dismiss appeals from voters who hold contrasting opinions, regardless of their specific position on the policy, and less likely to accept that opposing views may represent the majority opinion. Consistent with the proposed mechanism, in a second experiment we show that inducing perceptions of expertise increases self-confidence. The results suggest that representatives with more expertise in a given area are paradoxically less capable of voicing public preferences in that domain. The study provides a novel explanation for distortions in policy responsiveness….(More)”
If data is 21st century oil, could foundations be the right owners?
Felix Oldenburg at Alliance: “What are the best investments for a foundation? This important question is one many foundation professionals are revisiting in light of low interest rates, high market volatility, and fears of deep economic trouble ahead. While stories of success certainly exist and are worth learning from, even the notorious lack of data cannot obscure the inconvenient truth that the idea of traditional endowments is in trouble.
I would argue that in order to unleash the potential of foundations, we should turn the question around, perhaps back on its feet: For which assets are foundations the best owners?
In the still-dawning digital age, one fascinating answer may stare you right in the face as you read this. How much is your personal data worth? Your social media information and your search and purchase history are the source of much of the market value of the fastest-growing sector of our time. A rough estimate of the market valuation of the major social platforms divided by their active users arrives at more than $1,000 USD per user, not differentiating by location or other factors. This sum is more than the median per capita wealth in about half the world’s countries. And if the trend continues, this value may continue to grow, and with it the big question of how to put one of the most valuable resources of our time to use for the good of all.
Foundation ownership in the data sector may sound like a wild idea at first. Yet foundations and their predecessors have played the role of purpose-driven owners of critical assets and infrastructures throughout history. Monasteries (called ‘Stifte’ in German, the root of the German word for foundations) have protected knowledge and education in libraries, and secured health care in hospitals. Trusts created much of the affordable social housing in the exploding cities of the 19th century. The German Marshall Plan created an endowment for economic recovery that is still in existence today.
The proposition is simple: Independent ownership for the good of all, beyond the commercial or national interests of individual corporations or governments, in perpetuity. Acting as guardians of digital commons, data-endowed foundations could negotiate conditions for the commercial use of their assets, and invest the income to create equal digital opportunities, power 21st century education, and fight climate change. An ideal model of ownership would also include a form of governance exercised by the users themselves through digital participation and elections. A foundation really only relies on one thing: a stable frame of rights in its legal home country. This is far from a trivial condition, but again history shows how many foundations have survived depressions, wars, and revolutions….(More)”
UK passport photo checker shows bias against dark-skinned women
Maryam Ahmed at BBC News: “Women with darker skin are more than twice as likely as lighter-skinned men to be told their photos fail UK passport rules when they submit them online, according to a BBC investigation.
One black student said she was wrongly told her mouth looked open for each of the five different photos she uploaded to the government website.
This shows how “systemic racism” can spread, Elaine Owusu said.
The Home Office said the tool helped users get their passports more quickly.
“The indicative check [helps] our customers to submit a photo that is right the first time,” said a spokeswoman.
“Over nine million people have used this service and our systems are improving.
“We will continue to develop and evaluate our systems with the objective of making applying for a passport as simple as possible for all.”
Skin colour
The passport application website uses an automated check to detect poor quality photos which do not meet Home Office rules. These include having a neutral expression, a closed mouth and looking straight at the camera.
BBC research found this check to be less accurate on darker-skinned people.
More than 1,000 photographs of politicians from across the world were fed into the online checker.
The results indicated:
- Dark-skinned women are told their photos are poor quality 22% of the time, while the figure for light-skinned women is 14%
- Dark-skinned men are told their photos are poor quality 15% of the time, while the figure for light-skinned men is 9%
Photos of women with the darkest skin were four times more likely to be graded poor quality than photos of women with the lightest skin….(More)”.
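The comparison the BBC describes amounts to computing rejection rates per demographic group and contrasting them. As a purely illustrative sketch (not the BBC's actual code, and with entirely hypothetical records), it could look like this:

```python
# Illustrative sketch: group-wise "poor quality" rates from labelled checker
# results. All records below are hypothetical, not BBC data.
from collections import defaultdict

# Each record: (skin_tone, gender, flagged_poor_quality)
results = [
    ("dark", "woman", True), ("dark", "woman", False), ("dark", "woman", True),
    ("light", "woman", False), ("light", "woman", True), ("light", "woman", False),
    ("dark", "man", True), ("dark", "man", False), ("dark", "man", False),
    ("light", "man", False), ("light", "man", False), ("light", "man", True),
]

totals = defaultdict(int)
flagged = defaultdict(int)
for skin_tone, gender, poor_quality in results:
    group = (skin_tone, gender)
    totals[group] += 1
    if poor_quality:
        flagged[group] += 1

for group in sorted(totals):
    rate = flagged[group] / totals[group]
    print(f"{group[0]}-skinned {group[1]}: flagged poor quality {rate:.0%} of the time")
```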
How Not to Kill People With Spreadsheets
David Gerard at Foreign Policy: “The U.K.’s response to COVID-19 is widely regarded as scattershot and haphazard. So how did they get here?
Excel is a top-of-the-line spreadsheet tool. A spreadsheet is good for quickly modeling a problem—but too often, organizations cut corners and press the cardboard-and-string mock-up into production, instead of building a robust and unique system based on the Excel proof of concept.
Excel is almost universally misused for complex data processing, as in this case, because it’s already present on your work computer and you don’t have to spend months procuring new software. So almost every business has at least one critical process that relies on a years-old spreadsheet, set up by staff who have since left, that nobody still at the company understands.
That’s how the U.K. went wrong. An automated process at Public Health England (PHE) transformed the incoming private laboratory test data (which was in text-based CSV files) into Excel-format files, to pass to the Serco Test and Trace teams’ dashboards.
Unfortunately, the process produced XLS files, the outdated Excel 97-2003 binary format, which has a limit of 65,536 rows rather than the roughly one-million-row limit of the more recent XLSX format. With several lines of data per patient, this meant a sheet could only hold 1,400 cases. Further cases just fell off the end.
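A guard as simple as counting rows before export would have turned this silent truncation into a loud failure. Below is a minimal, hypothetical sketch (in Python, not PHE's actual pipeline; the file name is made up) of that kind of check:

```python
# Minimal illustrative sketch, not PHE's actual code: refuse to export CSV data
# to a legacy XLS worksheet if rows would silently fall off the end.
import csv

XLS_MAX_ROWS = 65_536       # legacy .xls worksheet row limit
XLSX_MAX_ROWS = 1_048_576   # modern .xlsx worksheet row limit, for comparison

def check_fits_in_xls(csv_path: str) -> int:
    """Return the row count, or raise if it exceeds the legacy XLS limit."""
    with open(csv_path, newline="") as f:
        row_count = sum(1 for _ in csv.reader(f))
    if row_count > XLS_MAX_ROWS:
        raise ValueError(
            f"{csv_path} has {row_count} rows; a legacy XLS sheet holds only "
            f"{XLS_MAX_ROWS}, so {row_count - XLS_MAX_ROWS} rows would be lost. "
            "Export to XLSX or CSV, or split the data, instead."
        )
    return row_count

if __name__ == "__main__":
    check_fits_in_xls("lab_results.csv")  # hypothetical input file
```

The point is not this specific check but that silent data loss becomes an explicit error someone has to deal with before the figures reach a dashboard.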
Technicians at PHE monitoring the dashboards noticed on Oct. 2 that not all data that had been sent in was making it out the other end. The data was corrected the next day, and PHE announced the issue the day after.
It’s not clear if the software at PHE was an Excel spreadsheet or an in-house program using the XLS format for data interchange—the latter would explain why PHE stated that replacing it might take months—but the XLS format would have been used on the assumption that Excel was universal.
And even then, a system based on Excel-format files would have been an improvement over earlier systems—the system for keeping a count of COVID-19 cases in the U.K. was, as of May, still based on data handwritten on cards….
The process that went wrong was a workaround for a contract issue: The government’s contract with Deloitte to run the testing explicitly stipulated that the company did not have to report “Pillar 2” (general public testing) positive cases to PHE at all.
Since a test-and-trace system is not possible without this data, PHE set up feeds for the data anyway, as CSV text files directly from the testing labs. The data was then put into this system—the single system that serves as the bridge between testing and tracing, for all of England. PHE had to put in place technological duct tape to make a system of life-or-death importance work at all….
The Brookings Institution report Doomed: Challenges and solutions to government IT projects lists factors to consider when outsourcing government information technology. The outsourcing of tracking and tracing is an example where the government assumed all of the risk and the contractor all of the profit. PHE did one thing that you should never do: It outsourced a core function. Running a call center or the office canteen? You can outsource it. Tracing a pandemic? You must run it in-house.
If you need outside expertise for a core function, use contractors working within a department. Competing with the private sector on pay can be an issue, but a meaningful project can be a powerful incentive….(More)”.
Scotland’s future vision discussed today in first Citizens’ Assembly
Article by Richard Mason: “The group of 100 broadly representative Scots have been meeting throughout the year to discuss some of the country’s major constitutional issues.
Members have been asked to consider three questions, the first of which is: “What kind of country are we seeking to build?”
The assembly will meet online to develop the vision, having examined issues such as finances and taxation, and discussed how decisions are taken for and about Scotland. A report of the meeting will be published on October 9.
The other two parts of the Assembly’s remit – how to best overcome the challenges the country faces, including Brexit, and how to empower people to make “informed choices” about Scotland’s future – will be addressed in a final report by the end of the year.
Assembly convener Kate Wimpress said: “The meeting this weekend will see a group of people from all walks of life across Scotland come together to agree a shared vision of our country’s future.
“The Citizens’ Assembly’s vision for Scotland will help give a roadmap for the country at an uncertain and difficult time.
“Our members have worked hard together across the months, and it’s exciting to witness their efforts now coming to fruition.”
First Minister Nicola Sturgeon announced the creation of the Citizens’ Assembly and outlined its remit, but she stressed it would be independent from Government following criticism it was set up to garner independence support.
Constitution Secretary Michael Russell said the Scottish Government is spending £1.37 million to fund six assembly meetings, which were held in person before moving online following the coronavirus lockdown….(More)”
Transparency and Secrecy in European Democracies: Contested Trade-offs
Book edited by Dorota Mokrosinska: This edited volume offers a critical discussion of the trade-offs between transparency and secrecy in the actual political practice of democratic states in Europe. As such, it responds to a growing need to systematically analyse the problem of secrecy in governance in this political and geographical context.
Focusing on topical cases and controversies in particular areas, the contributors reflect on the justification and limits of the use of secrecy in democratic governance, register the social, cultural, and historical factors that inform this process and explore the criteria used by European legislators and policy-makers, both at the national and supranational level, when balancing interests on the sides of transparency and secrecy, respectively.
This book will be of key interest to scholars and students of security studies, political science, European politics/studies, law, history, political philosophy, public administration, intelligence studies, media and communication studies, and information technology sciences….(More)”.
How to fix the GDPR’s frustration of global biomedical research
Jasper Bovenberg, David Peloquin, Barbara Bierer, Mark Barnes, and Bartha Maria Knoppers at Science: “Since the advent of the European Union (EU) General Data Protection Regulation (GDPR) in 2018, the biomedical research community has struggled to share data with colleagues and consortia outside the EU, as the GDPR limits international transfers of personal data. A July 2020 ruling of the Court of Justice of the European Union (CJEU) reinforced obstacles to sharing, and even data transfer to enable essential research into coronavirus disease 2019 (COVID-19) has been restricted in a recent Guidance of the European Data Protection Board (EDPB). We acknowledge the valid concerns that gave rise to the GDPR, but we are concerned that the GDPR’s limitations on data transfers will hamper science globally in general and biomedical science in particular (see the text box) (1)—even though one stated objective of the GDPR is that processing of personal data should serve humankind, and even though the GDPR explicitly acknowledges that the right to the protection of personal data is not absolute and must be considered in relation to its function in society and be balanced against other fundamental rights. We examine whether there is room under the GDPR for EU biomedical researchers to share data from the EU with the rest of the world to facilitate biomedical research. We then propose solutions for consideration by either the EU legislature, the EU Commission, or the EDPB in its planned Guidance on the processing of health data for scientific research. Finally, we urge the EDPB to revisit its recent Guidance on COVID-19 research….(More)“.