CRS Reports & Analysis: “Since the rise of social media over the past decade, new platforms of technology have reinforced the adage that the law lags behind developments in technology. Government agencies, officials, and employees regularly use a number of social media options – e.g., Twitter, Facebook, etc. – that have led agencies to update existing ethics rules to reflect the unique issues that they may present. Two areas of ethics regulation affected by the increased role of social media are the ethical standards governing gifts to federal employees and the restrictions on employees’ political activities. These rules apply to employees in the executive branch, though separate ethics rules and guidance on similar topics apply to the House and Senate….(More)”
Fairness in Machine Learning
Presentation by Delip Rao: “…The models you create have the power to get people arrested or vindicated, get loans approved or rejected, determine what interest rate should be charged for such loans, who should be shown to you in your long list of pursuits on Tinder, what news you read, who gets called for a job phone screen or even a college admission… the list goes on.
So what can you do about it?…
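One concrete first step, sketched below purely as an illustration (it is not taken from Rao's presentation), is to audit a model's decisions for disparities across groups. Demographic parity is only one of several possible fairness criteria, and the decisions and group labels used here are assumptions made for the example.

```python
# Illustrative sketch: measure how a model's positive decision rate differs across groups,
# e.g. for a hypothetical loan-approval classifier. A large gap is a signal to audit further,
# not a complete fairness assessment.
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Return (largest difference in approval rates between groups, per-group rates).

    predictions: iterable of 0/1 model decisions (1 = approved)
    groups:      iterable of group labels aligned with predictions
    """
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        approvals[group] += pred
    rates = {g: approvals[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Made-up decisions for two hypothetical groups:
gap, rates = demographic_parity_gap(
    [1, 0, 1, 1, 0, 0, 1, 0],
    ["A", "A", "A", "A", "B", "B", "B", "B"],
)
print(rates, gap)  # {'A': 0.75, 'B': 0.25} 0.5 -- a gap this large warrants investigation
```

In practice a check like this is only a starting point, since equalizing approval rates can conflict with other fairness definitions and with the context in which the model is deployed.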
Data protection laws around the world
Fifth edition Handbook by DLA Piper’s Data Protection and Privacy practice: “More than ever it is crucial that organisations manage and safeguard personal information and address their risks and legal responsibilities in relation to processing personal data, given the growing thicket of applicable data protection legislation.
A well-constructed and comprehensive compliance program can help reconcile these competing demands and is an important risk-management tool.
This handbook sets out an overview of the key privacy and data protection laws and regulations across nearly 100 different jurisdictions and offers a primer to businesses as they consider this complex and increasingly important area of compliance….(More)”
A ‘design-thinking’ approach to governing the future
Bronwyn van der Merwe at The Hill: “…Government organizations are starting to realize the benefits of digital transformation to reinvent the citizen experience in the form of digital services tailored to individual needs. However, public service leaders are finding that as they move further into the digital age, they need to re-orient their internal organizations around this paradigm shift, or their investments in digital are likely to fail. This is where Design Thinking comes into play.
Design Thinking has become a proven approach to reimagining complex service or organizational issues in the private sector. This approach of user research, rapid prototyping, constant feedback and experimentation is starting to take hold in leading businesses such as Citrix Systems, eBay and Google, and is slowly spilling over into government bodies.
Challenges to Adopting a Design-Led Approach
Success in implementing Design Thinking depends on disrupting embedded organizational beliefs and practices: shifting culture, changing attitudes toward risk and failure, and encouraging openness and collaboration. Specifically, government bodies need to consider:
- Top to bottom support – any change as wide-ranging as the shift to Design Thinking requires support from the top. Those at the top of design-led organizations need to be experimenters, improvisers and networkers who lead by example and set the tone for change on the ground.
- Design skills gap – talent to execute innovation is in short supply and few governments are in a financial position to outbid private sector firms on pay. But the public sector does have something to offer that private companies most often do not: the ability to do meaningful work for the public good. Public sector bodies also need to upskill their current employees – at times partnering with outside design experts.
- No risk, no reward – for government agencies, it can be challenging to embrace a culture of trial and error. But Design Thinking is useless without Design Doing. Agencies need to recognize the benefits of agile prototyping, iterating and optimizing processes, and that failing early costs little while potentially saving millions later.
What Can Government Bodies Do to Change?
Digital has paved the way for governments and the private sector to occasionally partner to solve thorny challenges. For instance, the White House brought together the U.N. Refugee Agency and crowdfunding platform Kickstarter to raise money for the Syrian relief effort. The weeklong partnership raised nearly $1.8 million for more than 7,000 people in need.
But to effectively communicate with today’s digitally-enabled citizens, there are several key principles government bodies must follow:
- Plain and simple – use simple language focused on content, structure, navigation, grouping and completion. Strip away bureaucratic government-speak and be transparent.
- Take an outside-in design approach – by considering the entire ecosystem and using research to uncover insights, service design reveals an outside-in view of the people within it.
- Be sensitive – too many government services, tools and processes are opaque and cumbersome when dealing with sensitive issues, such as immigration, filing a tax return, or adopting a child. Fjord recently brought a human-centered design framework to the State of Michigan, designing a system that allowed caseworkers to convey the fairness of a child support order while delivering excellent customer service and increasing transparency and accuracy for families in the midst of an emotionally charged separation.
- Work to digitize processes and services across departments – Governments should look to organize their digital services around the needs of the people – whether they are starting a business, retiring or having a child – rather than around their own departmental structures.
- Address privacy concerns – The assurance of privacy and security is a critical step to encourage adoption of digital channels….(More)”
Open Data and Beyond
Paper by Frederika Welle Donker, Bastiaan van Loenen and Arnold K. Bregt: “In recent years, there has been an increasing trend of releasing public sector information as open data. Governments worldwide see the potential benefits of opening up their data: greater transparency, increased governmental efficiency and effectiveness, and external societal and economic benefits. The private sector also recognizes potential benefits of making their datasets available as open data. One such company is Liander, an energy network administrator in the Netherlands. Liander views open data as a contributing factor to energy conservation. However, to date there has been little research done into the actual effects of open data. This research has developed a monitoring framework to assess the effects of open data, and has applied the framework to Liander’s small-scale energy consumption dataset….(More)“
OpenTrials: towards a collaborative open database of all available information on all clinical trials
Paper by Ben Goldacre and Jonathan Gray at BioMed Central: “OpenTrials is a collaborative and open database for all available structured data and documents on all clinical trials, threaded together by individual trial. With a versatile and expandable data schema, it is initially designed to host and match the following documents and data for each trial: registry entries; links, abstracts, or texts of academic journal papers; portions of regulatory documents describing individual trials; structured data on methods and results extracted by systematic reviewers or other researchers; clinical study reports; and additional documents such as blank consent forms, blank case report forms, and protocols. The intention is to create an open, freely re-usable index of all such information and to increase discoverability, facilitate research, identify inconsistent data, enable audits on the availability and completeness of this information, support advocacy for better data and drive up standards around open data in evidence-based medicine. The project has phase I funding. This will allow us to create a practical data schema and populate the database initially through web-scraping, basic record linkage techniques, crowd-sourced curation around selected drug areas, and import of existing sources of structured data and documents. It will also allow us to create user-friendly web interfaces onto the data and conduct user engagement workshops to optimise the database and interface designs. Where other projects have set out to manually and perfectly curate a narrow range of information on a smaller number of trials, we aim to use a broader range of techniques and attempt to match a very large quantity of information on all trials. We are currently seeking feedback and additional sources of structured data….(More)”
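The “threading together by individual trial” idea can be pictured with a minimal sketch. The record layout, field names, and exact-match key below are assumptions made for illustration only; the actual OpenTrials schema is richer, and its record linkage handles far messier matches than a simple identifier join.

```python
# Simplified sketch: group heterogeneous trial-related documents by a shared trial
# identifier (e.g. a registry ID) so that all records for one trial sit together.
from collections import defaultdict

# Hypothetical documents; field names are illustrative, not the OpenTrials schema.
documents = [
    {"type": "registry_entry", "trial_id": "NCT00000001", "title": "Trial of drug X"},
    {"type": "journal_paper", "trial_id": "NCT00000001", "doi": "10.1000/xyz123"},
    {"type": "clinical_study_report", "trial_id": "NCT00000002", "pages": 412},
]

def thread_by_trial(docs):
    """Index documents by trial identifier, one thread per trial."""
    threads = defaultdict(list)
    for doc in docs:
        threads[doc["trial_id"]].append(doc)
    return dict(threads)

for trial_id, docs in thread_by_trial(documents).items():
    print(trial_id, [d["type"] for d in docs])
# NCT00000001 ['registry_entry', 'journal_paper']
# NCT00000002 ['clinical_study_report']
```

In practice, many documents lack a clean registry ID, which is why the project combines exact joins like this with fuzzier record linkage and crowd-sourced curation.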
How Big Data Harms Poor Communities
Kaveh Waddell in the Atlantic: “Big data can help solve problems that are too big for one person to wrap their head around. It’s helped businesses cut costs, cities plan new developments, intelligence agencies discover connections between terrorists, health officials predict outbreaks, and police forces get ahead of crime. Decision-makers are increasingly told to “listen to the data,” and make choices informed by the outputs of complex algorithms.
But when the data is about humans—especially those who lack a strong voice—those algorithms can become oppressive rather than liberating. For many poor people in the U.S., the data that’s gathered about them at every turn can obstruct attempts to escape poverty.
Low-income communities are among the most surveilled communities in America. And it’s not just the police that are watching, says Michele Gilman, a law professor at the University of Baltimore and a former civil-rights attorney at the Department of Justice. Public-benefits programs, child-welfare systems, and monitoring programs for domestic-abuse offenders all gather large amounts of data on their users, who are disproportionately poor.
In certain places, in order to qualify for public benefits like food stamps, applicants have to undergo fingerprinting and drug testing. Once people start receiving the benefits, officials regularly monitor them to see how they spend the money, and sometimes check in on them in their homes.
Data gathered from those sources can end up feeding back into police systems, leading to a cycle of surveillance. “It becomes part of these big-data information flows that most people aren’t aware they’re captured in, but that can have really concrete impacts on opportunities,” Gilman says.
Once an arrest crops up on a person’s record, for example, it becomes much more difficult for that person to find a job, secure a loan, or rent a home. And that’s not necessarily because loan officers or hiring managers pass over applicants with arrest records—computer systems that whittle down tall stacks of resumes or loan applications will often weed some out based on run-ins with the police.
When big-data systems make predictions that cut people off from meaningful opportunities like these, they can violate the legal principle of presumed innocence, according to Ian Kerr, a professor and researcher of ethics, law, and technology at the University of Ottawa.
Outside the court system, “innocent until proven guilty” is upheld by people’s due-process rights, Kerr says: “A right to be heard, a right to participate in one’s hearing, a right to know what information is collected about me, and a right to challenge that information.” But when opaque data-driven decision-making takes over—what Kerr calls “algorithmic justice”—some of those rights begin to erode….(More)”
Innovation and Its Enemies: Why People Resist New Technologies
Book by Calestous Juma: “The rise of artificial intelligence has rekindled a long-standing debate regarding the impact of technology on employment. This is just one of many areas where exponential advances in technology signal both hope and fear, leading to public controversy. This book shows that many debates over new technologies are framed in the context of risks to moral values, human health, and environmental safety. But it argues that behind these legitimate concerns often lie deeper, but unacknowledged, socioeconomic considerations. Technological tensions are often heightened by perceptions that the benefits of new technologies will accrue only to small sections of society while the risks will be more widely distributed. Similarly, innovations that threaten to alter cultural identities tend to generate intense social concern. As such, societies that exhibit great economic and political inequities are likely to experience heightened technological controversies.
Drawing from nearly 600 years of technology history, Innovation and Its Enemies identifies the tension between the need for innovation and the pressure to maintain continuity, social order, and stability as one of today’s biggest policy challenges. It reveals the extent to which modern technological controversies grow out of distrust in public and private institutions. Using detailed case studies of coffee, the printing press, margarine, farm mechanization, electricity, mechanical refrigeration, recorded music, transgenic crops, and transgenic animals, it shows how new technologies emerge, take root, and create new institutional ecologies that favor their establishment in the marketplace. The book uses these lessons from history to contextualize contemporary debates surrounding technologies such as artificial intelligence, online learning, 3D printing, gene editing, robotics, drones, and renewable energy. It ultimately makes the case for shifting greater responsibility to public leaders to work with scientists, engineers, and entrepreneurs to manage technological change, make associated institutional adjustments, and expand public engagement on scientific and technological matters….(More)”
Big Data in the Public Sector
Chapter by Ricard Munné in New Horizons for a Data-Driven Economy: “The public sector is becoming increasingly aware of the potential value to be gained from big data, as governments generate and collect vast quantities of data through their everyday activities.
The benefits of big data in the public sector, enabled by advanced analytics and automated algorithms, can be grouped into three major areas based on a classification of the types of benefits: improvements in effectiveness, providing greater internal transparency; improvements in efficiency, where better services can be provided through personalization; and learning from the performance of such services.
The chapter examines several drivers and constraints that, depending on how they are addressed, can accelerate or stall the development of big data in the sector. After analysing the requirements and the technologies currently available, the findings show that open research questions remain to be addressed before competitive and effective solutions can be built. The main developments required are in the scalability of data analysis, pattern discovery, and real-time applications, together with improvements in provenance for the sharing and integration of public sector data. It is also extremely important to provide integrated security and privacy mechanisms in big data applications, as the public sector collects vast amounts of sensitive data. Finally, respecting the privacy of citizens is a mandatory obligation in the European Union….(More)”
Data and Humanitarian Response
The GovLab: “As part of an ongoing effort to build a knowledge base for the field of opening governance by organizing and disseminating its learnings, the GovLab Selected Readings series provides an annotated and curated collection of recommended works on key opening governance topics. In this edition, we explore the literature on Data and Humanitarian Response. To suggest additional readings on this or any other topic, please email [email protected]. All our Selected Readings can be found here.
Context
Data, when used well in a trusted manner, allows humanitarian organizations to innovate how they respond to emergency events, including better coordination of post-disaster relief efforts, the ability to harness local knowledge to create more targeted relief strategies, and tools to predict and monitor disasters in real time. Consequently, in recent years both multinational groups and community-based advocates have begun to integrate data collection and evaluation strategies into their humanitarian operations in order to respond to emergencies better and more quickly. However, this movement poses a number of challenges. Compared to the private sector, humanitarian organizations are often less equipped to analyze and manage big data successfully, which poses a number of risks related to the security of victims’ data. Furthermore, the complex power dynamics that exist within humanitarian spaces may be further exacerbated by the introduction of new technologies and big data collection mechanisms. Below we share:
- Selected Reading List (summaries and hyperlinks)
- Annotated Selected Reading List
- Additional Readings….(More)”