The necessity of judgment


Essay by Jeff Malpas in AI and Society: “In 2016, the Australian Government launched an automated debt recovery system through Centrelink—its Department of Human Services. The system, which came to be known as ‘Robodebt’, matched the tax records of welfare recipients with their declared incomes as held by the Department and then sent out debt notices to recipients demanding payment. The entire system was computerized, and many of those receiving debt notices complained that the demands for repayment they received were false or inaccurate as well as unreasonable—all the more so given that those being targeted were, almost by definition, those in already vulnerable circumstances. The system provoked enormous public outrage, was subjected to successful legal challenge, and after being declared unlawful, the Government paid back all of the payments that had been received, and eventually, after much prompting, issued an apology.

The Robodebt affair is characteristic of a more general tendency to shift to systems of automated decision-making across both the public and the private sector and to do so even when those systems are flawed and known to be so. On the face of it, this shift is driven by the belief that automated systems have the capacity to deliver greater efficiencies and economies—in the Robodebt case, to reduce costs by recouping and reducing social welfare payments. In fact, the shift is characteristic of a particular alliance between digital technology and a certain form of contemporary bureaucratised capitalism. In the case of the automated systems we see in governmental and corporate contexts—and in many large organisations—automation is a result both of the desire on the part of software, IT, and consultancy firms to increase their customer base and expand the scope of their products and sales, and of the desire on the part of governments and organisations to increase control at the same time as they reduce their reliance on human judgment and capacity. The fact is, such systems seldom deliver the efficiencies or economies they are assumed to bring, and they also give rise to significant additional costs in terms of their broader impact and consequences, but the imperatives of sales and seemingly increased control (as well as an irrational belief in the benefits of technological solutions) override any other consideration. The turn towards automated systems like Robodebt is, as is now widely recognised, a common feature of contemporary society. To look to a completely different domain, new military technologies are being developed to provide drone weapon systems with the capacity to identify potential threats and defend themselves against them. The development is spawning a whole new field of military ethics based entirely around the putative ‘right to self-defence’ of automated weapon systems.

In both cases, the drone weapon system and Robodebt, we have instances of the development of automated systems that seem to allow for a form of ‘judgment’ that appears to operate independently of human judgment—hence the emphasis on these systems as autonomous. One might argue—and typically it is so argued—that any flaws that such systems currently present can be overcome either through the provision of more accurate information or through the development of more complex forms of artificial intelligence….(More)”.

Governing in a pandemic: from parliamentary sovereignty to autocratic technocracy


Paper by Eric Windholz: “Emergencies require governments to govern differently. In Australia, the changes wrought by the COVID-19 pandemic have been profound. The role of lawmaker has been assumed by the executive exercising broad emergency powers. Parliaments, and the debate and scrutiny they provide, have been marginalised. The COVID-19 response also has seen the medical-scientific expert metamorphose from decision-making input into decision-maker. Extensive legislative and executive decision-making authority has been delegated to them – directly in some jurisdictions; indirectly in others. Severe restrictions on an individual’s freedom of movement, association and to earn a livelihood have been declared by them, or on their advice. Employing the analytical lens of regulatory legitimacy, this article examines and seeks to understand this shift from parliamentary sovereignty to autocratic technocracy. How has it occurred? Why has it occurred? What have been the consequences and risks of vesting significant legislative and executive power in the hands of medical-scientific experts; what might be its implications? The article concludes by distilling insights to inform the future design and deployment of public health emergency powers….(More)”.

More ethical, more innovative? The effects of ethical culture and ethical leadership on realized innovation


Zeger van der Wal and Mehmet Demircioglu in the Australian Journal of Public Administration (AJPA): “Are ethical public organisations more likely to realize innovation? The public administration literature is ambiguous about this relationship, with evidence being largely anecdotal and focused mainly on the ethical implications of business‐like behaviour and positive deviance, rather than how ethical behaviour and culture may contribute to innovation.

In this paper we examine the effects of ethical culture and ethical leadership on reported realized innovation, using 2017 survey data from the Australian Public Service Commission (N = 80,316). Our findings show that both ethical culture at the working-group level and agency level as well as ethical leadership have significant positive associations with realized innovation in working groups. The findings are robust across agency, work location, job level, tenure, education, and gender and across different samples. We conclude our paper with theoretical and practical implications of our research findings…(More)”.

Google searches are no substitute for systematic reviews when it comes to policymaking


Article by Peter Bragge: “With all public attention on the COVID-19 pandemic, it is easy to forget that Australia suffered traumatic bushfires last summer, and that a royal commission is investigating the fires and will report in August. According to its Terms of Reference, the commission will examine how Australia’s national and state governments can improve the ‘preparedness for, response to, resilience to and recovery from, natural disasters.’

Many would assume that the commission will identify and use all best-available research knowledge from around the world. But this is highly unlikely because royal commissions are not designed in a way that is fit-for-purpose in the 21st century. Specifically, their terms of reference do not mandate the inclusion of knowledge from world-leading research, even though such research has never been more accessible. This design failure provides critical lessons not only for future royal commissions and public inquiries but for public servants developing policy, including for the COVID-19 crisis, and for academics, journalists, and all researchers who want to keep up with the best global thinking in their field.

The risk of not employing research knowledge that could shape policy and practice could be significantly reduced if the royal commission drew upon what are known as systematic reviews. These are a type of literature review that identify, evaluate and summarise the findings and quality of all known research studies on a particular topic. Systematic reviews provide an overall picture of an entire body of research, rather than one that is skewed by accessing only one or two studies in an area. They are the most thorough form of inquiry, because they control for the ‘outlier’ effect of one or two studies that do not align with the weight of the identified research.

Systematic reviews are known as the ‘peak of peaks’ of research knowledge

They became mainstream in the 1990s through the Cochrane Collaboration – an independent organisation originating in Britain but now worldwide – which has published thousands of systematic reviews across all areas of medicine. These and other medical systematic reviews have been critical in driving best practice healthcare around the world. The approach has expanded to business and management, the law, international development, education, environmental conservation, health service delivery and how to tackle the 17 United Nations Sustainable Development Goals.

There are now tens of thousands of systematic reviews spanning all these areas. Researchers who use them can spend much less time navigating the vastly larger volume of up to 80 million individual research studies published since 1665.

Sadly, these reviews go largely unused. Few policymakers, decision-makers and media are using systematic reviews to respond to complex challenges. Instead, they are searching Google, and hoping that something useful will turn up amongst an estimated 6.19 billion web pages.

The vastness of the open web is an understandable temptation for the time-poor, and a great way to find a good local eatery. But it’s a terrible way to try to access relevant, credible knowledge, and an enormous risk for those seeking to address hugely difficult problems, such as responding to Australia’s bushfires.

The deep expertise of specialist professionals and academics is critical to solving complex societal challenges. Yet the standard royal commission approach of using a few experts as a proxy for the world’s knowledge is selling short both their expertise and the commission process. If experts called before the bushfire royal commission could be asked to contribute not just their own expertise, but a response to the applicability of systematic review research to Australia, the commission’s thinking could benefit hugely from harnessing the knowledge both of the reviews and of the experts…(More)”.

Innovation labs and co-production in public problem solving


Paper by Michael McGann, Tamas Wells & Emma Blomkamp: “Governments are increasingly establishing innovation labs to enhance public problem solving. Despite the speed at which these new units are being established, they have only recently begun to receive attention from public management scholars. This study assesses the extent to which labs are enhancing strategic policy capacity through pursuing more collaborative and citizen-centred approaches to policy design. Drawing on original case study research of five labs in Australia and New Zealand, it examines the structure of labs’ relationships to government partners, and the extent and nature of their activities in promoting citizen participation in public problem solving….(More)”.

Digital human rights are next frontier for fund groups


Siobhan Riding at the Financial Times: “Politicians publicly grilling technology chiefs such as Facebook’s Mark Zuckerberg is all too familiar for investors. “There isn’t a day that goes by where you don’t see one of the tech companies talking to Congress or being highlighted for some kind of controversy,” says Lauren Compere, director of shareholder engagement at Boston Common Asset Management, a $2.4bn fund group that invests heavily in tech stocks.

Fallout from the Cambridge Analytica scandal that engulfed Facebook was a wake-up call for investors such as Boston Common, underlining the damaging social effects of digital technology if left unchecked. “These are the red flags coming up for us again and again,” says Ms Compere.

Digital human rights are fast becoming the latest front in the debate around fund managers’ ethical investment efforts. Fund managers have come under pressure in recent years to divest from companies that can harm human rights — from gun manufacturers or retailers to operators of private prisons. The focus is now switching to the less tangible but equally serious human rights risks lurking in fund managers’ technology holdings. Attention on technology groups began with concerns around data privacy, but emerging focal points are targeted advertising and how companies deal with online extremism.

Following a terrorist attack in New Zealand this year where the shooter posted video footage of the incident online, investors managing assets of more than NZ$90bn (US$57bn) urged Facebook, Twitter and Alphabet, Google’s parent company, to take more action in dealing with violent or extremist content published on their platforms. The Investor Alliance for Human Rights is currently co-ordinating a global engagement effort with Alphabet over the governance of its artificial intelligence technology, data privacy and online extremism.

Investor engagement on the topic of digital human rights is in its infancy. One roadblock for investors has been the difficulty they face in detecting and measuring what the actual risks are. “Most investors do not have a very good understanding of the implications of all of the issues in the digital space and don’t have sufficient research and tools to properly assess them — and that goes for companies too,” said Ms Compere.

One rare resource available is the Ranking Digital Rights Corporate Accountability Index, established in 2015, which rates tech companies based on a range of metrics. The development of such tools gives investors more information on the risk associated with technological advancements, enabling them to hold companies to account when they identify risks and questionable ethics….(More)”.

New Zealand launches draft algorithm charter for government agencies


Mia Hunt at Global Government Forum: “The New Zealand government has launched a draft ‘algorithm charter’ that sets out how agencies should analyse data in a way that is fair, ethical and transparent.

The charter, which is open for public consultation, sets out 10 points that agencies would have to adhere to. These include pledging to explain how significant decisions are informed by algorithms or, where it cannot – for national security reasons, for example – explain the reason; taking into account the perspectives of communities, such as LGBTQI+, Pacific islanders and people with disabilities; and identifying and consulting with groups or stakeholders with an interest in algorithm development.

Agencies would also have to publish information about how data is collected and stored; use tools and processes to ensure that privacy, ethics, and human rights considerations are integrated as part of algorithm development and procurement; and periodically assess decisions made by algorithms for unintended bias.

They would commit to implementing a “robust” peer-review process, and have to explain clearly who is responsible for automated decisions and what methods exist for challenge or appeal “via a human”….

The charter – which fits on a single page, and is designed to be simple and easily understood – explains that algorithms are a “fundamental element” of data analytics, which supports public services and delivers “new, innovative and well-targeted” policy aims.

The charter begins: “In a world where technology is moving rapidly, and artificial intelligence is on the rise, it’s essential that government has the right safeguards in place when it uses public data for decision-making. The government must ensure that data ethics are embedded in its work, and always keep in mind the people and communities being served by these tools.”

It says Stats NZ, the country’s official data agency, is “committed to transparent and accountable use of operational algorithms and other advanced data analytics techniques that inform decisions significantly impacting on individuals or groups”….(More)”.

Massive Citizen Science Effort Seeks to Survey the Entire Great Barrier Reef


Jessica Wynne Lockhart at Smithsonian: “In August, marine biologists Johnny Gaskell and Peter Mumby and a team of researchers boarded a boat headed into unknown waters off the coasts of Australia. For 14 long hours, they ploughed over 200 nautical miles, a Google Maps cache as their only guide. Just before dawn, they arrived at their destination of a previously uncharted blue hole—a cavernous opening descending through the seafloor.

After the rough night, Mumby was rewarded with something he hadn’t seen in his 30-year career. The reef surrounding the blue hole had nearly 100 percent healthy coral cover. Such a find is rare in the Great Barrier Reef, where coral bleaching events in 2016 and 2017 led to headlines proclaiming the reef “dead.”

“It made me think, ‘this is the story that people need to hear,’” Mumby says.

The expedition from Daydream Island off the coast of Queensland was a pilot program to test the methodology for the Great Reef Census, a citizen science project headed by Andy Ridley, founder of the annual conservation event Earth Hour. His latest organization, Citizens of the Great Barrier Reef, has set the ambitious goal of surveying the entire 1,400-mile-long reef system in 2020…(More)”.

Digital dystopia: how algorithms punish the poor


Ed Pilkington at The Guardian: “All around the world, from small-town Illinois in the US to Rochdale in England, from Perth, Australia, to Dumka in northern India, a revolution is under way in how governments treat the poor.

You can’t see it happening, and may have heard nothing about it. It’s being planned by engineers and coders behind closed doors, in secure government locations far from public view.

Only mathematicians and computer scientists fully understand the sea change, powered as it is by artificial intelligence (AI), predictive algorithms, risk modeling and biometrics. But if you are one of the millions of vulnerable people at the receiving end of the radical reshaping of welfare benefits, you know it is real and that its consequences can be serious – even deadly.

The Guardian has spent the past three months investigating how billions are being poured into AI innovations that are explosively recasting how low-income people interact with the state. Together, our reporters in the US, Britain, India and Australia have explored what amounts to the birth of the digital welfare state.

Their dispatches reveal how unemployment benefits, child support, housing and food subsidies and much more are being scrambled online. Vast sums are being spent by governments across the industrialized and developing worlds on automating poverty and in the process, turning the needs of vulnerable citizens into numbers, replacing the judgment of human caseworkers with the cold, bloodless decision-making of machines.

At its most forbidding, Guardian reporters paint a picture of a 21st-century Dickensian dystopia that is taking shape with breakneck speed…(More)”.

Index: Secondary Uses of Personal Data


By Alexandra Shaw, Andrew Zahuranec, Andrew Young, Stefaan Verhulst

The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on public perceptions regarding secondary uses of personal data (or the re-use of data initially collected for a different purpose). It provides a summary of societal perspectives toward personal data usage, sharing, and control. It is not meant to be comprehensive – rather, it intends to illustrate conflicting, and often confusing, attitudes toward the re-use of personal data.

Please share any additional, illustrative statistics on data, or other issues at the nexus of technology and governance, with us at info@thelivinglib.org.

Data ownership and control 

  • Percentage of Americans who say it is “very important” they control information collected about them: 74% – 2016
  • Americans who think that today’s privacy laws are not good enough at protecting people’s privacy online: 68% – 2016
  • Americans who say they have “a lot” of control over how companies collect and use their information: 9% – 2015
  • In a survey of 507 online shoppers, the number of respondents who indicated they don’t want brands tracking their location: 62% – 2015
  • In a survey of 507 online shoppers, the number who “prefer offers that are targeted to where they are and what they are doing:” 60% – 2015
  • Number of surveyed American consumers willing to provide data to corporations under the following conditions: 
    • “Data about my social concerns to better connect me with non-profit organizations that advance those causes:” 19% – 2018
    • “Data about my DNA to help me uncover any hereditary illnesses:” 21% – 2018
    • “Data about my interests and hobbies to receive relevant information and offers from online sellers:” 32% – 2018
    • “Data about my location to help me find the fastest route to my destination:” 40% – 2018
    • “My email address to receive exclusive offers from my favorite brands:”  56% – 2018  

Consumer Attitudes 

  • Academic study participants willing to donate personal data to research if it could lead to public good: 60% – 2014
  • Academic study participants willing to share personal data for research purposes in the interest of public good: 25% – 2014
  • Percentage who expect companies to “treat [them] like an individual, not as a member of some segment like ‘millennials’ or ‘suburban mothers:’” 74% – 2018 
    • Percentage who believe that brands should understand a “consumer’s individual situation (e.g. marital status, age, location, etc.)” when they’re being marketed to: 70% – 2018
  • Number who are “more annoyed” by companies now compared to 5 years ago: 40% – 2018
  • Percentage worried their data is shared across companies without their permission: 88% – 2018
  • Percentage worried about a brand’s ability to track their behavior while on the brand’s website or app: 75% – 2018
  • Consumers globally who expect brands to anticipate needs before they arise: 33%  – 2018 
  • Surveyed residents of the United Kingdom who identify as:
    • “Data pragmatists” willing to share personal data “under the right circumstances:” 58% – 2017
    • “Fundamentalists,” who would not share personal data for better services: 24% – 2017
  • Respondents who think data sharing is part of participating in the modern economy: 62% – 2018
  • Respondents who believe that data sharing benefits enterprises more than consumers: 75% – 2018
  • People who want more control over the data that enterprises collect about them: 84% – 2018
  • Percentage “unconcerned” about personal data protection: 18% – 2018
  • Percentage of Americans who think that government should do more to regulate large technology companies: 55% – 2018
  • Registered American voters who trust broadband companies with personal data “a great deal” or “a fair amount”: 43% – 2017
  • Americans who report experiencing a major data breach: 64% – 2017
  • Number of Americans who believe that their personal data is less secure than it was 5 years ago: 49% – 2019
  • Number of surveyed American citizens who consider trust in a company an important factor for sharing data: 54% – 2018

Convenience

Microsoft’s 2015 Consumer Data Value Exchange Report attempts to understand consumer attitudes on the exchange of personal data across the global markets of Australia, Brazil, Canada, Colombia, Egypt, Germany, Kenya, Mexico, Nigeria, Spain, South Africa, the United Kingdom and the United States. From its survey of 16,500 users, the report finds:

  • The most popular incentives for sharing data are: 
    • Cash rewards: 64% – 2015
    • Significant discounts: 49% – 2015
    • Streamlined processes: 29% – 2015
    • New ideas: 28% – 2015
  • Respondents who would prefer to see more ads to get new services: 34% – 2015
  • Respondents willing to share search terms for a service that enabled fewer steps to get things done: 70% – 2015 
  • Respondents willing to share activity data for such an improvement: 82% – 2015
  • Respondents willing to share their gender for “a service that inspires something new based on others like them:” 79% – 2015

A 2015 Pew Research Center survey presented Americans with several data-sharing scenarios related to convenience. Participants could respond: “acceptable,” “it depends,” or “not acceptable” to the following scenarios: 

  • Share health information to get access to personal health records and arrange appointments more easily:
    • Acceptable: 52% – 2015
    • It depends: 20% – 2015
    • Not acceptable: 26% – 2015
  • Share data for discounted auto insurance rates: 
    • Acceptable: 37% – 2015
    • It depends: 16% – 2015
    • Not acceptable: 45% – 2015
  • Share data for free social media services: 
    • Acceptable: 33% – 2015
    • It depends: 15% – 2015
    • Not acceptable: 51% – 2015
  • Share data on smart thermostats for cheaper energy bills: 
    • Acceptable: 33% – 2015
    • It depends: 15% – 2015
    • Not acceptable: 51% – 2015

Other Studies

  • Surveyed banking and insurance customers who would exchange personal data for:
    • Targeted auto insurance premiums: 64% – 2019
    • Better life insurance premiums for healthy lifestyle choices: 52% – 2019 
  • Surveyed banking and insurance customers willing to share data specifically related to income, location and lifestyle habits to: 
    • Secure faster loan approvals: 81.3% – 2019
    • Lower the chances of injury or loss: 79.7% – 2019 
    • Receive discounts on non-insurance products or services: 74.6% – 2019
    • Receive text alerts related to banking account activity: 59.8% – 2019 
    • Get saving advice based on spending patterns: 56.6% – 2019
  • In a survey of over 7,000 members of the public around the globe, respondents indicated:
    • They thought “smartphone and tablet apps used for navigation, chat, and news that can access your contacts, photos, and browsing history” are “creepy:” 16% – 2016
    • Emailing a friend about a trip to Paris and receiving advertisements for hotels, restaurants and excursions in Paris is “creepy:” 32% – 2016
    • A free fitness-tracking device that monitors your well-being and sends a monthly report to you and your employer is “creepy:” 45% – 2016
    • A telematics device that allows emergency services to track your vehicle is “creepy:” 78% – 2016
  • The number of British residents who do not want to work with virtual agents of any kind: 48% – 2017
  • Americans who disagree that “if companies give me a discount, it is a fair exchange for them to collect information about me without my knowing”: 91% – 2015

Data Brokers, Intermediaries, and Third Parties 

  • Americans who consider it acceptable for a grocery store to offer a free loyalty card in exchange for selling their shopping data to third parties: 47% – 2016
  • Number of people who know that “searches, site visits and purchases” are reviewed without consent: 55% – 2015
  • Number of people who wanted companies to ask their permission before collecting their personal information and selling it to intermediaries: 93% – 1991
    • Number of Americans who “would be very concerned if the company at which their data were stored sold it to another party:” 90% – 2008
    • Percentage of Americans who think it’s unacceptable for their grocery store to share their shopping data with third parties in exchange for a free loyalty card: 32% – 2016
  • Percentage of Americans who think that government needs to do more to regulate advertisers: 64% – 2016
    • Number of Americans who “want to have control over what marketers can learn about” them online: 84% – 2015
    • Percentage of Americans who think they have no power over marketers to figure out what they’re learning about them: 58% – 2015
  • Registered American voters who are “somewhat uncomfortable” or “very uncomfortable” with companies like Internet service providers or websites using personal data to recommend stories, articles, or videos:  56% – 2017
  • Registered American voters who are “somewhat uncomfortable” or “very uncomfortable” with companies like Internet service providers or websites selling their personal information to third parties for advertising purposes: 64% – 2017

Personal Health Data

The Robert Wood Johnson Foundation’s 2014 Health Data Exploration Project Report analyzes attitudes about personal health data (PHD). PHD is self-tracking data related to health that is traceable through wearable devices and sensors. The three major stakeholder groups involved in using PHD for public good are users, companies that track the users’ data, and researchers. 

  • Overall Respondents:
    • Percentage who believe anonymity is “very” or “extremely” important: 67% – 2014
    • Percentage who “probably would” or “definitely would” share their personal data with researchers: 78% – 2014
    • Percentage who believe that they own—or should own—all the data about them, even when it is indirectly collected: 54% – 2014
    • Percentage who think they share or ought to share ownership with the company: 30% – 2014
    • Percentage who think companies alone own or should own all the data about them: 4% – 2014
    • Percentage for whom data ownership “is not something I care about”: 13% – 2014
    • Percentage who indicated they wanted to own their data: 75% – 2014 
    • Percentage who would share data only if “privacy were assured:” 68% – 2014
    • People who would supply data regardless of privacy or compensation: 27% – 2014
    • Percentage of participants who mentioned privacy, anonymity, or confidentiality when asked under what conditions they would share their data: 63% – 2014
    • Percentage who would be “more” or “much more” likely to share data for compensation: 56% – 2014
    • Percentage who indicated compensation would make no difference: 38% – 2014
    • Percentage opposed to commercial or profit-making use of their data: 13% – 2014
    • Percentage of people who would only share personal health data with a guarantee of:
      • Privacy: 57% – 2014
      • Anonymization: 90% – 2014
  • Surveyed Researchers: 
    • Percentage who agree or strongly agree that self-tracking data would help provide more insights in their research: 89% – 2014
    • Percentage who say PHD could answer questions that other data sources could not: 95% – 2014
    • Percentage who have used public datasets: 57% – 2014
    • Percentage who have paid for data for research: 19% – 2014
    • Percentage who have used self-tracking data before for research purposes: 46% – 2014
    • Percentage who have worked with application, device, or social media companies: 23% – 2014
    • Percentage who “somewhat disagree” or “strongly disagree” that there are insurmountable barriers to using self-tracking data in their research: 82% – 2014

SOURCES: 

“2019 Accenture Global Financial Services Consumer Study: Discover the Patterns in Personality”, Accenture, 2019. 

“Americans’ Views About Data Collection and Security”, Pew Research Center, 2015. 

“Data Donation: Sharing Personal Data for Public Good?”, ResearchGate, 2014.

“Data privacy: What the consumer really thinks”, Acxiom, 2018.

“Exclusive: Public wants Big Tech regulated”, Axios, 2018.

“Consumer data value exchange”, Microsoft, 2015.

“Crossing the Line: Staying on the right side of consumer privacy”, KPMG International Cooperative, 2016.

“How do you feel about the government sharing our personal data? – livechat”, The Guardian, 2017. 

“Personal data for public good: using health information in medical research”, The Academy of Medical Sciences, 2006. 

“Personal Data for the Public Good: New Opportunities to Enrich Understanding of Individual and Population Health”, Robert Wood Johnson Foundation, Health Data Exploration Project, Calit2, UC Irvine and UC San Diego, 2014. 

“Pew Internet and American Life Project: Cloud Computing Raises Privacy Concerns”, Pew Research Center, 2008. 

“Poll: Little Trust That Tech Giants Will Keep Personal Data Private”, Morning Consult & Politico, 2017. 

“Privacy and Information Sharing”, Pew Research Center, 2016. 

“Privacy, Data and the Consumer: What US Thinks About Sharing Data”, MarTech Advisor, 2018. 

“Public Opinion on Privacy”, Electronic Privacy Information Center, 2019. 

“Selligent Marketing Cloud Study Finds Consumer Expectations and Marketer Challenges are Rising in Tandem”, Selligent Marketing Cloud, 2018. 

“The Data-Sharing Disconnect: The Impact of Context, Consumer Trust, and Relevance in Retail Marketing”, Boxever, 2015.

“Microsoft Research reveals understanding gap in the brand-consumer data exchange”, Microsoft Research, 2015.

“Survey: 58% will share personal data under the right circumstances”, Marketing Land: Third Door Media, 2019. 

“The state of privacy in post-Snowden America”, Pew Research Center, 2016. 

“The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers And Opening Them Up to Exploitation”, University of Pennsylvania, 2015.