Alena Buyx, Barbara Prainsack and Aisling McMahon at The Conversation: “Mustafa loves good coffee. In his free time, he often browses high-end coffee machines that he cannot currently afford but is saving for. One day, travelling to a friend’s wedding abroad, he gets to sit next to another friend on the plane. When Mustafa complains about how much he paid for his ticket, it turns out that his friend paid less than half of what he did, even though they booked around the same time.
He looks into possible reasons for this and concludes that it must be related to his browsing of expensive coffee machines and equipment. He is very angry about this and complains to the airline, who send him a lukewarm apology that refers to personalised pricing models. Mustafa feels that this is unfair but does not challenge it. Pursuing it any further would cost him time and money.
This story – which is hypothetical, but can and does occur – demonstrates the potential for people to be harmed by data use in the current “big data” era. Big data analytics involves linking and analysing large amounts of data from many sources to find patterns that help predict human behaviour. Such analysis, even when perfectly legal, can harm people.
Mustafa, for example, has likely been affected by personalised pricing practices, whereby his searches for high-end coffee machines were used to make assumptions about his willingness to pay or his buying power. This in turn may have led to his higher-priced airfare. While this has not resulted in serious harm in Mustafa’s case, instances of serious emotional and financial harm are, unfortunately, not rare, including the denial of mortgages and risks to a person’s general creditworthiness based on associations with other individuals. This might happen if an individual shares characteristics with other individuals who have poor repayment histories….(More)”.
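To make the mechanism in the story concrete, here is a purely illustrative sketch in Python of how a personalised pricing heuristic of this kind might work. Everything in it is hypothetical: the keywords, the scoring rule and the price multiplier are invented for the example and do not describe the authors’ analysis or any real airline’s system, which would be far more complex and opaque.

```python
# Hypothetical, deliberately simplified sketch of a personalised pricing
# heuristic. All signals, keywords and multipliers are invented for
# illustration; real systems are far more complex (and opaque).

BASE_FARE = 200.0  # arbitrary list price

# Browsing signals a tracker might associate with willingness to pay.
LUXURY_KEYWORDS = {"espresso machine", "high-end", "premium", "designer"}

def willingness_score(browsing_history: list[str]) -> float:
    """Fraction of recent searches that match luxury-associated keywords."""
    if not browsing_history:
        return 0.0
    hits = sum(
        any(kw in query.lower() for kw in LUXURY_KEYWORDS)
        for query in browsing_history
    )
    return hits / len(browsing_history)

def personalised_fare(browsing_history: list[str]) -> float:
    """Scale the base fare by the inferred willingness to pay.

    A score of 0 leaves the fare unchanged; a score of 1 doubles it,
    roughly mirroring the price gap in the story.
    """
    return BASE_FARE * (1.0 + willingness_score(browsing_history))

# Mustafa's profile is full of luxury searches; his friend's is not.
mustafa = ["high-end espresso machine reviews", "premium coffee grinder"]
friend = ["weather forecast", "cheap flights"]

print(f"Mustafa pays: {personalised_fare(mustafa):.2f}")  # 400.00
print(f"Friend pays:  {personalised_fare(friend):.2f}")   # 200.00
```

The point of the toy example is only that a single behavioural signal, luxury-themed browsing, can quietly double a price, with nothing about the inference visible to the buyer who would want to challenge it.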