Brian Fung in the Washington Post: “It turns out that OkCupid has been performing some of the same psychological experiments on its users that landed Facebook in hot water recently.
In a lengthy blog post, OkCupid cofounder Christian Rudder explains that OkCupid has on occasion played around with removing text from people’s profiles, removing photos, and even telling some users they were an excellent match when in fact they were only a 30 percent match according to the company’s systems. Just to see what would happen.
OkCupid defends this behavior as something that any self-respecting Web site would do.
“OkCupid doesn’t really know what it’s doing. Neither does any other Web site,” Rudder wrote. “But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”…
we have a bigger problem on our hands: A problem about how to reconcile the sometimes valuable lessons of data science with the creep factor — particularly when you aren’t notified about being studied. But as I’ve written before, these kinds of studies happen all the time; it’s just rare that the public is presented with the results.
Short of banning the practice altogether, which seems totally unrealistic, corporate data science seems like an opportunity on a number of levels, particularly if it’s disclosed to the public. First, it helps us understand how human beings tend to behave at Internet scale. Second, it tells us more about how Internet companies work. And third, it helps consumers make better decisions about which services they’re comfortable using.
I suspect that what bothers us most of all is not that the research took place, but that we’re slowly coming to grips with how easily we ceded control over our own information — and how the machines that collect all this data may all know more about us than we do ourselves. We had no idea we were even in a rabbit hole, and now we’ve discovered we’re 10 feet deep. As many as 62.5 percent of Facebook users don’t know the news feed is generated by a company algorithm, according to a recent study conducted by Christian Sandvig, an associate professor at the University of Michigan, and Karrie Karahalios, an associate professor at the University of Illinois.
OkCupid’s blog post is distinct in several ways from Facebook’s psychological experiment. OkCupid didn’t try to publish its findings in a scientific journal. It isn’t even claiming that what it did was science. Moreover, OkCupid’s research is legitimately useful to users of the service — in ways that Facebook’s research is arguably not….
But in any case, there’s no such motivating factor when it comes to Facebook. Unless you’re a page administrator or news organization, understanding how the news feed works doesn’t really help the average user in the way that understanding how OkCupid works does. That’s because people use Facebook for all kinds of reasons that have nothing to do with Facebook’s commercial motives. But people would stop using OkCupid if they discovered it didn’t “work.”
If you’re lying to your users in an attempt to improve your service, what’s the line between A/B testing and fraud?”
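For readers unfamiliar with the mechanics behind that question: an A/B test like the match-percentage experiment Rudder describes typically just sorts users into buckets and shows each bucket a different version of the page. The sketch below is purely illustrative and assumes nothing about OkCupid’s actual systems; the experiment name, bucket labels, 50/50 split, and hashing scheme are all hypothetical.

```python
import hashlib

# Hypothetical experiment loosely modeled on the match-score test described
# above. The name, buckets, and split are illustrative assumptions, not
# OkCupid's real configuration.
EXPERIMENT = "displayed_match_score"
BUCKETS = [("control_true_score", 0.5), ("variant_inflated_score", 0.5)]

def assign_bucket(user_id: str) -> str:
    """Deterministically assign a user to an experiment bucket.

    Hashing the user ID (instead of picking at random on every visit)
    keeps each user in the same bucket, which is what makes comparing
    outcomes between buckets meaningful later.
    """
    digest = hashlib.sha256(f"{EXPERIMENT}:{user_id}".encode()).hexdigest()
    point = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to a value in [0, 1]
    cumulative = 0.0
    for name, weight in BUCKETS:
        cumulative += weight
        if point <= cumulative:
            return name
    return BUCKETS[-1][0]

if __name__ == "__main__":
    for uid in ("user-1001", "user-1002", "user-1003"):
        print(uid, "->", assign_bucket(uid))
```

The ethical question in the post is exactly about the second bucket: the mechanism is ordinary A/B testing, but when the “variant” shown to users is false information about their compatibility, the line Fung asks about starts to blur.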