Why Does Facebook Think I’m Interested in Becoming Eastern Orthodox?
Welcome to the second installment of Ask Dr. Strangedata, in which our resident data-mining expert David Auerbach, who previously worked in the software bowels of Google and Microsoft, answers your questions about how Internet companies know so much about you. Why is your crazy ex suddenly popping up on your suggested friends? How did Facebook know that you like Antiques Roadshow? Why did Facebook ask if you had a Chinese name? Dr. Strangedata has the—or at least a—diagnosis. (Please note that David is an engineer, not a doctor, damn it.)
Dear Dr. Strangedata,
Facebook keeps suggesting that I join groups that would require some really radical life change on my part. Suggestions have included converting to the Eastern Orthodox church, going vegan, and becoming a “crafty mama.”
Do my posts somehow imply I am dissatisfied with my life the way it is? I can’t imagine it implies I’m interested in these topics.
—Mostly Content
Dear MC,
While my vegan friends would insist that going vegan is not a radical life change, I understand that these suggestions might seem a little random and extreme. The thing is, they probably seem that way to Facebook as well! This gets into what I term algorithmic opacity: the increasing occurrence of algorithmic results that are difficult to explain even by the authors of the code. So many inputs go into the algorithm that determines group recommendations that the logic forms a kind of secret sauce, where a large number of factors tilt the results in one direction or another without any one factor being the clear cause. Sometimes, of course, the determining factor is obvious: If I like the page of Japanese noise rock band Melt-Banana, I won’t be too surprised to get a recommendation for the page of Japanese noise auteur Otomo Yoshihide.
But in cases like yours, it could be any number of factors: You may have crafty mama friends who are interested in veganism or Eastern Orthodoxy; you may happen to have more prosaic shared interests (specific hobbies, a favorite set of movies, crunchy politics, etc.) with other people you don’t know who happen to be vegan or Eastern Orthodox or crafty mamas; or vegan Eastern Orthodox crafty mamas might simply be common in your geographic area. Because there are so many inputs, each recommendation has its own particular recipe for how it was suggested to you. There’s not a huge downside for Facebook in showing you these suggestions; worst case, you just don’t click on them—though certainly the risk of offensive suggestions is there. Yet even bad suggestions help Facebook learn more about you. Rather than a suggestion, think of it as Facebook conducting an experiment on you to see whether you are a vegan, Eastern Orthodox, or a crafty mama. If you don’t click, you’re still telling Facebook something about yourself. In this case, absence of evidence is evidence of absence.
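To make the "many weak inputs" idea concrete, here is a purely illustrative Python sketch. It is not Facebook's actual code; the signal names, weights, and threshold are all invented. The point is that dozens of small signals can add up to a recommendation without any single one being "the reason" you were shown a group, and that a non-click can itself become a (weak, negative) signal next time around.

```python
# Illustrative only: a toy model of how many weak signals combine into a
# group recommendation. All signal names and weights are invented.

def recommend_score(user_signals, weights):
    """Combine many small signals into one score; no single signal decides."""
    return sum(weights.get(name, 0.0) * value
               for name, value in user_signals.items())

weights = {
    "friends_in_group": 0.30,        # a few friends already joined
    "shared_page_likes": 0.25,       # overlapping likes with group members
    "same_city_as_members": 0.15,    # geographic overlap
    "similar_users_clicked": 0.20,   # people "like you" clicked this group
    "past_group_joins_in_topic": 0.10,
}

# A user with several weak signals but no strong, obvious connection:
mostly_content = {
    "friends_in_group": 0.2,
    "shared_page_likes": 0.4,
    "same_city_as_members": 0.6,
    "similar_users_clicked": 0.5,
    "past_group_joins_in_topic": 0.0,
}

score = recommend_score(mostly_content, weights)
print(f"Eastern Orthodox group score: {score:.2f}")
# If the score crosses some threshold, the group gets suggested. If the user
# never clicks, that non-click can be fed back as a weak negative signal.
```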
Dear Dr. Strangedata,
Why does Facebook suggest my patients for me to friend? I don’t use Facebook on a work computer, I don’t have their phone numbers in my contacts, but on multiple occasions I have thought names or faces look familiar only to realize I have treated them in my office.
—Don’t want to be a HIPAA-crite
Dear DWTBAH,
Many of the questions I get revolve around people being unnerved that Facebook has connected them with people with whom they have sensitive relationships, like doctors and patients. The crux of this issue is that Facebook can make friend suggestions based not just on your contacts, but on others’ contacts. What’s most likely is that your patients had entries for you in their address books, containing your phone number, your email address, or simply your name. When Facebook saw your patients’ contacts, it identified you as someone they know. So not only did it suggest to them that they add you, but it suggested to you that you add them.
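Here is a minimal sketch of that matching step, assuming uploaded contact books are matched against account emails (the details of the real system are not public, and the names and data structures below are invented). The key point is that one person's upload can generate suggestions in both directions.

```python
# Toy sketch of contact matching: if a patient uploads an address book
# containing the doctor's email, both sides can be suggested to each other.
# All names and data structures are invented for illustration.

accounts = {
    "dr.jones@example.com": "Dr. Jones",
    "pat.smith@example.com": "Pat Smith",
}

def suggestions_from_upload(uploader_email, uploaded_contacts, accounts):
    """Return symmetric friend suggestions implied by one contact upload."""
    pairs = []
    for contact in uploaded_contacts:
        email = contact.get("email")
        if email in accounts and email != uploader_email:
            # Suggest in both directions: uploader -> contact, contact -> uploader
            pairs.append((uploader_email, email))
            pairs.append((email, uploader_email))
    return pairs

# A patient uploads their phone contacts, which happen to include their doctor:
patient_upload = [{"name": "Dr. Jones", "email": "dr.jones@example.com"}]
print(suggestions_from_upload("pat.smith@example.com", patient_upload, accounts))
```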
Facebook wants to make its graph of connections as dense and rich as possible, which is why it makes suggestions so aggressively. But in cases like this, the line between social encouragement and a violation of privacy starts to blur. Should Facebook be able to infer that your patients might have seen you by virtue of their having you in their contacts? Facebook, of course, is not trying to figure out which people are seeing which doctors, and I highly doubt it is keeping track of such data. On the other hand, it’s very helpful to advertisers to know which “microsegment” a person falls into in order to better target ads, and here the line between demographic and sensitive information begins to blur as well. Let’s say you’re an OB-GYN, and Facebook determines that many of the patients connected to you share the commonality of being very interested in maternity products and falling into the “middle-class expectant mother” microsegment. While Facebook still knows nothing directly about the nature of the relationship between you and your patients, its algorithms have still learned something that is a consequence of that relationship. And people who end up with that microtargeting data and the friends’ graphs may be able to infer that relationship in the future. (This would be a case of what law professor Frank Pasquale terms “runaway data” in his book The Black Box Society.)
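To see how a microsegment could leak something about a relationship, consider this hypothetical sketch: if most of the accounts recently connected to one person fall into the same advertising segment, a simple aggregate over the friends' graph recovers a hint of that person's role, even though no one ever recorded the doctor-patient relationship directly. All names and segments below are invented.

```python
from collections import Counter

# Invented data: advertising segments assigned to each account.
segments = {
    "alice": "middle-class expectant mother",
    "bea":   "middle-class expectant mother",
    "cara":  "middle-class expectant mother",
    "dan":   "sports memorabilia collector",
}

# Invented friends graph: accounts recently connected to the doctor.
connected_to_doctor = ["alice", "bea", "cara", "dan"]

def dominant_segment(neighbors, segments):
    """Most common segment among an account's connections, and its share."""
    counts = Counter(segments[n] for n in neighbors if n in segments)
    segment, count = counts.most_common(1)[0]
    return segment, count / len(neighbors)

segment, share = dominant_segment(connected_to_doctor, segments)
print(f"{share:.0%} of new connections are '{segment}'")
# A downstream data broker seeing this pattern could guess the account
# belongs to an OB-GYN, even though the relationship was never recorded.
```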
Companies try to balance data collection against the creepiness factor of seeming to know all too much about you. In 2012, Charles Duhigg chronicled the story of a man angrily confronting Target over its sending his teenage daughter coupons for baby clothing: “Are you trying to encourage her to get pregnant?” No—in fact, Target’s marketing research department had studied her purchases and determined (correctly, as it turned out) that she already was pregnant. Target doesn’t send out expectant-mother coupons quite so readily anymore; you can be sure of that. Yet Target hasn’t stopped collecting all that data; it’s just more careful about disguising the creep factor. A Target executive told Duhigg, “We found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons. … As long as we don’t spook her, it works.” Managing the perception of being spied on has become a crucial part of advertising. The data is still out there; given the will, almost any large company could determine far more about you than it currently does. We depend on its benevolence.
Dear Dr. Strangedata,
Almost every day, I get friend requests from people I have never met in real life, and the vast majority of them are foreign. Many of them are from the former Soviet Union, where I have spent some time professionally, but I do not know them. Is it normal to get lots of friend requests from strangers?
—Friend Magnet
Dear FM,
The answer is, I’m afraid, yes. I get a fair number of random requests myself, usually from accounts with only a handful of seemingly random friends. That’s usually a good sign that the account is fake and possibly a spammer. While Facebook does try to crack down on fake accounts and spammers, many still slip through the cracks. If you happen to accept one of these requests, you may find that they post phishing links to your wall. Some of my friends have fallen victim to these and ended up inadvertently spreading spam links. While Facebook lets you restrict friend requests to people who know your email address, or even just to friends of friends, the default setting is considerably more public (since Facebook does want to grow that friends graph). So here I can only counsel caution: That gorgeous model who just added you out of nowhere may not be interested in your scintillating status updates but rather in getting you to click on a dangerous link.
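If you want a rough sanity check of your own, here is a toy heuristic in Python. This is emphatically not how Facebook's spam detection works, and the thresholds are made up; it just encodes the rule of thumb above: few friends, no mutual friends, brand-new account.

```python
# Made-up heuristic for eyeballing a suspicious friend request.
# Thresholds are arbitrary; this is not Facebook's actual spam detection.

def looks_suspicious(friend_count, mutual_friends, account_age_days):
    return (friend_count < 20            # only a handful of friends
            and mutual_friends == 0      # none of them are people you know
            and account_age_days < 90)   # and the account is brand new

print(looks_suspicious(friend_count=12, mutual_friends=0, account_age_days=14))  # True
```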
That’s it for this time. If you have a question about a creepy social media experience, email me at dr.strangedata@gmail.com. Until next time, remember: When you use a free Internet service, you aren’t the customer, you are the product!
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.