Plaintext readers take note: Recalls don’t work. In case you were thinking that I should step down. Plus, I’ve never even been to the French Laundry.
Almost 4 billion people use Facebook and its properties, like Instagram and WhatsApp. The company has extensive data on all those users, including their personal information and logs with every single thing they do on the services—as well as things they sometimes do outside the services. Early on, the company understood the value of this unprecedented cache of information. In 2006 it hired data scientist Jeff Hammerbacher to create an infrastructure that could efficiently mine that data. The successor to this system is now the playground for hundreds of researchers.
Usually, the results of the thousands of internal studies and experiments these researchers perform remain in the background. Facebook isn’t often transparent about how they affect—or don’t affect—the product that users see. But this week Facebook’s data whizzes took center stage when The Wall Street Journal ran a sensational series largely based on leaked internal presentations by researchers. One dealt with the implications of a secret white list of millions of people deemed exempt from Facebook’s content rules. Another exposed that the algorithms Facebook created to increase well-being on the platform actually made people angrier. Perhaps the most damning involved a study that indicated that Instagram negatively affected the self-image of millions of teenage girls on the service, and was a menace to their mental health. (The timing was not ideal for Instagram’s leader, Adam Mosseri, who on the night of publication was parading on the Met Gala red carpet in a tuxedo that looked like it was pilfered from a harlequin’s closet.) The articles charged that despite these blazing red flags, Mark Zuckerberg and his brain trust chose not to address the problems. At the very least, information about Facebook that was of great public interest was kept inside the company.
We need to talk about Facebook research. I’ve spoken to some of those researchers both on and off the record, and I have been impressed by their quality and dedication. “If you put Facebook’s researchers in any university, they would be an elite data science faculty,” says Nathaniel Persily, a professor at Stanford Law School who has worked closely, though not always successfully, with the researchers in an attempt to make Facebook data public. The stalwarts among them believe their work helps unearth truths about social media that allow the company to address the problems, improving the lives of millions of users. Others get satisfaction in just making Facebook work better.
There’s also a Faustian aspect to their work. They have access to an unprecedented archive charting the behavior of billions, a Disneyland of data. The trade-off is that their work is proprietary. While researchers do sometimes get to publish their results, the default is that the work remains internal. Facebook keeps promising transparency, but it has never come close to providing it. One event in particular led to a chronic wariness about sharing: the shocked reaction to a published 2014 report known as the “Emotion study.” The research project was lambasted because it ran afoul of traditional ethics by manipulating the News Feed of users who did not know that they were subjects of an experiment to see if certain posts made them feel bad. Facebook’s takeaway was that sharing too much information about its research results could lead to trouble. “They’re still very constrained,” says Persily. “What they can make public is conditional on the whims of their superiors.”
Facebook keeps promising to let outsiders perform research on its data, but the history of this effort has also been rocky. Just last week, outside researchers involved in Social Science One, a painstakingly assembled effort to share Facebook’s political data, discovered that the data sets Facebook provided were fatally flawed. An outraged Persily, who formerly headed Social Science One, is now working with legislators on a bill that would force Facebook to share critical data externally.
Still, the researchers who thrive at Facebook say they are fine with the constraints. While the goal in academia is to increase knowledge, their goal is to make a difference in a product that affects millions of people.
But what happens when Facebook blows past the results of their studies? It’s telling that Facebook Research has always been part of the company’s growth organization, whose mission is to recruit and retain users, sometimes at the expense of ethics and societal benefit. As the Journal report indicates, when the researchers expose uncomfortable facts, their results aren’t necessarily embraced. Some of the slides in the presentation deck by the Instagram researchers seem like desperate pleas for change. The boldface title of one slide reads, “One in five teens say that Instagram makes them feel worse about themselves.” Another reads, “Teens who struggle with mental health say Instagram makes it worse.”
After publication, Mosseri tweeted a link to this defense by Karina Newton, Instagram’s head of public policy. She says that internal studies on subjects like bullying, suicide, and eating disorders have led to improvements in the product, and she argues that the Journal’s report takes the documents out of context.
Still, if, for instance, a vaccine study showed that a fifth of those inoculated had serious negative side effects, that vaccine would immediately be deep-sixed. But, as the Journal tells it, when Facebook’s researchers reported that Instagram harmed the mental health of millions of teenage girls, Zuckerberg and his team didn’t make any drastic changes—instead, they went ahead with plans to figure out how to extend the product to preteens. And they kept the studies secret. No one knew about them—until they were leaked.
That’s the paradox for the data analysts, statisticians, and social scientists who work in Facebook research. Much of their work in, say, well-being may help to make the platform a healthier place. But as the cases exposed this week show, there are times when making those changes might negatively affect the business—and business considerations win. Facebook is not sited in academia, but in the marketplace.
This fact was not lost on Jeff Hammerbacher, the father of Facebook research, who left the company after only two years. He later explained why: “It was turning from a place to explore to a place to exploit.” In another interview, he unleashed a quote that continues to haunt researchers at Facebook. “The best minds of my generation are thinking about how to make people click ads.”
In 2015 I wrote about a Facebook Research project that involved actually asking users in Knoxville, Tennessee, what kinds of stories they wanted to see in their News Feeds. At the time, Adam Mosseri, who now heads Instagram, was on the News Feed team:
“We really try to not express any editorial judgment,” says Adam Mosseri, the News Feed product director. “We might think that Ferguson is more important than the Ice Bucket Challenge, but we don’t think we should be forcing people to eat their vegetables, even though we may or may not think vegetables are healthy.”
According to Mosseri, Facebook wants News Feed to do three things: “One is connecting you to your friends and family—that’s the fundamental value that Facebook is based on,” he says. “The second is to inform about things that you might be interested in, whether it’s news or sports scores or how to wash your jeans. And the third thing is to entertain you, whether it’s to make you laugh or show you videos or trailers.” By pigeonholing content in those buckets, he says, Facebook is able to judge which kinds of stories its raters welcome, and why …
One of the big questions, of course, is whether this study would indicate that people are yearning for more meaningful stories—the kale smoothies. So far, the answer is no. “If anything it’s the inverse,” says Mosseri. “When we asked what are the best stories, ones people said they really want to see, the highest percentage of impact type is a strong emotional reaction. People really want to see stuff that drives a laugh or makes them feel happy, not necessarily information that’s super valuable.” On a story-by-story basis this instant-gratification impulse may even overwhelm a desire to hear news about people in one’s social graph … “If you ask people about each story individually, they’re going to naturally rate emotional reactions really highly, which is what we’re seeing,” he says.
James writes, “Self-driving cars are going about it all the wrong way. Why not replace the driver with a remote professional driver?”
Thanks for the question, James. Remote pilots may work for drones, but I don’t see remote driving as a solution for the rest of us. For one thing, a “remote professional driver” has to be paid, so this makes no economic sense. Every time I get in my car, I need to hire a driver? This would also raise safety issues. There’s this thing called latency, which means that if a moose ambles into your path, the professional driver in some studio—who may be distracted by listening to a podcast anyway—might not be able to react in time to avoid said moose. And what if the connection goes down? So … no. We’ll have to wait until our cars get smarter.
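To put a rough number on that latency point, here’s a back-of-the-envelope sketch (my own illustration, with assumed speeds and delays, not figures from any study): even before the remote driver reacts at all, the car keeps covering ground while the video feed goes out and the steering command comes back.

```python
# Rough sketch (hypothetical numbers): how far a car travels during the
# network round trip alone, before the remote driver can even respond.

def latency_distance_ft(speed_mph: float, round_trip_latency_s: float) -> float:
    """Feet covered while the video feed goes out and the command comes back."""
    feet_per_second = speed_mph * 5280 / 3600  # convert mph to ft/s
    return feet_per_second * round_trip_latency_s

# Assume 65 mph and an optimistic 0.25-second round trip on a good connection.
print(round(latency_distance_ft(65, 0.25), 1))  # ~23.8 ft, more than a car length
```

And that’s before you add the human’s own reaction time and the car’s braking distance.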
But I do think your idea has some merit as part of a system where cars don’t have steering wheels and everyone is a passenger. When something goes wrong, or the car gets itself into a jam that its AI can’t figure out, maybe there can be a panic button where a remote driver can take over, just long enough to direct it to a high-tech repair center. But in the meantime, don’t fall asleep at the wheel of your “autopiloted” Tesla.
You can submit questions to mail@wired.com. Write ASK LEVY in the subject line.
So, Greenland, you think that your Arctic-ness protects you from hurricanes? Hurricane Larry had other ideas, with 100-mile-an-hour winds and an early September blizzard. But that won’t stop you from melting.
Apple’s fall event unveiled new iterations of iPads, the Apple Watch, and iPhones. But the coolest thing was a feature on the latter called Cinematic Mode, which turns anyone into a movie auteur.
Greg LeMond, who raced up many hills while winning three Tours de France, now wants to help you race up hills—on a spiffy ebike he’s designing.
The next frontier in brain hacking is “neurograins,” tiny sensors spread like salt on your cortex. But getting them inside your skull is no picnic.
Update your devices. Now.