It all began in February 2016, when Mark Zuckerberg sent a memo to all Facebook employees about a behavior that had caught his attention. The message concerned the notes and scribbles posted on the walls of the company’s Menlo Park headquarters. Staffers are encouraged to write on these walls, and on at least a couple of occasions, someone had crossed out the words “Black Lives Matter” and replaced them with “All Lives Matter.” In the memo, Zuckerberg said that such behavior had to stop.
“‘Black Lives Matter’ doesn’t mean other lives don’t,” he wrote. “We’ve never had rules around what people can write on our walls,” the memo continued, but “crossing out something means silencing speech, or that one person’s speech is more important than another’s.” The incident was being investigated, he said.
Issues of race and politics were being debated with increasing heat across the US. Donald Trump had just won the South Carolina primary, lashed out at the Pope over immigration, and drawn the enthusiastic support of David Duke. Hillary Clinton had defeated Bernie Sanders in Nevada, only to have a Black Lives Matter activist interrupt her speech and bring up racial remarks she had made two decades earlier. Meanwhile, on Facebook, a group called Blacktivist was stoking tension by spreading messages like “American economy and power were built on forced migration and torture.”
News of this kind drew attention, if not tension, so when Zuckerberg’s memo circulated, Benjamin Fearnow, a young contract worker, thought it was newsworthy. He took a screenshot on his laptop and forwarded the image to Michael Nuñez, who at the time worked at the tech-news site Gizmodo. Nuñez soon published a story about Zuckerberg’s message.
A week after the memo was sent, Fearnow took a screenshot of another email forwarded to employees, this one asking them to submit questions for Zuckerberg ahead of an all-hands meeting. One of the most up-voted questions read, “What responsibility does Facebook have to help prevent President Trump in 2017?” This time, Fearnow captured the email with his phone.
A recent graduate of the Columbia Journalism School, Fearnow worked at Facebook’s New York office, assigned to an area called Trending Topics, a feed of popular news subjects that popped up when someone opened Facebook. The feed was generated by an algorithm but moderated by 25 people with backgrounds in journalism. If the word “Trump” was trending, for instance, they used their news judgment to identify which stories about him had the most value. If a hoax site published fake news that went viral, they made sure it didn’t reach the feed. And if highly important news broke, like a mass shooting, but Facebook’s algorithm was slow to pick it up, they would inject a story about it into the feed.
Although Facebook likes to say that its people love working there, Fearnow and his team didn’t feel that way. They were contract employees hired through a company called BCforward, and every workday reminded them that they weren’t really part of Facebook. Being contractors made them aware that their jobs were doomed from the start. People who work at tech companies know the pattern: these companies want as little human help as possible, because people don’t scale. They can’t hire tons of them, because time and again humans have proved messier than algorithms. Employees need bathroom breaks, they need health insurance, and, most vexing of all, they can talk to the press. It didn’t take long for Fearnow and his team to assume that an algorithm would eventually take over the whole job and render them unnecessary.
It was a Friday when Fearnow took that last screenshot. The next day, after sleeping in, he woke to about 30 meeting notifications from Facebook. He replied from his phone that it was his day off, but was nonetheless told to join in 10 minutes. Shortly after, he was on a videoconference with three Facebook employees, including Sonya Ahuja, the company’s head of investigations. Ahuja asked whether he had been in touch with Nuñez; Fearnow denied it. But Ahuja said she had their messages on Gchat, which Fearnow had assumed Facebook couldn’t access. In that same meeting, Fearnow was fired. “Please shut your laptop and don’t reopen it,” Ahuja instructed him.
That same day, Ahuja reached out to another Trending Topics employee, Ryan Villarreal, who worked on the same team as Fearnow and had, several years earlier, shared an apartment with Nuñez and Fearnow. Villarreal said he hadn’t taken any screenshots, and hadn’t leaked anything. He had, however, clicked “like” on a story about Black Lives Matter, and he was friends with Nuñez on Facebook. “Do you think leaks are bad?” Ahuja wanted to know, according to Villarreal. He was fired, too. The last he heard from BCforward was an email about a $15 payment the company had made to him, which it wanted back.
The firings of Fearnow and Villarreal pushed the Trending Topics team to the edge. Nuñez kept digging for dirt, and soon published a story about the internal poll gauging Facebook employees’ interest in stopping Trump. Then, in May, he published a story based on conversations with a former Trending Topics employee, headlined “Former Facebook Workers: We Routinely Suppressed Conservative News.” The piece portrayed the team as something out of a Fox News fever dream: a bunch of biased curators who “injected” liberal stories and “blacklisted” conservative news in the trending feed. Soon after publication, the piece was picked up by sites such as the Drudge Report and Breitbart News.
The virality of Nuñez’s post was only the surface of what Facebook would have to deal with over the next two years. The fallout from the Trending Topics battle contributed greatly to the social media giant’s most turbulent period, setting off a chain of events that distracted and confused the company even as larger disasters engulfed it.
The following stories come from various Facebook employees who insisted that their real names not be used. Though the tales vary, they share a plot: a giant company, its CEO, and a techno-optimism crushed as they came to understand the many ways their platform could be used for ill.
In some countries, Facebook has become the internet. The platform expanded fast and effectively, from a tool for connecting students at Harvard, to a bridge linking one school to another, to a global network linking the whole world. Facebook has become the login for many sites. Its Messenger app competes with email and texting. The social network is also the place to share your thoughts, or to let friends and family know you’re safe after an earthquake.
Facebook moves fast and fearlessly. The company is determined to stay ahead of the competition and, more important, to keep competing, growing ever larger by swallowing rivals. It wasn’t long before Facebook moved into news as well. In 2012, it wasn’t the social network that stood out as the most exciting platform for online news distribution; it was Twitter, whose influence in the news industry then outweighed Facebook’s. “Twitter was this massive, massive threat,” according to a former Facebook executive who was heavily involved in the company’s decision-making at the time.
Facebook couldn’t buy Twitter, so it went with the strategy it usually employs against competitors it can’t lure into selling: copy, then crush. Zuckerberg decided to adjust the News Feed to fully integrate news, with tweaks so that posts showed headlines as well as author bylines. Facebook employees then talked to journalists about how they could reach bigger audiences through the platform. In 2013, Facebook doubled its share of traffic to news sites, pushing Twitter into a downward slope. By the middle of 2015, it had surpassed Google as the leading platform referring readers to publisher sites, sending them 13 times as many readers as Twitter did. That same year, Facebook launched Instant Articles, which let publishers write and share news directly on the platform. Articles built with it loaded faster and looked sharper. The catch: publishers gave up a measure of control over their content. The publishing industry agreed, and Facebook then, effectively, owned news.
“If you could reproduce Twitter inside of Facebook, why would you go to Twitter?” says the former executive. “What they are doing to Snapchat now, they did to Twitter back then.”
Facebook, however, may not have thought it through. The news industry is more complicated than it seemed. Becoming the dominant force in news requires grappling with accuracy, quality, and rules, such as the elimination of pornography and the protection of copyright. Facebook hired a few journalists and talked only minimally about what should be done. The company should have spent more time on the big questions that torment the media industry: Is it fair? Is it fact? Entering the news business, Facebook needed to be able to distinguish among news, opinion, satire, and analysis. Perhaps the company didn’t see those distinctions, or perhaps it saw itself as immune to such technicalities. After all, it was just a tech company offering a platform for all ideas.
Facebook is an open, neutral platform. This is the main philosophy the company believes in. New hires are oriented by Chris Cox, the social network’s chief product officer, who describes the company as the 21st century’s communications platform, as the telephone was the 20th century’s. Anyone familiar with Section 230 of the 1996 Communications Decency Act knows that this part of American law protects internet intermediaries from liability for what their users post. But if Facebook were to begin creating or editing the posts that appear in the News Feed, the company could lose that immunity.
Facebook has never tried to take sides among publishers. The company doesn’t want to be regulated, so it always tries to be neutral. But choosing to be neutral is still a choice. And in deciding what people see in the News Feed, whether a picture of your dog, an article from The Washington Post or the New York Post, or a hoax from the Denver Guardian, the company is making roughly the same kind of call an editor makes. Facebook’s response is that it has democratized information: users see what other people share, the company says, not what some editor in a New York tower wants them to see. Still, control over what the News Feed surfaces can fairly be counted as editorial.
As Facebook moved into news, the platform also became the connective space between readers and publishers, the place where Macedonian teens could reach American voters, and where Saint Petersburg operatives could reach their audiences, in ways Facebook had never seen before.
In February 2016, in the middle of the Trending Topics disaster, Roger McNamee noticed something strange on the platform. McNamee is one of Facebook’s early insiders, an investor who had mentored Zuckerberg through two crucial decisions: turning down Yahoo’s $1 billion offer to acquire the social network in 2006, and hiring Google executive Sheryl Sandberg in 2008. McNamee had since fallen out of touch with Zuckerberg, though he remained an investor. What he noticed, and found concerning, were memes related to the Bernie Sanders campaign.
“I’m observing memes ostensibly coming out of a Facebook group associated with the Sanders campaign that couldn’t possibly have been from the Sanders campaign,” he said, “and yet they were organizing and spreading in such a way that suggested somebody had a budget. And I’m sitting there thinking, ‘That’s really weird. I mean, that’s not good.’ ”
McNamee didn’t alert anyone at Facebook to what he was seeing in the News Feed, and little suspicious was registering on the company’s own radar. In early 2016, Facebook’s security team did notice Russian actors attempting to steal the credentials of journalists and public figures. The company forwarded the information to the FBI, but that was it; it never heard back from the government.
In the spring of 2016, Facebook worked hard to fend off accusations that it might influence the election results. Once Gizmodo published the story about Trending Topics bias, it spread rapidly, reaching millions of readers in short order. But it wasn’t the bad press that shook Facebook; it was a letter from John Thune, a Republican US senator from South Dakota. Thune is a member of the Senate Commerce Committee, which oversees the Federal Trade Commission, an agency that had been actively investigating Facebook. In the letter, Thune demanded that the company answer the accusations of bias.
Facebook responded fast. It dispatched its Washington officials to meet with Thune, and it sent a 12-page, single-spaced letter explaining that the company had conducted an investigation of Trending Topics and found the allegations in the Gizmodo story to be false.
The company also decided to put on a show, inviting a group of 17 prominent Republicans to Menlo Park. The aim was to apologize for the company’s mistakes, to get the assembled conservatives debating the regulation of the platform among themselves rather than uniting against it, and to make sure the attendees were “bored to death” by a presentation after Zuckerberg and Sandberg addressed the group.
The meeting was a success on all counts. The guests did fight among themselves, and no unified front emerged that could threaten the company. Afterward, Glenn Beck, one of the guests, wrote an admiring essay about Zuckerberg. “I asked him if Facebook, now or in the future, would be an open platform for the sharing of all ideas or a creator of content,” Beck wrote. “Without hesitation, with clarity and boldness, Mark said there is only one Facebook and one path forward: ‘We are an open platform.’”
The Trending Topics fiasco led to a genuine search for solutions on Facebook’s end. In late June, after other projects had failed to offer practical and fair answers to the company’s conundrum, Facebook announced a change: the algorithm would prioritize posts from friends and family. Adam Mosseri, the executive in charge of the News Feed, also posted a statement titled “Building a Better News Feed for You.” It revealed a rough outline of how the algorithm works, and outsiders read the message as: Facebook opposes clickbait but doesn’t favor any particular viewpoint.
Perhaps the most lasting lesson Facebook took from the Trending Topics controversy was to tread carefully in anything that touched conservative news.
Zuckerberg attended an annual conference hosted by billionaire Herb Allen in Sun Valley, Idaho, where moguls gather and make plans to buy each other’s companies. Rupert Murdoch disrupted the mood in a conversation that included Robert Thomson, the CEO of News Corp, and Zuckerberg. Murdoch and Thomson aired their long-held dismay at Google and Facebook for taking over nearly the entire digital ad market, arguing that the two tech giants had become a threat to journalism, and accused the social network of making major changes to its algorithm without consulting its media partners. The News Corp executives made clear that if Facebook didn’t offer the publishing industry better deals, they could become more public in their denunciations and lobbying. They had made things difficult for Google in Europe in the past, they conveyed, and could do the same to Facebook in the US.
Zuckerberg took this as a threat to push the government toward an antitrust investigation, or perhaps an inquiry into whether Facebook deserved its immunity from liability as a neutral platform. He knew he had to take the threat seriously, in part because he knew Murdoch’s skill in the dark arts. Murdoch could use his papers, TV networks, and other connections to amplify criticism, as he apparently had in 2007. That year, Facebook was criticized by 49 state attorneys general for failing to protect young users from sexual predators and inappropriate content. Letters from concerned parents went to Connecticut attorney general Richard Blumenthal, who led the investigation, and to the New York Times, which published a story. But according to one Facebook executive, many of the user accounts and letters describing predatory behavior were fake, and were traced back to News Corp lawyers or other employees of Murdoch, who at the time also owned MySpace, one of Facebook’s largest competitors.
“We traced the creation of the Facebook accounts to IP addresses at the Apple Store a block away from the MySpace offices in Santa Monica,” the Facebook executive says. “Facebook then traced interactions with those accounts to News Corp lawyers. When it comes to Facebook, Murdoch has been playing every angle he can for a long time.”
By the time Zuckerberg got back from Sun Valley, he wanted change. Facebook might not be in the news business, he said, but the company had to make sure there still was one. Andrew Anker, a product manager who had joined Facebook in 2015 after a career in journalism, was among those given new responsibilities. One item on his list was to figure out how publishers could make money on the platform. Anker soon met with Zuckerberg and asked to hire 60 people to work on partnerships with the news industry. The request was approved.
The plan didn’t turn out well. More people and more conversations didn’t fix the financial problems Murdoch had raised. Publishers were spending millions to create content the social network profited from, and the company wasn’t giving enough back. Instant Articles didn’t help much either, and Facebook hit a dead end trying to devise solutions to Murdoch’s concerns. And though the company grew more attuned to journalists’ worries, that awareness never extended to its own Trending Topics team. In August, everyone on the team was told their jobs were being eliminated.