She risked everything to expose Facebook. Now she’s telling her story.

Sophie Zhang, a former data scientist at Facebook, revealed that it enables global political manipulation and has done little to stop it.

Sophie Zhang
Christie Hemm Klok
July 29, 2021

The world first learned of Sophie Zhang in September 2020, when BuzzFeed News obtained and published highlights from an abridged version of her nearly 8,000-word exit memo from Facebook.

Before she was fired, Zhang was officially employed as a low-level data scientist at the company. But she had become consumed by a task she deemed more important: finding and taking down fake accounts and likes that were being used to sway elections globally.

Her memo revealed that she’d identified dozens of countries, including India, Mexico, Afghanistan, and South Korea, where this type of abuse was enabling politicians to mislead the public and gain power. It also revealed how little the company had done to mitigate the problem, despite Zhang’s repeated efforts to bring it to the attention of leadership.

“I know that I have blood on my hands by now,” she wrote.

On the eve of her departure, Zhang was still debating whether to write the memo at all. It was perhaps her last chance to create enough internal pressure on leadership to start taking the problems seriously. In anticipation of writing it, she had turned down a nearly $64,000 severance package that would have involved signing a nondisparagement agreement. She wanted to retain the freedom to speak critically about the company.

But it was just two months before the 2020 US election, and she was disturbed by the idea that the memo could erode the public’s trust in the electoral process if prematurely released to the press. “I was terrified of somehow becoming the James Comey of 2020,” she says, referring to the former FBI director who, days before the 2016 election, told Congress the agency had reopened an investigation into Hillary Clinton’s use of a private email server. Clinton went on to blame Comey for her loss.

To Zhang’s great relief, that didn’t happen. And after the election passed, she proceeded with her original plan. In April, she came forward in two Guardian articles with her face, her name, and even more detailed documentation of the political manipulation she’d uncovered and Facebook’s negligence in dealing with it.

Her account supplied concrete evidence to support what critics had long been saying on the outside: that Facebook makes election interference easy, and that unless such activity hurts the company’s business interests, it can’t be bothered to fix the problem.

In a statement, Joe Osborne, a Facebook spokesperson, vehemently denied these claims. “For the countless press interviews she’s done since leaving Facebook, we have fundamentally disagreed with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform,” he said. “We aggressively go after abuse around the world and have specialized teams focused on this work. As a result, we’ve already taken down more than 150 networks of coordinated inauthentic behavior … Combatting coordinated inauthentic behavior is our priority.”

By going public and eschewing anonymity, Zhang risked legal action from the company, harm to her future career prospects, and perhaps even reprisals from the politicians she exposed in the process. “What she did is very brave,” says Julia Carrie Wong, the Guardian reporter who published her revelations.

After nearly a year of avoiding personal questions, Zhang is now ready to tell her story. She wants the world to understand how she became so involved in trying to protect democracy worldwide and why she cared so deeply. She’s also tired of being in the closet as a transgender woman, a core aspect of her identity that informed her actions at Facebook and after she left.

Her story reveals that it is really pure luck that we now know so much about how Facebook enables election interference globally. Zhang was the only person fighting this form of political manipulation, and it wasn’t even her job. She had discovered the problem because of a unique confluence of skills and passion, and then taken it upon herself, driven by an extraordinary sense of moral responsibility.

To regulators around the world considering how to rein in the company, this should be a wake-up call.

Zhang never planned to be in this position. She’s deeply introverted and hates being in the limelight. She’d joined Facebook in 2018 after the financial strain of living on part-time contract work in the Bay Area had worn her down. When she received Facebook’s offer, she was upfront with her recruiter: she didn’t think the company was making the world better, but she would join to help fix it.

“They told me, ‘You’d be surprised how many people at Facebook say that,’” she remembers.

It was easier said than done. Like many new hires, she’d joined without being assigned to a specific team. She wanted to work on election integrity, which looks for ways to mitigate election-related platform abuse, but her skills didn’t match their openings. She settled for a new team tackling fake engagement instead.

Fake engagement refers to things such as likes, shares, and comments that have been bought or otherwise inauthentically generated on the platform. The new team focused more narrowly on so-called “scripted inauthentic activity”—fake likes and shares produced by automated bots and used to drive up someone’s popularity.

In the vast majority of such cases, people were merely obtaining likes for vanity. But half a year in, Zhang intuited that politicians could do the same things to increase their influence and reach on the platform. It didn’t take long for her to find examples in Brazil and India, which were both preparing for general elections.

In the process of searching for scripted activity, she also found something far more worrying. The administrator for the Facebook page of the Honduran president, Juan Orlando Hernández, had created hundreds of pages with fake names and profile pictures to look just like users—and was using them to flood the president’s posts with likes, comments, and shares. (Facebook bars users from making multiple profiles but doesn’t apply the same restriction to pages, which are usually meant for businesses and public figures.)

The activity didn’t count as scripted, but the effect was the same. Not only could it mislead the casual observer into believing Hernández was more well-liked and popular than he was, but it was also boosting his posts higher up in people’s newsfeeds. For a politician whose 2017 reelection victory was widely believed to be fraudulent, the brazenness—and implications—were alarming.

But when Zhang raised the issue, she says, she received a lukewarm reception. The pages integrity team, which handles abuse of and on Facebook pages, wouldn’t block the mass manufacture of pages to look like users. The newsfeed integrity team, which tries to improve the quality of what appears in users’ newsfeeds, wouldn’t remove the fake likes and comments from the ranking algorithm’s consideration. “Everyone agreed that it was terrible,” Zhang says. “No one could agree who should be responsible, or even what should be done.”

After Zhang applied pressure for a year, the network of fake pages was finally removed. A few months later, Facebook created a new “inauthentic behavior policy” to ban fake pages masquerading as users. But this policy change didn’t address a more fundamental problem: no one was being asked to enforce it.

So Zhang took the initiative herself. When she wasn’t working to scrub away vanity likes, she diligently combed through streams of data, searching for the use of fake pages, fake accounts, and other forms of coordinated fake activity on politicians’ pages. She found cases in dozens of countries, most egregiously in Azerbaijan, where the pages technique was being used to harass the opposition.

But finding and flagging new cases wasn’t enough. Zhang found that in order to get any networks of fake pages or accounts removed, she had to persistently lobby the relevant teams. In countries where such activity posed little PR risk to the company, enforcement could be put off repeatedly. (Facebook disputes this characterization.) The responsibility weighed on her heavily. Was it more important to push for a case in Bolivia, with a population of 11.6 million, or in Rajasthan, India, with a population close to 70 million?

Then, in the fall of 2019, weeks of deadly civil protest broke out in Bolivia after the public contested the results of its presidential election. Only a few weeks earlier, Zhang had indeed deprioritized the country to take care of what seemed like more urgent cases. The news consumed her with guilt. Intellectually, she knew there was no way to draw a direct connection between her decision and the events. The fake engagement had been so minor that the effect was likely negligible. But psychologically and emotionally, it didn’t matter. “That’s when I started losing sleep,” she says.

Whereas someone else might have chosen to leave such a taxing job or perhaps absolve herself of responsibility as a means of coping, Zhang leaned in, at great personal cost, in an attempt to singlehandedly right a wrong.

Over the year between the events in Bolivia and her firing, the exertion sent her health into sharp decline. She already suffered from anxiety and depression, but both grew significantly, and dangerously, worse. Always a voracious reader of world news, she could no longer distance herself from the political turmoil in other countries. The pressure pushed her away from friends and loved ones. She grew increasingly isolated and broke up with her girlfriend. She increased her anxiety and antidepressant medication until her dose was six times what it had been.