India’s importance for Facebook is a double-edged sword: Whistle-blower Sophie Zhang

ILLUSTRATION BY SHAGNIK CHAKRABORTY
07 June, 2022

Sophie Zhang—a former data scientist at Facebook, which has since been rebranded as Meta—joined the organisation in January 2018 and worked on a team addressing fake engagement on the platform. During this time, Zhang worked outside her job profile and uncovered several inauthentic networks across the world, which were being used to influence political outcomes in numerous countries, including India, Azerbaijan, Honduras, Afghanistan, Ukraine and Taiwan. Zhang’s relentless work, in not just identifying such inauthentic behaviour but also pushing for institutional action against it, initially won her praise from her colleagues. By mid-August 2020, though, Zhang was fired for poor performance, as a result, The Guardian reported, “of her spending too much time focussed on uprooting civic fake engagement and not enough time on the priorities outlined by management.” She turned down a $64,000 severance package from the organisation to be able to speak about her revelations.

In interviews conducted in November 2021 and May 2022, with Nikita Saxena, a contributing writer at The Caravan, Zhang recounted her repeated attempts to initiate action against a network associated with the BJP MP Vinod Sonkar; the effect of inauthentic behaviour on democracies; and the disproportionate responsibility that she believed Facebook’s systems could put on individuals. Facebook’s response to this interview has been added below.


Nikita Saxena: Could you talk about your decision to share the documentation with Indian news publications and reveal the identity of BJP MP Vinod Kumar Sonkar, after you waited to depose before the Lok Sabha?
Sophie Zhang: I don’t think I can tell you too much about my interactions with the Lok Sabha, because from what I understand, it is protected legally under parliamentary privilege. But it’s been publicly reported that I offered this documentation to the Lok Sabha more than half a year ago at this point. It’s been publicly reported that they accepted it and examined it. It’s been publicly reported that they voted unanimously to request my testimony. That’s essentially the state it has been in for the last half year.

And so, that is why I have chosen, once again, to take the next step forward and take this issue directly to the Indian people, via the press.

Saxena: Could you tell us about your work at Facebook?
Zhang: I worked at Facebook from January 2018 to September 2020. I was a data scientist, I was paid to look at numbers, figure out what the numbers meant, and tell people what they meant. I was on the fake-engagement team. The team has officially changed its name since, but it was fake-engagement for most of my time there.

By fake, I mean, for instance, inauthentic accounts pretending to be people who don’t exist; accounts of real people that have been hacked and taken over; or accounts whose owners have been persuaded to hand over their credentials.

When the average person starts hearing about this, they immediately begin thinking about IT cells, about fake political activities, perhaps Russian activity, perhaps foreign interference. The thing to realise is that most people are not politicians, and most activity on the internet is not political.

Unfortunately, a practice that is common in India, and much of the global south, is known as Autolikers—in which people are asked to go through a series of steps in order to receive free Likes. What many do not realise is that by going through these steps they are handing over their credentials to shady third parties. These outsiders gain personal control of the account and can use it however they wish. Eventually, the credentials and access may, for instance, be sold to an IT cell.

People engage in this practice thinking they are innocuously gaining more popularity, not realising they are contributing to the degradation of Indian democracy and the erosion of civic discourse.

My role was defined widely, but what I was asked to focus on was the source of Autolikers and script activity. It was not my job to deal with, for instance, political IT cells. These may sound very similar to the layperson, but they are very different in terms of scope, sophistication and importance.

Everything you have heard about my work, including in India: none of that was what I was supposed to be doing as my day job. That was work I was doing in my spare time; it was officially in my area, but not what I was expected to do.

Saxena: Could you give us a sense of some of the networks for which you were trying to push for action, across the world, and in India?
Zhang: I worked on something like three dozen different countries; India was one of them. I used Google Translate and Wikipedia because I knew nothing about these countries. Usually, I would raise the situation and prioritise it based on size, importance, scope, sophistication et cetera. Whether it was acted on depended entirely on the others who responded to it, and on how much I chose to yell at them to work on it.

The most concerning results I found were in Honduras and Azerbaijan, which are small countries, but there I caught the governments of those countries red-handed. The governments were essentially paying people to become an IT cell, without even hiding their activity. I thought, “Okay, this is so obviously bad, I’ll just hand it over, someone else can take care of it, and I can get back to my actual job.” Instead, it took almost a year for Honduras, and more than a year for Azerbaijan. But in the worst-case scenario, action was never taken. That, of course, occurred with the network of BJP accounts.

The difference between Azerbaijan and Honduras and India is that India is a very large and important nation. Facebook ascribed an importance to India that it did not ascribe to Azerbaijan or Honduras. On the flip side, because of that importance, there was a lot more political interference. There were many more concerns expressed against taking action regarding accounts connected to Indian politicians, or benefitting them, than against accounts associated with the governments of Honduras or Azerbaijan. The importance is essentially a double-edged sword.

Saxena: Could you talk about the categories of behaviours that these accounts were engaged in? For instance, you have differentiated between inauthentic behaviour and what is commonly known as misinformation.
Zhang: So, misinformation has absolutely nothing to do with fake accounts and inauthenticity. To the average person this may sound similar, but they’re completely different. Suppose someone on the internet says, “Islam and Hinduism are the same religion.” This is obvious misinformation. It depends only on the words; it does not depend on who is saying it.

Meanwhile, inauthenticity is entirely a function of who is saying it. Suppose I sent 10,000 fake accounts onto the internet to say, “Hindus are great people, there is nothing wrong with Hindus.” There is nothing wrong with that statement, but it is still being made by fake accounts, and therefore Facebook would be correct to take the accounts down. It has nothing to do with what the accounts are saying; it has everything to do with the accounts that are being used. Just as ballot-box stuffing is illegal and bad regardless of who you are stuffing boxes for.

I made the analogy to ballot-box stuffing because the voice of the people cannot be heard if it is drowned out by fake masses. With regard to misinformation and hate speech, there are genuine questions: Should this be taken down? Where should we draw the line? Do we want to censor voices? How do we protect freedom of speech?

People have a right to freedom of speech, random Facebook accounts do not. It is necessary to protect discourse from fake accounts, from letting politicians control hundreds of fake voices, just like it’s necessary to protect democracy from politicians stuffing ballot boxes with hundreds of fake votes.

Saxena: Between late-2019 and early-2020, you found five separate networks of inauthentic accounts across the political spectrum in India. These were benefitting the Congress, the Aam Aadmi Party and the Bharatiya Janata Party?
Zhang: In most cases, it is very difficult to ascribe responsibility. We know who benefits, but we don’t know who is responsible. And I’m going to use an example: I’m sure The Caravan has a Facebook page or something.

Suppose, tomorrow, The Caravan suddenly begins getting 10,000 fake likes on each of its posts. If someone reports this, people will probably assume that The Caravan was responsible. But in this case, there is no direct evidence for that. And there are a lot of possible explanations. Perhaps the fake activity was from someone who sent it to the wrong address. Perhaps the ID was off from yours by a digit, and they typoed it. Perhaps someone did this for themselves and said, “To cover my tracks, I should send this to random people—that way, if I get caught, I can say: Look, they got it too. I had nothing to do with this, either.” Maybe The Caravan hired a social media manager, had no idea what this social media manager was doing, and she decided this was a great idea. Maybe The Caravan had some random supporter who thought, “I want to help The Caravan, what should I do? Well, obviously this will be a great idea.” Or maybe a competitor at a different news outlet is about to call up the Indian police and say, “The Caravan benefits from an IT cell,” and is framing you.

So, in most cases, it is unfortunately impossible to know who is responsible. I have tried very hard to focus on the cases in which I do know who is responsible. So, when I talk about most of the fake-account networks, I talk about who they were benefitting, not who they were from, because in most cases we don’t know who they were from.

Saxena: Some of the networks in India were found in the run-up to the Delhi elections in February 2020? 
Zhang: Yes, one of the networks was acting to manipulate discourse in the run-up to the Delhi elections. That was the pro-AAP network. These pro-AAP accounts were choosing to support the AAP and oppose the BJP in the Delhi election. But they were doing it in a very specific way. They were all choosing to pretend to be pro-BJP people who had decided to support the AAP in the Delhi elections. They were posting slogans like, “I voted for Modi last year to stop corruption, now I am voting for Kejriwal in Delhi to stop corruption. Modi is getting it done in India, now Kejriwal needs to get it done in Delhi.”

People are more likely to believe others with whom they have some common ground. If you are a BJP supporter, you will be more easily convinced to change your vote by a BJP supporter than by an INC supporter. This is one tactic that can be used by fake accounts: by intentionally claiming fake common ground with the target audience, you can increase the appeal of the message.

Saxena: Once you identified these networks, some of them were taken down. You began encountering problems with the decision to take down one particular network associated with a BJP leader?
Zhang: I was able to have four of the five networks taken down. For the fifth one, we had permission to take it down and they were about to do so, but they stopped because they realised it was linked to a sitting member of the Lok Sabha. From then on, I could not get a straight answer as to what should be done about this.

I constantly asked for an answer. For instance, I was acting to protect the Delhi elections from a pro-AAP network of fake accounts—a network that was benefitting the AAP. The argument I always used internally was, “If you are taking down this pro-AAP network, then we should also really take down this network that is supporting the BJP in this other area, otherwise we will open ourselves up to accusations that we are politically biased, that we take down these accounts but not others.” Of course, this became a self-fulfilling prophecy. I am making that accusation now, but we didn’t know that at the time.

If you ask someone something and they don’t respond, maybe they didn’t hear you. You send them an email, maybe they didn’t see the email. You send them more emails, maybe they didn’t see those either. You’re in a conversation with them, you are talking about something else, you bring up this thing, they go back to the other thing. This keeps going on. At some point it becomes clear that somewhere, something very suspicious is going on. I can try and make my best judgement—my best judgement is that it is extraordinarily suspicious.

To me, there are two possible explanations. The first one is, people were incompetent and they actually forgot and ignored everything. The hypothesis that seems more likely to me is that people were reluctant to answer because they were in a difficult position—they couldn’t say yes, and they didn’t want to say no, because of the terrible optics of saying no.

Saxena: This reluctance was concerning to you since this was an account connected to a politician, or someone associated with them?
Zhang: When I say I caught the politician, I mean that I caught someone with access to the politician’s personal Facebook account. I’m using the politician himself as a shorthand for that phrase.

It was very concerning to me, because ultimately, we cannot have a society where we have one set of rules for the common people and impunity for the influential. That is a hallmark of dictatorships, such as the People’s Republic of China. There is an infamous quote by the former president of Peru, Oscar Benavides: “For my friends, everything; for my enemies, the law.”

I don’t know ultimately what happened in this case. It is no secret that Facebook has been criticised in many respects for its policy in India and the decisions it has made there. I have heard similar criticisms from many other people within the company.

Saxena: So, you were not getting any responses in the case of the networks associated with the BJP MP?
Zhang: I did not get significant responses. I repeatedly asked for responses, and for the most part, did not receive one. In contrast, when I caught the government of Azerbaijan red-handed, it was very silly, because the Eastern European team said that it was the Middle East team’s problem, and the Middle East team said that it was the Eastern European team’s problem—the teams could not agree on who owned it. Eventually, they figured out that Azerbaijan was officially in the Turkey region, which was more interested in the actual nation of Turkey and did not care about Azerbaijan.

But there were people at Facebook who cared about the relationships with Indian politicians. It was their job to do so.

Saxena: In a sense, the voice of the Indian voter seems absent from some of the policy concerns?
Zhang: That is by design. At Facebook, the people charged with policy decisions—what the rules on the platform are, what action is taken, whether action is taken or not—these are the same people who are charged with keeping good relationships with governments, with lobbying politicians. This creates a natural conflict of interest.

If a judge is called upon to try a case, and he finds, “Oh, wait, I go out for weekly lunches with the defendant, he’s a good friend of mine,” the judge will be required to recuse himself. At Facebook, this is a feature, not a bug. This is unusual, as I understand it, even among technology companies. This is the way in which Facebook departs from other companies: it makes political interference intentional, integrated into how Facebook operates, and widely accepted.

Saxena: In your testimony to the British parliament, you said that when researchers pointed out fake accounts that were linked to political figures, these were far less likely to be taken down than those not linked to political actors.
Zhang: This was my personal experience. It certainly created a perverse effect. Ultimately, the rule of law should be blind—justice should not depend on who the defendant or recipient is.

That creates an incentive for important people to conduct violating activity without hiding the connections to themselves, which is the exact opposite of how a society of laws should work. A democratic society cannot afford to have one set of laws for the influential and powerful, and another for everyone else, but that is what happens at Facebook.

Saxena: Could you talk about the larger harms of such inauthentic networks on a democracy?
Zhang: I think it’s very telling that one Indian politician chose to use his valuable time doing this; that multiple political parties benefited from such networks; and that multiple national governments worldwide—Honduras and Azerbaijan, in particular—were involved in this activity. I don’t know precisely what benefit they gained from this. What I can say for certain is that the people best placed to judge whether this was impactful believed it was very valuable for them.

Unfortunately, Indian politics seems to be undergoing an IT-cell arms race, in which each political party feels that it cannot unilaterally disarm or its opponents will gain an advantage. Further complicating the situation is that Facebook appears to feel comfortable taking down IT cells linked to other political parties, but is hesitant with regard to the BJP, which creates an uneven playing field.

Democracy is ultimately preserved by the approval and consent of the people. And that is extremely difficult to sustain when the voices of the people are increasingly drowned out by a hired crowd dedicated to a politician. In the real world, if you want a hundred thousand supporters to show up to a rally, they have to be a hundred thousand actual people. In contrast, in the digital world, a thousand people can get away with posing as a hundred thousand without too much difficulty. So, what we are finding is that dictatorships worldwide are using social media to gain advantages that dictatorships in the past did not have.

Saxena: You have spoken about the disproportionate responsibility such a system places on individuals within organisations. 
Zhang: When I talk about my work to protect the Delhi elections from the pro-AAP network of fake accounts, that was not my official job. That was work that I had been officially ordered to stop already, under the premise that it was not important to Facebook. There had been orders to stop doing it and I had argued that I was finishing a successful investigation. Because this was work that I was doing in my spare time, it was essentially up to myself to decide what was important enough that I needed to yell about it, to make it a priority.

There is an expression, “judge, jury and executioner,” for someone who makes a decision and tries a case unilaterally with large consequences—I decided I would never be that. I would always make sure that there was someone else to confirm my investigation; someone else to decide whether to act; someone else to decide to take the account down.

But even though there were other people investigating and other people deciding to act, the decision was still effectively on my shoulders. Because whenever people investigated my results, they always said, “Yes, this is bad.” And whenever I got a straight answer from people about whether we should act or not, it was almost always, “Yes, we should act.” So, whether something happened was entirely a function of how loudly I yelled for it to happen.

I chose to prioritise India and what I found there. There were countries that I did not prioritise because I was spending time on India instead. I can only apologise profusely to those countries. That should never have been my decision, but it was nevertheless.

Saxena: This is not the kind of decision any individual within an organisation this large should be in a position to make, since the process then depends entirely on how these individuals decide to take it forward?
Zhang: The shortest time it took to get a response was less than a day, in the case of Poland. I raised it the night after Christmas. By the time I woke up the next morning, the Polish employee I had contacted, who was naturally very concerned about a politician running fake accounts in his country, had taken the fake accounts down without asking anyone. Policy, I think, was pretty upset with him for doing this. They were like, “You should have checked with us first. This should have been our decision. He is important, what if he complains.”

There are supposed to be processes, but oftentimes it is up to the decisions of individuals. When this happens, it’s a reflection of a broader failure: that the system is not working.

Saxena: And such a case-by-case approach, as you have repeatedly pointed out, is not ideal, especially in a political context.
Zhang: I fought during my time at Facebook for getting official support and getting an official framework for this. Instead, I was told to stop or I would be fired, so Facebook made its own decision about how important it was.

At the end of the day, Facebook is a company whose goal is to make money. And to the extent that they care about this—about democracy and discourse in India—it is because Mark [Zuckerberg] is human and needs to sleep at night, and also because letting bad things happen affects his ability to make money.

For inauthentic activity, there is also a specific, additional factor that makes public pressure ineffective at getting Facebook to respond. The purpose of inauthentic activity is to not be seen, to pretend to be real. Ultimately, Facebook is really the only organisation that has the tools to find these inauthentic networks. But Facebook has no incentive to do so.

Imagine a world in which the Bhopal chemical disaster happened, but in this world, only Union Carbide knew who did it, and only Union Carbide had any hope of knowing who did it. In that case, it would be extremely important for someone from within the company to come forward, and tell the world what they know. That is precisely what I am trying to do right now.

Saxena: What pushed you to prioritise the inauthentic networks you found in India?
Zhang: I prioritised based on a combination of factors: the size and scale of the activity; the prominence of the beneficiary; news articles regarding the country. What I found in India was larger than what I found in many countries, which is not surprising because India is, of course, a very large country.

Sometimes, I would question my decisions afterwards. For instance, in the fall of 2019, while I prioritised India, I chose not to prioritise activity that I found in Bolivia, which was very small in volume and was supporting the Bolivian opposition candidate. Because of its small size, I decided that it was not important, and I stand by that decision. But, with that said, soon after the Bolivian presidential election, there were mass protests in Bolivia alleging that the election was rigged. The government was overthrown in what has been called either a coup d’état or a popular uprising. And so, I can’t say I would have made the same decision if I had looked ahead into the future and known what would happen.

Saxena: What is the role that local organisations from different countries play in pushing for the identification or monitoring of such content?
Zhang: A lot of the investigations that Facebook does are in response to outside groups. For instance, perhaps a news agency goes to Facebook and says, “This is going on.” Perhaps an opposition group or an NGO says that something unusual is going on. This approach naturally works better in countries that have robust internal systems. In dictatorships, such as Azerbaijan, there is essentially no opposition group, no NGOs, no groups to hold the government responsible. And so, having employees and partnerships can be beneficial in that regard.

On the flip side, it can also open up the company to more political interference. For instance, recently in Russia, the Russian government succeeded in pressuring multiple social media companies to take down content attempting to organise opposition supporters on how to vote. And the Russian government did this, as I understand it, by threatening to arrest employees of those companies who were physically present within the Russian federation.

So, this is a tool that is very vulnerable to being exploited by dictatorships and authoritarian governments throughout the world. And I am sorry to say that the Indian government has also used this.

Saxena: Were there other researchers who were also working on co-ordinated inauthentic behaviour in India?
Zhang: There are people who work on co-ordinated inauthentic behaviour as a full-time job. These are highly trained specialists; they are the equivalent of India’s R&AW or IB. Actually, many of them, I am told, were former specialists at intelligence agencies such as the CIA [Central Intelligence Agency] or the NSA [National Security Agency].

These people don’t focus on individual countries; they focus globally, worldwide. They put out the monthly reports on co-ordinated inauthentic behaviour that you may have heard of. You may have also noticed that the activity I had taken down in India did not show up in any of these reports. That is because it was easier for me to take it down for inauthentic behaviour, rather than co-ordinated inauthentic behaviour.

Saxena: What are the specific questions that Facebook needs to answer with reference to the revelations that you’ve brought to light regarding India?
Zhang: Whenever people ask Facebook for responses, it likes to focus on generalities because it knows it cannot afford to answer on the specifics. I am expecting them to give you a generic answer like, “We fundamentally disagree with Sophie on this, we have invested a large amount in India, India is important to us, we have taken down these many fake account networks in India.” They will refuse to actually answer the question, because they know that they cannot afford to engage on it. But companies can get away with saying this to reporters, who credulously accept these statements.

If you read The Guardian article, you would see that they changed their statement. First, they said, “This did not happen, we took down the fake accounts associated with the BJP politician’s account in December 2019.” They were forced to retract this statement after The Guardian showed them documentation proving that this was not the case—documentation that I have also offered to the Lok Sabha. Next, they offered a different statement. They said a different team took this down in May of 2020, and that they had taken down some of the fake accounts.

They have phrased this in a way that I cannot directly dispute, because according to them a different team did it. What I can say is that, according to that statement, it took Facebook an extra half year to take down this network, and, by its own admission, it only took down some of the accounts. And so, I pointed that out to other journalists, and then Facebook told them, “No, that is not correct,” and that it didn’t take any extra time.

So, my question to Facebook right now is, “What actually is your current statement, and why did you change it so many times? Did it take extra time? If so, why did you lie at first? If not, why did you lie later? Why did it take extra time? Why did you change your story so many times, and why can you not give the Indian people a straight answer on this question?”

 

In response to a questionnaire sent in March 2022 a Meta Spokesperson said, “We fundamentally disagree with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform. We aggressively go after abuse around the world and have specialized teams focused on this work. As a result, we’ve already taken down more than 150 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in countries around the world, including those in India. Combatting coordinated inauthentic behavior is our priority. We’re also addressing the problems of spam and fake engagement. We investigate each issue before taking action or making public claims about them. We apply our policies uniformly without regard to anyone’s political positions or party affiliations. The decisions around integrity work or content escalations cannot and are not made unilaterally by just one person; rather, they are inclusive of different views from around the company, a process that is critical to making sure we consider, understand and account for both local and global contexts.”

In response to a question about the specialist team involved in the review of the accounts from the network benefitting Sonkar in May 2020, according to its claims to The Guardian, and whether this review and enforcement had been recorded in the internal task-management system, the spokesperson provided a single-line response: “Every content enforcement decision is taken by an XFN team including subject matter experts.” XFN is shorthand for cross-functional. When asked if Zhang was ever asked to stop the work she was doing on inauthentic behaviour related to political actors, the spokesperson said, “We do not comment on the role or scope of work of any specific current or ex-employee.”

 

This interview has been edited and condensed.

This interview forms a part of a series on the Facebook Files.