Internet ‘deepfakes’ threaten truth and reality

The World

The rise of “deepfakes” — digitally doctored video or audio clips — is causing increasing concern over what is truth and reality on the internet. Deepfakes allow people to digitally alter video or audio in order to make it look or sound as if their subject is doing or saying anything the creator wants. They can mimic politicians, celebrities and everyday people to spread false information.

Congress held its first-ever hearing on deepfakes on Thursday. Democratic Rep. Adam Schiff of California painted a picture of possible “nightmarish scenarios.” For example, a state-backed actor could create a deepfake video of a political candidate accepting a bribe, with the goal of influencing an election. Or an individual hacker could claim to have stolen audio of a private conversation between two world leaders when in fact no such conversation took place.

Deepfakes can be used for other nefarious purposes as well, including targeting women. Rana Ayyub is just one victim of deepfakes. She’s a journalist based in Mumbai, and after she wrote a critical article about India’s ruling Bharatiya Janata Party (BJP) last year, she soon found herself facing a nightmare.

Rana Ayyub: Suddenly I get a message on my phone on WhatsApp saying, “I’m going to send you something, it’s going to be very disturbing. Please don’t react.” The moment he sent the video, I saw the first few frames and I just froze. I could not watch it beyond three or four frames because it was a porn video and there was my image morphed into it. It felt like, there I was, out there in the public domain naked, and I just froze.

Carol Hills: So, someone had taken images of your face and superimposed them on an actual porn video?

Yes, that was the case.

To the average person, if they looked at it, they would think it was you in the video?

Yes, if the average person saw it without going into the details, such as noticing that the hair is not curly and the body is not mine, the average person would just look at the face and say, “Oh, that’s Rana Ayyub.” That’s what bothered me. Within hours, I was receiving screenshots of the video on my WhatsApp, Twitter and Facebook. I felt like I was naked for the world. I was throwing up, I was in the hospital, I had palpitations for two days and my blood pressure shot up. I just couldn’t stop crying.

You yourself are a somewhat public figure; you’re an investigative journalist. So, it was also an attack on you as a professional, a way to undermine you.

Well, it was. When the government could not deal with me through facts, they decided to discredit me, because the video was being shared by members of the ruling party on their Facebook and Twitter. Clearly this was a message from the ruling party that, because I did not toe their line, they were going to discredit me. For the next two to three months, I did not go on Twitter or Facebook. I remained silent because I was scared that if I did something, someone would share that video again. As much as I hate to confess it, it did break me. There have been many tactics used by the ruling government over the last ten years to break me, and those have fortunately never affected me. But this video broke me. It made me a lot more careful. It just instilled a sense of fear in me. I’m trying to get out of the fear slowly. It’s still there. It’s scarring.

Have you been able to determine who made this deepfake porn video?

No, we have not. I’ve not been able to determine this. In fact, when I went to lodge a complaint at the police station in New Delhi, they refused to file a complaint because they said, “So what? It’s just a porn video.” And I said, “No, it’s my image, and it’s being circulated.” It’s been one year and the cops have not investigated this. I have no idea who made that deepfake, but I do know that the people who were circulating that video were people from the ruling party of India.


At the time, this was April 2018, did you have any awareness of what deepfakes were?

I had absolutely no idea. In fact, three days before this happened, the publisher of one of the biggest newspapers in India, The Times of India, wrote a tweet saying that deepfakes were being used in America and that he wouldn’t be surprised if they were used in India to discredit female voices. I read that tweet and I didn’t even know what that meant, so I did not read deeply into it. It was only when it happened to me that I started reading about deepfakes, so much so that I did not ever want to speak about deepfakes in my interviews, because it’s like a weapon. Many in India were not aware of it. I felt like if I spoke about it, I would put a weapon in the hands of people who might use it to discredit women they had a grudge against.

That’s amazing. You didn’t even want to talk about this whole phenomenon for fear people would mimic it and create more of them?

Yeah, I was scared to talk about it.

How widely has that video been shared at this point?

There are days when I still do a critical story and I see a screenshot of that pop up on my timeline. At the time it was shared, I remember one of the leaders of the ruling party shared it on Twitter, and it was shared some 200,000 times. It was almost on every phone in India. It was on my father’s phone and my brother’s phone. It was being circulated on WhatsApp and Instagram. Each time I wrote something on Instagram or posted a story, somebody would comment saying, “Hey, is that you? You were doing a good job in the video.” So, it continues to haunt me.

Did it affect your personal life? … Did you find yourself having to defend yourself to friends and family?

Not at all. I think that was the best thing to have happened to me. My family and my friends were there with me. They did not question me at all. The only person questioning me was my own self and the people who were trolling me on social media using that video. I kept on questioning myself time and again. Was I right in posting personal pictures on Instagram? Was I right in putting personal pictures on social media? But I was censoring my own self, and you do that. In times like this, you start questioning yourself as opposed to questioning those people who weaponize apps like these against women.


Now, just today there was congressional testimony about deepfakes and the social and psychological impacts they have on their victims, and your name was brought up specifically. What do you think needs to happen to combat this kind of thing?

You know, I’ve been asked this question very often: how do you combat something like this, something like technology or artificial intelligence? I really do not have any answer for this. What I do know for sure is that social media platforms like Instagram, Facebook and Twitter need to have some checks and balances so that these videos are not shared on their platforms. At the point in time when this video was shared, Facebook and Twitter looked the other way when I complained to them. They should have just removed the video and every sign of the video from all of their websites and all of their platforms. I had conversations with officials from Facebook and Twitter, especially Twitter India. The problem was that they were not willing to concede that there was a problem on their part, that their platform was being used to disseminate this video.

Editor’s note: We reached out to several social media companies for comment. Facebook responded via a spokesperson. “Misuse of intimate images — including those that are manufactured to depict a person — is not allowed on our platforms. We have removed the original video from Facebook and Instagram and have taken steps to keep it from appearing again. We will remove any screenshots we find that violate our policies.”

This interview has been edited and condensed for clarity. 
