Troll Watch: Deepfakes And 2020

MICHEL MARTIN, HOST:

We know that our last presidential election in 2016 was the target of trolls - both domestic and foreign players who set out to sway and divide voters with lies and distortions spread through social media. This week, a new report outlines the threats we should be watching out for in 2020 and where they're coming from.

PAUL BARRETT: To begin with, I think the Russians will be back. But joining them this time around, I fear, may be the Iranians, who have already been here and started to spread disinformation on social media - and perhaps the Chinese as well.

MARTIN: That's the report's author, Paul Barrett of the Center for Business and Human Rights at NYU's Stern School of Business. He warns that the threat from domestic disinformation is also growing.

BARRETT: The volume of material coming from within the United States is undoubtedly greater than that that comes from abroad. In other words, we're doing this to ourselves.

MARTIN: But what exactly should we be watching for as the 2020 election approaches? That's what we want to talk about in our regular Troll Watch segment.

(SOUNDBITE OF MUSIC)

MARTIN: Back in 2016, it was Facebook and Twitter bots that were the main sources of election-related disinformation. Now, though, Paul Barrett says, there are other platforms to worry about.

BARRETT: Well, we identified Instagram as being in particular probably the one to watch the most closely. Disinformation is increasingly accomplished by means of images as opposed to text, and Instagram, of course, specializes in images. Facebook, which is Instagram's parent, has not done as much with Instagram in terms of protecting users as it has with the main Facebook platform, and that's another reason to worry about Instagram.

In addition, WhatsApp, which is a different kind of platform - a messaging platform as opposed to a public posting platform - was instrumental in disinformation being spread in the presidential elections in Brazil and in India. And for that reason, I think it bears watching here as well.

MARTIN: So I want to talk a little bit more about how this is actually done. And you mentioned deepfake technology as a major threat. And these are, you know, fake videos or audio that can be manipulated to make somebody do or say something they actually did not do or say. And I just want to play a clip - this is something that comedian, director and producer Jordan Peele made last year with BuzzFeed to warn about the dangers of deepfakes and disinformation. And this - his voice impersonation is superimposed onto the video image of former President Barack Obama.

(SOUNDBITE OF ARCHIVED RECORDING)

JORDAN PEELE: We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things. For instance, they could have me say things like Killmonger was right.

MARTIN: Killmonger, of course, being a reference to the antagonist in the hit film "Black Panther." Talk about why this is such a great threat now.

BARRETT: Yeah. It's a great threat now because disinformation is increasingly an image game as opposed to just a text game. That's the first step. The second step is technological. The artificial intelligence that's used to make deepfake videos has been advancing steadily and is now readily available in open-source form so that anyone with any - you know, basic coding talent and a laptop and the desire to mess around with elections can begin to cobble together these very convincing but fake videos. And the companies are aware of this and are scrambling, perhaps belatedly, to try to respond to it.

MARTIN: So I'm going to ask you just in the time we have left to just spread the responsibility around. Is there - are there things that social media companies should be doing right now to protect the public from misleading information? Are there things that regulators should be doing? And are there things that citizens should be doing?

BARRETT: Well, let's start with citizens first. People have to be very skeptical about what they look at, how they react to it and whether they want to share it. I think there are things for Congress to do, but they're generally fairly narrow things. There's legislation called the Honest Ads Act that's pending that could vastly increase the amount of disclosure that goes with political advertising online. I think that would be a very good law to be passed. There's also legislation pending that would more severely punish voter suppression disinformation, which is one of the main and most pernicious forms of disinformation.

As for the companies, I think one thing they could do is act more aggressively against provably false information, which, when it's identified at present, is generally just demoted, or down-ranked - which means that it's distributed to fewer people - and possibly labeled or annotated. I would argue that they ought to just take that kind of material off their sites altogether.

MARTIN: That's Paul Barrett with the NYU Stern School of Business. His report outlines the disinformation threats to the 2020 presidential election.

Paul Barrett, thanks so much for joining us.

BARRETT: My pleasure. Thank you.

Transcript provided by NPR, Copyright NPR.
