Fortalice CEO and former White House CIO Theresa Payton explains why disinformation is such a potent threat.
Dan Patterson, senior producer for CNET and CBS News, spoke with Theresa Payton, cybersecurity expert, CEO of Fortalice Solutions, and author of “Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth,” about political disinformation campaigns and why they’re important. The following is an edited transcript of their conversation.
Dan Patterson: Why did you choose to focus your attention on influence operations? What is so important about disinformation right now?
SEE: Zero trust security: A cheat sheet (free PDF) (TechRepublic)
Theresa Payton: I consider influence operations to be the carbon monoxide poisoning of social discourse and democracies. It's silent, and it truly is deadly and insidious. And it's virtually unstoppable. We cannot un-tech our way out of it, so we're not going to be able to solve it with technology solutions alone. We can't legislate our way out of it, either. It's really going to take a three-pronged approach: governments, big tech, and us. We have to come together to fight back.
There's a belief in the security community, though it hasn't been completely proven out yet, that at some point there may actually be more bots interacting on social media than human beings, based on the number of accounts out there and the behavior and activity behind those accounts. That research is currently underway. It happens on every platform, and what's interesting is that I've tracked the evolution of these campaigns: they started with email and fake websites, then moved to fake personas on social media, and, as if that wasn't enough, they're now moving into smaller, more intimate, private messaging platforms and closed groups. It just takes one invitation into that smaller group, and suddenly they look like a trusted insider. They share the misinformation, and it goes viral. So it is pervasive, and it is truly taking place on every platform: online gaming platforms, messaging and encrypted platforms, and social media platforms.
On the positive side, a bot can often be used to perform excellent customer service, help you answer a question, or help you find a location. But everything that's built for good can be used for bad. What the bots are used for in these misinformation campaigns is to engage in authentic-seeming, human-like behavior that draws people into an argument, on both sides of it. And then once the bots get everybody into a frenzy, they move on to the next topic.
SEE: Cybersecurity: Let’s get tactical (free PDF) (TechRepublic)
The bots, the sock puppets, the fake personas, and even the fake organizations are alive and well and operating pretty much in the open on social media, not just in these private groups or encrypted chat platforms. Besides being very focused on election issues, where you can vote, how you vote, what's going on in the primaries, they have picked up on all of today's headlines. Whether it's COVID-19 or the movement right now with Black Lives Matter and police reform, whatever it is, that headline and those hashtags are of vital importance to these misinformation campaigns. And the fake personas, the bots, the sock puppets use those to seem as if they're somebody in the know, your neighbor perhaps, somebody in your community, to, again, draw you in and make you think that there's actually a human behind it and that it's an organic movement.
Dan Patterson: What do they do? Give me an example of behavior I might see on say Twitter or Instagram, for example.
Theresa Payton: Take conspiracy theories. At one point, and it was absolutely disgusting to see, a conspiracy theory that George Floyd wasn't actually deceased was trending, with some real people and some fake personas interacting with it. The fake personas and bots conducting these manipulation campaigns leverage real people posting the conspiracy theory and whip it up into a frenzy. That inauthentic behavior, making it seem like far more individual human beings believe in or are reposting something than actually do, is the cornerstone of many of these manipulation campaigns. It's despicable, but unfortunately not surprising.
SEE: Exposing the dark web coronavirus scammers (TechRepublic)
There are multiple players who conduct these campaigns, but as it relates to nation-states, they really want to destabilize democracy. For one, they want to show their own citizens that they have it really good at home and that you certainly don't want what America has: "Look at that dumpster fire." And yes, America is not perfect and America is hurting right now, but America is a democracy. We can vote in the changes we want. We can march in the streets and demand that our elected officials represent us and represent change on behalf of all Americans. You don't have that in these other countries.
The second thing is, I went into the book research thinking, "Well, maybe it's about picking winners and losers. Maybe they really do prefer one candidate over another." And perhaps they do, but what I learned in my research was that they actually make a lot of money. They don't really care about the issue you and I are arguing over. They want to destroy democracy, to destroy our ability to speak with each other on issues and hear each other's point of view. And lastly, the more you and I argue, the more money they make through clickbait ads and other marketing schemes. And by the way, social media companies make a lot of money, too, when we argue.