
How Homegrown Disinformation Could Disrupt This U.S. Election


Quick, think about disinformation. What comes to mind? “Vladimir Putin, president of Russia.” But in 2020, many experts are more concerned with disinformation coming from our very own backyard. Like this guy, who, with a single tweet, disrupted a governor’s race in Kentucky. “Oh I’m just a broke college student, basically.” “He had 19 followers. It’s slightly absurd. But it’s also slightly terrifying.” What makes misinformation truly dangerous is that it doesn’t need to hack into the actual infrastructure of an election. It only needs to hack the brains of voters. “A seed of doubt is sowed into the democratic process of elections. People just don’t trust the process anymore.” “The purpose is to confuse people, to cause chaos and to cause division. The hope with disinformation is that a country will kind of fall in on itself.” And the coronavirus pandemic has made things even worse.

To understand how we got here, we have to go to a key battleground in this election, one that has no state boundary: the internet. Remember the internet in 2016? The year that gave us these? “Damn, Daniel.” “What steps will your energy policy take to meet our energy needs?” Well, it also gave us a flood of election disinformation created by a Russian troll factory, a.k.a. a Kremlin-linked company called the Internet Research Agency. “It was essentially a gray office building in St. Petersburg in Russia.” This is Claire Wardle. She’s a disinformation expert and educator. “People were paid to sit all day, pretending to be Americans, creating social posts and memes and videos, and pushing that out. They could just throw spaghetti at the wall. Many of the posts didn’t succeed, but other things really did.” The Russians developed a simple but effective playbook. “They basically inflamed existing American divisions. A lot of these accounts actually got in the hundreds of thousands of followers.” By the end of the 2016 election, Russian trolls could reach millions of Americans through their social media accounts.

Crucially, what they managed to do was use online disinformation to organize dozens of real-life political rallies. Attendees had no idea they’d been set up by Russians. This was one of them, filmed by a Houston TV station. “I’m in downtown Houston right by the Islamic Da’wah Center. There’s protests going on, on both sides of the street.” Russian trolls did all of this, not with particularly sophisticated spycraft, but with tools available to everyone. Pretty soon, their disinformation, spread with the intent to deceive, became misinformation, as real people unwittingly started engaging with the material.

All the while, social media companies denied there was a problem. Speaking days after the 2016 election, Facebook C.E.O. Mark Zuckerberg struggled to articulate a defense. “I think the idea that fake news on Facebook — of which, you know, it’s a very small amount of the content — influenced the election in any way, I think is a pretty crazy idea.” In the years since, there has been a slow recognition. “We didn’t take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake. And I’m sorry.” “We found ourselves unprepared and ill-equipped for the immensity of the problems that we’ve acknowledged. And we take the full responsibility to fix it.” Some lessons were learned.
“The companies have been a lot tougher on election misinformation, especially when they can tie it to foreign interference.” But those policies aren’t applied in the same way when the source of the misinformation is within U.S. borders. In certain cases, like with an unsubstantiated New York Post report, some platforms have taken drastic measures to restrict access, and faced charges of censorship. But generally, the platforms try to avoid being seen as arbiters of truth. “When it comes to domestic and homegrown misinformation, social media companies still do err on the side of free speech.” So in the last four years, America’s election disinformation problem didn’t go away. It evolved. “Unfortunately, the landscape looks and feels very different now, because you’ve got all sorts of actors using the platforms in the ways that we learned the Russians did in 2016. And we see that playbook being used by political operatives in the U.S. And we see that same playbook being used by individuals in their basements who are angry and frustrated with life.” Sometimes it’s just one guy, sending one tweet from hundreds of miles away.

That actually happened in 2019 in Kentucky. To tell this story, let’s first meet three people. The New York Times reporter who covered the Kentucky election: “My name is Nick Corasaniti.” The election administrator: “My name is Jared Dearing.” And the internet troll: “I am @Overlordkraken1.” We’re not showing his face, and only using his first name, because he says he’s afraid for his safety.

On Nov. 5, 2019, Kentucky voters went to the polls to pick their next governor. “The race for governor in Kentucky in 2019 featured a very unpopular governor, Matt Bevin, who is a Republican.” “We’re just getting started.” “Facing off against Andy Beshear, the Democratic attorney general.” “We can’t take four more years.” “Every Democrat in the country was viewing the opportunity to deliver a blow to Mitch McConnell, and give him a Democratic governor as a real win. National money flooded this election.”

“The day started well. I drove in around 4 a.m. Election Day is more like game day for me.” “I woke up, got ready for school, went to school.” “When the polls close at 6, the day’s not even halfway through at that point.” “I got on Twitter, and I saw the Kentucky election, what’s going on. And then I saw that the race was very close.” “It was neck and neck. They were maybe 1,000 votes here, 100 votes there, separating them.” “When an election is close, there’s a lot of pressure and stress that’s put onto the system.” “As soon as Republicans in the state started to see the possibility that they might lose the Statehouse, social media kind of erupted a little bit. People were looking for reasons as to how this could possibly be happening. How could a Democrat be winning in deeply red Kentucky? Emotions were high. It was kind of the perfect environment for any kind of disinformation or misinformation about the results to take hold.”

“I decided that it would be a funny idea that if I made a fake tweet, spread it out to bigger accounts. I thought it was the perfect situation for it to go viral. I don’t remember how many followers I had, but I know it was less than 20.” “He had 19 followers.” “I set my geolocation to Louisville, Ky.” “He claimed he was from Louisville, but it was misspelled.” “It was just a typo. I’ve never been to Kentucky.” “And he sent out a simple tweet that said, ‘Just shredded a box of —” “‘Republican mail-in ballots. Bye bye Bevin.’”

“There’s so many checks and balances that we’ve built into the system over the past decades that we kind of know where all the ballots are at all times. So this is obviously a false claim.” “I’ve never seen a mail-in ballot.” “I probably never will know what their intentions were.” “All I really wanted to do was just get a few reactions out of some Boomers.” “Irresponsible. Frustrating. Damaging. Not helpful.” “I just thought it was funny.”

“So Kentucky election officials found this tweet about an hour after polls closed, and they immediately notified Twitter.” And like that, the tweet was gone. But the story didn’t end there. It had actually just begun. “A few conservative accounts began screenshotting the tweet. And when they screenshot that tweet and sent it around to their tens of thousands of followers, hundreds of thousands of followers, it was like a spark in a brushfire. And the tweet was everywhere.” “When we called Twitter to then take those screengrabs down, Twitter then said that it was commentary on the original tweet itself, and were unwilling to take the screengrabs down. So it’s a pretty big loophole, as far as I’m concerned.” “Election security officials kind of refer to these networks of accounts as a Trump core. And what they do is they wait until there is a debate, or a discussion, or a controversy, and they will immediately go to the conservative side and amplify it.”

Throughout the evening, a single atom of disinformation opened the door for more stories that muddied the waters in an already close election. “While this was happening, it was now reaching a pretty broad narrative. It wasn’t only restricted to the conservative internet. There were normal voters who were seeing this, there was news outlets who were seeing this.”

At the end of the night, Matt Bevin, who was trailing his opponent by just 5,000 votes, contested the results. “There have been more than a few irregularities.” “He didn’t offer any evidence. He didn’t say what those irregularities were. But it was because of those irregularities that he requested a re-canvass of all of the vote.” Bevin never specifically mentioned the tweet, but it was one of the most viral pieces of disinformation raising doubts about the election. “Bevin basically refused to concede, and left the election in question.” “My intention was never for it to get as big as it did. But I guess it was a lot easier than I thought.”

For the next few days, talk of election fraud hurting Bevin kept circulating. “There was a time in the middle there, where there was a lot of squoosh. Both sides had the opportunity to create their own narrative. And unfortunately, part of that narrative was being driven by misinformation.” Bevin’s supporters staged a press conference alleging fraud, but again offered no evidence. “Are you really under the belief that hackers couldn’t hack our votes that are uploaded to a cloud?” “There is no cloud involved in the election tabulations in Kentucky.” Eventually, after a re-canvass of the results concluded nine days later, Bevin conceded the race. “We’re going to have a change in the governorship, based on the vote of the people.” Andy Beshear is now the governor of Kentucky. But it’s hard to remove the various claims casting doubt on the election once they’re out there. Videos alleging fraud in Kentucky’s governor’s race are still gaining more views and comments.

Fast forward to 2020. “I don’t think the question of misinformation is whether it’s going to happen. It will happen.”
Election officials across the country are gearing up for a difficult fight against disinformation ahead of the election. Like in Michigan: “We anticipate challenges coming from multiple different angles. Whether they come from the White House, whether they come from foreign entities, whether they come from social media voices.” And Colorado: “We really need federal leadership. There’s bills just sitting in the House and in the Senate that are never going to get heard, never going to get their chance. And meanwhile, our democracy is under attack.” After countless investigations, hearings and public grillings of social media executives over the past four years, the U.S. is still ill-equipped to deal with the problem. “I feel like the analogy here is someone taking a bucket of water and throwing it in the ocean.”

Election officials are competing on social media against people with larger followings, like President Donald Trump himself. “President Trump has used his Twitter account and his Facebook account to spread falsehoods about voting.” In 2020, President Trump has tweeted election misinformation or claims about rigged elections about 120 times. Twitter has put warnings on some of President Trump’s tweets, and Facebook has added labels that direct people to accurate election information. “There really isn’t a uniform policy that they apply evenly across the different social media companies.” “It’s pretty depressing to sit where we sit right now, heading into this election. We have failed to do enough to secure the election in a way that we needed to.”

On top of that, the Covid-19 pandemic is making the misinformation problem even worse. For example, the pandemic has forced many states to expand vote-by-mail on a large scale for the first time. And that’s resulted in a surge in false or misleading claims about mail-in voting, according to the media insights company Zignal Labs. Of the 13.4 million mentions of vote-by-mail between January and September, nearly one-quarter were likely misinformation.

The pandemic has led to another important shift, as different conspiracy communities are emerging and working together. Here’s a look at how domestic misinformation gained more reach on Facebook during a single month this summer. These are groups that are prone to share misinformation about the election. These are anti-mask groups that tend to share content like this. Then there are the QAnon groups, followers of a pro-Trump conspiracy theory that promotes, among other things, the false idea that America is controlled by a cabal of globalist pedophiles. Facebook says all QAnon accounts will be banned from its platforms. But what we found is that these seemingly disparate conspiracy groups are increasingly connected by crossposting the same content, forming — “A huge tent conspiracy.” For example, this piece of disinformation, claiming that Barack Obama created antifa, was shared in all three types of communities. “A lot of people who will believe that the coronavirus is a hoax will also believe that the elections process is not to be trusted.” “The theme here is that more and more Americans feel like they cannot trust institutions.”

And that could have serious consequences around Election Day. “What that does is that will create a big uncertainty, and allow any bad actors to spread more disinformation in an already charged electorate. It will also give people the opportunity to say they’ve rigged an election, when it’s so much harder to actually rig an election.” Social media companies are preparing for the scenario in which President Trump, or other candidates, falsely declare victory. Or worse, one in which the losing candidate refuses to concede and claims election fraud. The 2019 Kentucky election avoided that, but the 2020 presidential election may not. “If we were to insert President Trump and months of undermining the electoral process into the Kentucky election, there probably would have been even more users who believed @Overlordkraken1’s tweet that he shredded ballots. It could have gone from thousands to millions.”

“Will you pledge tonight that you will not declare victory until the election has been independently certified?” “I hope it’s going to be a fair election. If it’s a fair election, I am 100 percent on board. But if I see tens of thousands of ballots being manipulated, I can’t go along with that.” “It’s something we’ve never seen before, and it sets a runway for the kind of disinformation that has disrupted other elections to really take off at a level we’ve never seen.”

“I’m Isabelle Niu, one of the producers of this episode. There’s a lot going on in this election, and we want to make sure we take a deep dive into the major issues. Check out the other episodes of Stressed Election. We cover voting rights, voting technology and vote-by-mail.”


