Information Warfare Has Evolved, Democracy’s Defenses Have Not


Mark Warner, Michael Chertoff, Clement Wolf, Ellen Weintraub
Left row top to bottom: Sen. Mark Warner (D-VA), former Secretary of Homeland Security Michael Chertoff, and Google's Global Public Policy Lead for Information Integrity Clement Wolf. FEC Chairperson Ellen L. Weintraub (right). Photo credit: Mark Warner / Flickr, Chatham House / Wikimedia (CC BY 2.0), FECTube: FECConnect OnDemand.

As lawmakers scramble to curb information warfare, the FEC held an unusual symposium with big tech and experts to discuss the threats to democracy online.

Protecting Our Vote 2020

Quick: How many of each type of animal did Moses bring on the ark?

Your response might well be “two.” But if you slowed down a moment, you would realize that it was a trick question. Of course, it was not Moses at all in the Bible’s Genesis flood narrative. It was Noah.

Posing that question to a crowd of nearly 200 academics, advocates, and journalists, Lisa Fazio — an assistant professor in psychology and human development at Vanderbilt University — showed how the slightest misinformation can influence many in a matter of seconds.

National security experts warn that the increase in misleading information online poses numerous threats to the public’s faith in US elections. On the 232nd anniversary of the signing of the US Constitution, lawmakers and election integrity experts gathered in Washington Tuesday to grapple with the threats disinformation poses to the 2020 election, redistricting efforts in 2021, and the daily lives of Americans in the digital age.

Members of Congress, former heads of intelligence agencies, and big tech leaders all gathered at a symposium dedicated to finding ways to stop the burgeoning threat to democratic norms. The event was hosted by FEC Chairwoman Ellen Weintraub, along with PEN America and the Global Digital Policy Incubator (GDPI) at Stanford’s Cyber Policy Center.

There can be little doubt about what prompted the symposium: The 2016 election put a spotlight on social media companies and their questionable practices with users’ data and privacy. Although Facebook and Twitter users have grown familiar with bots and misleading information, Russia’s Internet Research Agency (IRA) widely used Instagram to target and suppress the votes of key demographic groups supporting Democrats in 2016.

Voter suppression tactics have advanced to a new front — online and on social media. A growing concern is that Americans could also inadvertently give life to false information online. If no action is taken to ensure transparency on platforms like Instagram, which is owned by Facebook, hostile foreign actors like Russia, China, North Korea, and Iran could interfere in next year’s elections.

The symposium brought together leading figures in big tech and politics to bridge the gap between policymakers and the experts. Eileen Donahoe, executive director of Stanford’s GDPI, said that preventing the spread of disinformation must also include media literacy and public awareness.

social media, advertising, disinformation
Photo credit: Adapted by WhoWhatWhy from Mike Corbett / Flickr (CC BY 2.0).

Such an event, run by the head of America’s federal election watchdog, is unusual for the agency, but the persistent threat of disinformation continues to intrude on Americans’ daily lives.

“This is not our usual Tuesday,” Weintraub said, prompting laughs.

With the FEC in its third week without a quorum, Weintraub said during her opening remarks that the agency would continue to process complaints about federal election and campaign finance violations.

“Disinformation is a fundamental assault on our democracy and the ‘united’ character of the United States,” Weintraub said.

Although the FEC lacks a quorum to vote on enforcement measures, members of Congress continue to put pressure on Senate Majority Leader Mitch McConnell (R-KY) to take up the roughly 40 election security measures pending in the Senate.

Sen. Mark Warner (D-VA), vice chairman of the Senate Intelligence Committee, gave the keynote address and emphasized his dissatisfaction with Congress’s inability to act.

“This may come as a shock, but Congress doesn’t have its act together,” Warner said.

When asked about combating disinformation pushed by political figures, Warner paused as if to suggest that the question referred to President Donald Trump’s tendency to tweet first and verify later, prompting a few laughs.

Warner, who co-sponsored the Honest Ads Act of 2019 — which would require platforms like Facebook to list political ad buyers that spend more than $500 in total — also warned that the misuse of technology is a threat to democratic principles.

Mark Warner
Sen. Mark Warner (D-VA) speaking before the “Digital Disinformation and the Threat to Democracy” symposium on September 17, 2019. Photo credit: FECTube: FECConnect OnDemand.

“We are one significant event away from Congress potentially overreacting,” Warner said.

In the run-up to the 2018 midterm elections, Facebook created a library of political ads. And in late August of this year, Facebook announced updates to its social and political advertising disclosures that aim to address concerns critics have raised. But lawmakers like Warner have pushed for more action.

“We can start with greater transparency,” Warner said.

The symposium ended with a panel of journalists and Silicon Valley representatives, moderated by Suzanne Nossel, CEO of the writers group PEN America, discussing risks to the 2020 election process. Although invited to speak as panelists, representatives from Facebook and Twitter — both of which faced severe criticism over their roles in disseminating disinformation during the last election — chose instead to sit in the audience.

Toward the end of the event, an audience member asked why. Kevin Kane, public policy manager with Twitter, stood up and grabbed a mic.

“I want to hear and learn from the experts in the field, and get their thoughts and comments so we can go back and continue to work to improve,” Kane responded.

Unsatisfied, another audience member said that they wanted to hear directly from Twitter. Then, one more took the mic and asked what actions Twitter would take if Trump tweeted disinformation to his 64 million followers.

“Honestly, I’m not going to engage in hypotheticals like that right now,” Kane said.

Social media companies, beholden as they are to shareholders, arguably have little incentive to self-regulate content and political advertisements.


Another concern in Washington is the use of “deep fakes,” or videos that are manipulated to look and/or sound real — such as the recent video of House Speaker Nancy Pelosi (D-CA) speaking at an event that was slowed down to falsely make her appear intoxicated.

Rep. Stephanie Murphy (D-FL) has been a leading voice on combating deep fakes. During the symposium, Murphy argued that, as technology evolves, so do the methods and types of attacks by malicious foreign actors.

Murphy also pointed to Russia’s success in accessing voter registration rolls in at least two Florida counties. Although no voter information was altered, she said, Russia was still able to put itself in a position that it never should have been in.

Murphy’s concern is that if a hostile foreign actor were to penetrate voter registration databases again, the public would be the last to know. Once the FBI informs election officials, the decision to make that information available is at their discretion — and they often opt not to disclose.


Related front page panorama photo credit: Adapted by WhoWhatWhy from FEC / Wikimedia, Jason Samfield / Flickr (CC BY-NC-SA 2.0), Google / Wikimedia, Twitter / Wikimedia, Facebook / Wikimedia, and Instagram / Wikimedia.

