Millions of Facebook users likely had their privacy violated when the company’s lax controls allowed user data to be sold to Cambridge Analytica, which allegedly used the information to manipulate voters. As Facebook faces declining consumer trust and an eroding stock price, its executives promise changes. Will they be enough to keep our information safe?
Facebook CEO Mark Zuckerberg will be grilled this week by members of Congress about his company’s failure to ensure the privacy of Facebook users. While experts debate the details, people who log onto Facebook want to know what can be done to keep their personal information secure.
The purloined data that Zuckerberg will be quizzed about ended up in the hands of Cambridge Analytica, a political consulting firm bankrolled by supporters of President Donald Trump and masterminded by former White House strategist Steve Bannon.
A former Cambridge Analytica employee-turned-whistleblower has warned that the data may have been used to target and manipulate voters in the 2016 US election.
Facebook recently upped its original estimate of users whose privacy had been violated in this one case to 87 million. This includes not only users who clicked on the personality-test app developed by Cambridge University academic researcher Aleksandr Kogan, but also friends of users — even those who never touched the app. (Kogan has claimed he did nothing wrong, and Cambridge Analytica denies using the Facebook data.)
Even worse, Facebook has acknowledged that its granting of liberal access to other app developers may have further compromised Facebook users’ information.
Facebook now is under investigation by the Federal Trade Commission (FTC), which is assessing whether the social media company’s practices deceived consumers and caused them harm.
Many expect the Congressional hearings to shed more light on the controversy and on how to ensure that such a breach of privacy never happens again. But given that many members of Congress lack even a rudimentary understanding of the online world, we present this backgrounder, based on the comments of online privacy experts and reformers who discussed the issue last week at the DC think tank New America.
How Could This Have Happened?
Kevin Bankston, director of New America’s Open Technology Institute, traced the trouble back to 2010, when Facebook, aspiring to “grow even faster,” essentially opened its doors to app developers. The developers could “access and use data from Facebook users who were using the apps,” Bankston explained. “But there was a big privacy catch. Not only could app developers obtain data from their users but also from all the friends of those users.” (Facebook ultimately took some steps to strengthen consumer privacy, but not for several years.)
Most users were in the dark about this change, unless they read the fine print of Facebook privacy policies or were able to access “a not particularly easy-to-find privacy setting for adjusting what data your friends could share about you,” Bankston said.
In 2014, Kogan’s survey app — This Is Your Digital Life — attracted 270,000 Facebook users, Bankston said. And apparently, Bankston speculated, Kogan had no trouble ignoring Facebook’s terms of service and providing that data — which multiplied many times over once it included users’ Facebook friends — to a “spooky political consulting company that wanted to build psychographic profiles of voters in order to better manipulate them.”
When Did Facebook Know About the Breach?
By late 2015, Facebook knew that Cambridge Analytica had users’ data. It deleted Kogan’s app. But Facebook “did little to confirm that the misappropriated data had been deleted, other than to [ask Cambridge] to certify that it had done so,” Bankston charged, adding that Facebook continued to “allow Cambridge Analytica to advertise on its platform” until March 2018, just before the controversy made the headlines.
The FTC, worried about the very abuses that Facebook now has apologized for, raised concerns about Facebook’s practices nearly a decade ago, said David Vladeck, a former FTC official and Georgetown University law professor.
“I’ve seen this movie before,” he stated. “The FTC’s consent decree [with Facebook] was designed to avoid exactly the Cambridge Analytica problem.”
The FTC charged that Facebook had deceived consumers when it promised not to share their information without their consent. “The FTC said that part of the deception was allowing third parties to get access to how people exercise their political views without their consent,” Vladeck added.
But Vladeck argued that the company “has violated many provisions” of the FTC’s order.
Vladeck said that the consent decree Facebook agreed to in 2011 also attempted to “rein in” the collection of information by third parties. “If you look at the consent decree it draws a bright line between users, who are people who actually post things, and third parties who actually harvest” information about users, he said. “And the goal was to limit third party access unless there’s clear notice and clear consent.”
Facebook executives have claimed that they complied with the FTC’s order because they gave consumers the option to change their privacy settings to prevent the sharing of their personal information.
But Vladeck said he doubted that many Facebook users understood that giving permission to share their information with “friends of friends” would place it in the hands of Cambridge Analytica.
The FTC also required Facebook “to look at vulnerabilities where consumer privacy is in jeopardy, and plug those holes,” Vladeck said. “Facebook didn’t pay any attention to that,” he charged.
Vladeck said that if the FTC imposes a fine on Facebook, it could be “astronomical” since the agency considers one violation to be harming one consumer. In theory that could total $40,000 (the penalty for one violation) times 87 million consumers, he said. But Vladeck acknowledged that this would not be the number the FTC would actually use in negotiations with the company.
Are Facebook’s Changes Sufficient?
Facebook has announced a series of reforms to rebuild consumer confidence. The company promises to better monitor third-party app developers and to give users more information about third-party apps and websites. It also has pledged to make it easier for consumers to access, understand, and use the site’s privacy settings. And it has promised much more transparency for all the political and issue ads on its platform.
Experts welcomed the changes but called for more reforms to ensure that Facebook and other companies keep the promises they make and better serve the public.
- Exert more oversight over third-party app developers. Vladeck suggests that Facebook should audit developers so that the company could better monitor them, with clear consequences if they break the rules.
- Make it far easier for researchers to evaluate the company, to assess, for example, how certain ads are targeted to certain demographics — said Harlan Yu, executive director of Upturn, a DC-based nonprofit focused on technology and civil rights. More transparency and accessibility would help identify advertisers that discriminate against people of color or prey on vulnerable consumers, Yu said.
- Strengthen the FTC.
The 104-year-old agency “is doing an incredible job with antiquated authority,” said FTC Commissioner Terrell McSweeny. “It is using its authority to protect consumers from unfair deceptive acts and practices,” she said. But that limited authority makes it difficult for the agency to protect consumer privacy as well.
“It is not strong enough as currently configured with its current authorities and resources” to offer consumers the protection they need “at a moment when we are connecting every part of our lives to the internet and to each other.”
Additional funding would also allow the agency to increase its technical staff and hire outside experts “to evaluate what it’s being told” by companies, she added.
- Hold CEOs’ feet to the fire. One reform suggestion: Congress should require CEOs to certify quarterly that their companies are keeping their promises to their customers.
- Rethink the concept of consent.
Experts stressed that consent does not mean much if a consumer does not understand what the consent implies.
“You don’t see the hundreds of eyes that are looking at you when you post something on Facebook, and all of this is by design,” charged Michelle De Mooy, director of privacy and data at the Center for Democracy and Technology. While conceding it would be difficult to achieve, she said Congress ought to pass a law requiring that internet platforms be designed to ensure that consumers actually know how their information is being shared. This would allow a company to be held accountable when consumers’ privacy is violated.
“Yelling at Mark Zuckerberg is a start,” she added. “But it isn’t necessarily going to make change.”