
Three covers printed for George Orwell's dystopian novel 1984. Photo credit: DNAInfo (Public Domain)

A bunch of silly videos on TikTok might seem harmless, but don’t count on it. A leading social media platform in foreign hands can be dangerous.


On the surface, it’s hard to think of TikTok as a serious threat to national security. The site’s often silly offerings look about as offensive as a Barbie doll or your average teddy bear. But looks can be deceiving and the potential threat posed by TikTok is both immediate and real.

Until now, Congress has focused primarily on the TikTok app’s ability to collect intelligence in the form of intimate data concerning the American public. A greater, longer-reaching threat is the app’s potential to subtly manipulate the way we understand everyday reality.

In the dystopian novel 1984, George Orwell’s protagonist, Winston Smith, fights desperately to resist the subtle, mind-numbing control exerted over just about everyone by Big Brother — the enigmatic head of a bland, totalitarian system that rules over a generally submissive society. Big Brother may or may not be a real person. No one really knows. Smith is employed in a government office that continuously rewrites contemporary history, thus guaranteeing that Big Brother’s pronouncements are always seen as correct. Constant video surveillance protects the state against any deviation. Those who protest are simply “disappeared.”

The introduction of a government-sponsored language, Newspeak, is intended to short-circuit critical thinking. As Orwell explains, it was “designed to reduce the range of thought.” Anyone tempted to criticize the system would find that he lacked the words to express new ideas.  

In the end, only one thing acts in Winston Smith’s favor. “With all their cleverness,” he reasons, “they had never mastered the secret of finding out what another human being was thinking…” TikTok promises to change that.

Orwell wrote the novel in 1948, simply reversing the last two digits of the year to arrive at his title. Writing just three years after the end of World War II, Orwell could already see the early signs of government efforts at mind control in the propaganda and psyops campaigns that both sides had mounted during the war.

Despite Orwell’s warning, those efforts accelerated in the decades following the war. From the “Mad Men” of New York’s Madison Avenue advertising agencies in the 1960s to motivational and behavioral research, the Holy Grail has remained the same: figuring out what people think and why.


The ultimate goal is to better understand how to influence and control each of us so that we will do something that we might not decide to do on our own. The major difference between the time when Orwell was writing and now is the internet. Social media today not only gives influencers extraordinarily detailed information about anyone who is online, it also lets messaging be tailored to the sensitivities and weaknesses of each individual member of its target audience. That audience is essentially us.  

Advertising is no longer limited to a local or regional market. With the internet, it has gone global. Instead of broadcasting, which means sending a one-size-fits-all message to a random group of people who may or may not agree with it, the trend these days is towards narrowcasting, which involves using extraordinarily detailed information to craft messages designed to resonate with specific groups of people. The trick is to identify who those people are. Vastly increased computing power, combined with artificial intelligence and large language models like GPT-4, makes it possible to target individuals to a degree that never existed before.

The beauty of TikTok from China’s point of view is that most of the raw creativity that makes TikTok so addictive comes from its users, who are mostly American. How that raw creativity is actually used, however, is determined by the platform’s algorithms, which thanks to AI can shape the final product to produce subtle effects that may not be immediately obvious. 

An example of the destructive potential of uncontrolled social media was provided when Facebook’s news feed used engagement-based algorithmic systems to blanket the web with inflammatory pictures supercharging hatred against the Rohingya minority in Myanmar (Burma) in 2017. According to Amnesty International, Facebook’s algorithms created a firestorm reinforcing a government ethnic cleansing campaign that led to thousands of Rohingya being raped, killed and displaced. Although Facebook was warned about the violence that was taking place, it initially did nothing to stop its algorithms. 

In 2023, British journalist Cristina Criddle, who writes on technology for the Financial Times, opened a TikTok account devoted to videos of her black-and-white cat, Buffy. TikTok eventually admitted that it had tried to match the Internet Protocol (IP) address of her computer to those of its own staffers in order to discover which staffers had leaked company information to the press. Buffy, previously a regular TikTok performer, has since gone off the net. The New York Times reported that TikTok staffers were routinely sharing sensitive information concerning TikTok users in emails and on their internal messaging system. Much of that information could be easily accessed by TikTok staffers in China.

It is important to note that the bill debated in Congress does not call for shutting TikTok down; instead, it demands that TikTok’s Chinese owners sell the platform within six months. The Chinese government has already indicated that it may try to block any sale. It has the US caught in a honey trap enforced by TikTok’s popularity.  

To be fair, the Chinese and TikTok arrived on the scene fairly late in the game. The crucial breakthrough came in the early 1990s, when a network software engineer at Netscape, Lou Montulli, invented the internet “cookie,” essentially a small piece of data that a website stores in your browser each time you visit. Cookies leave a trail of digital breadcrumbs that can track just about everything you do on the internet.
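The mechanism is simple enough to sketch. Here is a minimal illustration in Python using only the standard library; the domain and the tracking ID are hypothetical, not taken from any real ad network:

```python
from http.cookies import SimpleCookie

# 1. A server's response carries a Set-Cookie header. The browser stores
#    the value and attaches it to every later request to the same domain.
#    This replay is the "trail of digital breadcrumbs."
server = SimpleCookie()
server["visitor_id"] = "a1b2c3"                # hypothetical tracking ID
server["visitor_id"]["domain"] = "ads.example.com"
header_sent = server.output(header="Set-Cookie:")

# 2. On the next visit, the browser echoes the stored value back,
#    letting the server recognize the same visitor.
browser = SimpleCookie()
browser.load("visitor_id=a1b2c3")
returned = browser["visitor_id"].value
```

The same `visitor_id` coming back on every request is all an ad network needs to link one person's visits together across time.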

In 1995, an advertising company, DoubleClick, developed a system that used cookies to automatically profile the public so that advertisers could target their pitches at the customers most likely to be interested in a specific product. If you looked on the internet for vacation spots in the Bahamas, you were suddenly overwhelmed with advertisements for cheap hotels in the Caribbean.

While working on a story about DoubleClick in 1995, I called Montulli to talk about what the cookie hath wrought. “If I had known what would happen,” he told me, “I don’t think I would have invented the cookie.” DoubleClick was later bought by Google and ultimately folded into its advertising sales apparatus. Advertising was never the same afterward.

Mark Zuckerberg pushed the concept further with Facebook. His goal was to attract as many users as possible in order to sell ads. By collecting data on individual users, Zuckerberg attempted to identify the specific likes and dislikes of each user and then funnel matching ads in their direction. By trial and error, it soon became apparent that a heated debate was an effective way to get users to stay online. Allowing users to make outrageous statements that resulted in online shouting matches actually encouraged more users to expose themselves to more ads. If that contributed to polarizing the public and possibly damaging society, well, that was someone else’s problem. Zuckerberg was in it for the money, not the welfare of society. 
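The engagement logic described above can be reduced to a toy example. This is an invented illustration, not Facebook's actual ranking system; the posts and scoring weights are made up to show how rewarding reactions and replies pushes inflammatory content to the top:

```python
# Toy engagement-based feed ranking: posts that provoke reactions,
# especially heated ones, float to the top regardless of accuracy.
posts = [
    {"text": "Cute cat photo",      "likes": 120, "comments": 4,  "angry": 1},
    {"text": "Outrageous hot take", "likes": 30,  "comments": 95, "angry": 80},
    {"text": "Local news update",   "likes": 45,  "comments": 10, "angry": 2},
]

def engagement_score(post):
    # Shouting matches (comments) and strong reactions (angry) keep
    # users on the site longer, so they are weighted heavily.
    return post["likes"] + 3 * post["comments"] + 5 * post["angry"]

feed = sorted(posts, key=engagement_score, reverse=True)
```

Under these weights the inflammatory post outranks the popular cat photo by a wide margin, even though far fewer people actually liked it. No one has to intend polarization; the objective function produces it.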

In addition to ad sales, it soon became apparent that the data Facebook produced provided an extraordinarily granular picture of where most users positioned themselves on the political spectrum. In 2015, Cambridge Analytica hired a computer scientist, Aleksandr Kogan, to develop a digital app called “This is Your Digital Life.” Kogan, who was born in Moldova, in the former Soviet Union, and educated at Hong Kong University, had become an American citizen and continued his studies at Stanford University. 

Kogan paid several hundred thousand Facebook users a nominal fee to use their data in what he claimed was a research project solely intended for academic use. What no one realized at the time was that thanks to Facebook’s structure, Kogan also obtained access to the “friends” of those unsuspecting users who had authorized use of their data.
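The structural flaw Kogan exploited is easy to sketch. In this toy Python example (the names and friend graph are invented), a handful of consenting users silently exposes everyone connected to them:

```python
# Toy illustration of the "friends leak": each consenting user's data
# grant silently extends to their friends, who were never asked.
friends = {
    "alice": ["bob", "carol"],
    "dave":  ["carol", "erin"],
}

consented = ["alice", "dave"]          # users who accepted the app

reachable = set(consented)
for user in consented:
    # The platform hands over the friends' data along with the user's.
    reachable.update(friends[user])
```

Two consents yield five profiles. Scaled up, a few hundred thousand paid participants gave Cambridge Analytica data on tens of millions of people.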

Steve Bannon, Donald Trump’s key strategist in the lead-up to the 2016 election, had helped set up Cambridge Analytica three years earlier, using funds provided by right-wing computer magnate Robert Mercer. According to Christopher Wylie, who ran most of Cambridge Analytica’s operations and described them in detail in his 2019 book, Mindf*ck, Bannon had realized early on that the 2016 election would very likely depend on a hundred thousand or more independent voters in key swing states. If white supremacists, neo-Nazis, home-grown militia organizations, and other previously ignored fringe groups could be mobilized, their votes might just be enough to swing the election in Trump’s favor.

Social media turned out to be the perfect vehicle to reach these previously disengaged voters and convince them to add their votes to the Trump campaign. The strategy worked. In 2016, Trump lost the popular vote by nearly 3 million votes but won the Electoral College by 77 electoral votes and became president.

Cambridge Analytica’s meddling in US politics was one thing. Giving that kind of leverage to China, which has increasingly expressed hostile intentions against the United States, especially in disputes over Taiwan and naval traffic in the South China Sea, is a different matter altogether.

Allowing Beijing access to vital financial and personal information on roughly 170 million Americans is unnerving enough, but letting ByteDance, TikTok’s owner, operate a platform that enables it to frame and manipulate communications among more than half the US population could in the long run prove to be even more dangerous.

ByteDance argues that it has never been subjected to undue influence from the Chinese government, even though a Chinese state entity holds a stake in one of its key subsidiaries. That may be true, but there is no guarantee that Beijing won’t intervene in the future. A shotgun is harmless until someone decides to pull the trigger.

When it comes to its own territory, China is hardly unaware of the dangers posed by unrestrained social media and free-wheeling enterprises like ByteDance and TikTok. Beijing not only erected its famous Great Firewall to seal itself off from non-Chinese internet providers; it also steadfastly refuses to allow any foreign-owned social media to operate in the Chinese market.

Its draconian repression of free expression in Hong Kong, its enslavement of the Uyghur minority, and its continuing attempts to intimidate Taiwan, not to mention its aggressive efforts to impose its control over the South China Sea, leave little doubt concerning where it really stands. China is anything but a cuddly teddy bear. 

The US may believe in the principle of free speech, although there are disputes over its limits when it comes to disinformation or inflammatory remarks. Beijing, as it has shown in Hong Kong and elsewhere, has no qualms about censorship.

While TikTok broadcasts a tantalizing mix of silliness in America that is largely targeted at a younger generation more than happy to make jokes in front of a camera, Douyin, the version of TikTok that operates in China, is anything but frivolous. The Chinese version stresses that Chinese youth need to study science and technology and it promotes core virtues favored by China’s Communist Party. The message is clear: If Americans really want to amuse themselves to death, China is more than willing to oblige.


While ByteDance sticks close to the party line in China, TikTok offers the company an amazing amount of leverage when it comes to manipulating what Americans think about the world. According to a Pew survey, roughly 22 percent of TikTok users polled in 2020 said that they were getting most of their news from TikTok. Only a year later, that figure had risen to 43 percent. Statista, a statistics portal that tracks social media, reports that 1 out of 3 Americans between the ages of 18 and 29 currently gets their news from TikTok. A surprising number of TikTok users find the app addictive as well as amusing.

In all of this, it is worth remembering that despite a few cosmetic gestures towards modernity, China remains very much a totalitarian state. Throughout most of the latter half of the 20th century, both China and Russia experimented with “brainwashing” opponents and political dissidents, basically anyone who disagreed with the Communist Party line. The Russians usually relied on crude torture in order to force dissidents to confess to “crimes against the state.” In contrast, the Chinese sought to reengineer the thought processes as well as the sense of identity of their victims. The Chinese wanted victims to genuinely believe that the state had been right all along.

What resulted was a blend of Orwell’s 1984 and a number of brutal techniques recalling the Spanish Inquisition. During the Korean War, the process involved breaking down a prisoner’s sense of identity and then offering an alternative that not only promised an end to the intolerably harsh treatment but also seemed logically reasonable. At the end of the Korean War, at least 21 American POWs decided that they would rather live in China than return to the US.

The Korean War was a long time ago, but the Chinese state’s interest in reshaping human consciousness has remained constant ever since. It might not be possible to torture an entire population, but a few terrorist attacks, a bout of inflation, and a few strategically placed fears can lead to a general disorientation that opens the way for a determined influencer to change the way people think. 

Large language models and artificial intelligence are redefining what is possible, especially as more and more of our lives are conducted over the internet. We live, to a greater extent than ever before, in a virtual reality conveyed to us by smartphones, tablet screens, and laptops, and very likely in the future by virtual reality goggles and headsets. It’s a good deal easier to tamper with that reality than the physical reality that surrounds us. Taking all that into account, it is not hard to understand why Congress voted nearly unanimously to insist that TikTok cut itself loose from any future manipulation by a potentially hostile Chinese Communist Party. 

The threat, however, is not limited to TikTok. What happens if a powerful US-based social media platform — Elon Musk’s X (formerly Twitter), Zuckerberg’s Facebook, or Google — decides to shape our reality for its own potentially destructive purposes? Business naturally disdains government regulation, but what happens when business gets access to tools that enable a few CEOs to redefine society? The damage at home could be just as great as anything from a malevolent TikTok teleguided by China.

The US clearly needs social media guidelines that go further than simply blocking TikTok. How that can be accomplished with a government that is already so polarized that it can just barely vote to fund itself is anyone’s guess. But an answer needs to be found fast, or Orwell’s vision of a deadened population governed by a Big Brother no longer under anyone’s control is likely to become a reality.  

