TikTok, a social media app popular especially with Generation Z, has a Chinese parent company, ByteDance Ltd. And as with all things Chinese since June of 2019, TikTok has been subjected to an outpouring of Sinophobic sentiment. The app has over 1 billion monthly active users globally, 150 million of whom are in the US. A further breakdown of its US user base shows it is composed mostly of 18-to-35-year-olds, many of whom are from lower socioeconomic groups. This becomes more important later when we dig into the automated algorithms that, once programmed, run without human supervision and form the backbone of TikTok and every other social media platform.
In 2020, then-US president Donald Trump threatened to ban TikTok if it did not divest from ByteDance. This move came after Senator Chuck Schumer and other members of the US legislative branch in October 2019 called upon the then-acting director of national intelligence, Joseph Maguire, to arrange a review of TikTok and its Chinese parent company. Schumer et al cited national security concerns over the app’s data collection algorithms.
The word “algorithm” tends to evoke a sense of inferiority in some people, connoting an abstract complexity beyond an educated layperson’s ability to understand how social media apps are programmed. Nothing could be further from the truth. At its core, a social media app’s algorithm is little more than a longer-than-usual piece of Form 3 algebra.
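To make the point concrete, here is a toy sketch of what a feed-ranking “algorithm” can boil down to. The signals, weights and video names below are entirely invented for illustration, not any platform’s actual formula; the arithmetic, however, really is this simple.

```python
def rank_score(watch_seconds: float, likes: int, shares: int, rewatches: int) -> float:
    """Score one video for one user from basic engagement signals.

    The weights are made up for this illustration; real platforms tune
    theirs continuously, but the underlying math is plain algebra.
    """
    return 0.5 * watch_seconds + 2.0 * likes + 4.0 * shares + 3.0 * rewatches

# Rank three hypothetical videos for a hypothetical user's feed.
videos = {
    "cooking_clip": rank_score(watch_seconds=45, likes=1, shares=0, rewatches=2),
    "dance_clip":   rank_score(watch_seconds=12, likes=0, shares=0, rewatches=0),
    "science_clip": rank_score(watch_seconds=60, likes=1, shares=1, rewatches=1),
}
feed = sorted(videos, key=videos.get, reverse=True)
print(feed)  # highest-scoring video first
```

A real system would evaluate an expression like this over billions of user-video pairs, but each individual evaluation is secondary-school arithmetic.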
Still, TikTok’s algorithm and data collection, described as “potentially dangerous” to US national security, were the subject of a House Energy and Commerce Committee congressional hearing on March 23, at which Singaporean TikTok CEO Shou Zi Chew, who also happens to be a Harvard MBA graduate, was questioned. The hearing played on the public’s presumed ignorance of computer programming and, at times, bordered on blatant prejudice and racism.
At one point in the five-hour bipartisan hearing, Chicago Congresswoman Jan Schakowsky admitted to the TikTok CEO that the concerns expressed about TikTok represented a double standard. Schakowsky, in response to Chew’s repeated and legitimate assertions that TikTok’s algorithms were aboveboard and based on industry practices often better than Silicon Valley’s, said that she preferred not to “go by that standard”.
I spoke to a software programmer, and what I learned lent credence to the thought that this hearing was more performative than a sincere attempt at addressing US national security concerns over TikTok’s data collection. The following is what I learned:
First, all social media apps that are free of charge and earn consistent dividends for their investors do so by monetizing the almost preternaturally accurate correlational data collected by algorithms hardwired into the app’s functioning.
Second, algorithms can be manipulated, but not to the extent the US is claiming TikTok is “nefariously” doing.
And third, to achieve what the US claims TikTok is doing, the company would need to hire round-the-clock teams of thousands of employees specifically focused on tweaking the algorithms hour by hour, the cost of which would be astronomical.
The reality, I learned, is that computers excel at math, and they balk when a human forces an algorithm’s underlying equation to output 7 where its order of operations says the answer should be 8.
I also learned that programmers in Silicon Valley and elsewhere do not like reinventing the wheel: Once an algorithm has proved profitable in its ability to accurately predict for advertisers things like when a user will need a tire change, other social media platforms will use the same algorithm, perhaps with small variations. These algorithms rely on machine learning to collect data and then run the relatively simple correlation-coefficient formula from additional mathematics at massive scale, finding correlations the human mind might never notice. I learned that social media apps like Facebook measure the amount of time and the intensity with which a particular user interacts with a particular piece of content, and surmise that the user wants to see more of it. It felt liberating to have this simplicity, masquerading as complexity, explained to me.
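The correlation-coefficient formula the programmer referred to can be sketched in a few lines. The dataset below is invented purely for illustration (fictitious watch times on car-maintenance videos versus later tire-ad clicks); the formula itself is the standard Pearson coefficient taught in school, which recommendation systems simply run over vastly more data points.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient: covariance of the two series
    divided by the product of their standard deviations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented example: seconds five users spent watching car-maintenance
# videos, and how many tire ads each later clicked.
watch_time = [10, 40, 25, 60, 5]
ad_clicks  = [0, 3, 1, 4, 0]
r = pearson(watch_time, ad_clicks)
print(round(r, 2))  # a value near 1.0 means strong positive correlation
```

A platform running this same arithmetic across billions of interactions can surface patterns no human analyst would spot, which is the whole commercial point.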
On social media sites, this manufactured mystique is exploited and signal-boosted by verified YouTube channels that claim TikTok’s algorithms function differently. They posit that TikTok actively shows the West culturally destructive content, like the “milk crate” challenge of a few years ago, or more damaging content that promotes life-altering measures such as gender reassignment and puberty blockers for 8-year-olds. Meanwhile, in China, this same algorithm shows people culturally constructive content like science experiments and acts of benign nationalism.
But the most important lesson I learned was that the reason TikTok in the West shows subscribers culturally destructive content, while in China it shows almost the complete opposite, is not a sinister algorithm but a reflection of the differing cultural values of the collective West and China.
The author is a writer, columnist and historian based in Hong Kong.
The views do not necessarily reflect those of China Daily.