The new PBS documentary “TikTok, Boom.” examines the birth and explosive popularity of TikTok. It’s told in part through the eyes of influencers who found a voice and home on the platform. The film also explores the dark side of the social media app, including its ownership by Chinese tech company ByteDance, which is subject to the country’s surveillance laws.
TikTok became the most downloaded app of 2020, coinciding with the COVID-19 pandemic and lockdowns. The platform has been especially popular among young users — about a third of them are under 14 — which raises concerns because the app collects user data. That’s all according to Shalini Kantayya, the film’s director.
“When a child starts using TikTok at age 10, by the time they're 18, that algorithm can know you better than a parent knows you. That is very powerful artificial intelligence,” Kantayya tells KCRW. “There's a TikToker who makes a joke that ‘why did it take me so many years to figure out I was bisexual, but my TikTok algorithm knew in 22 seconds?’”
Other social media platforms deliver content according to who users follow. However, TikTok employs an algorithm based on what users are watching and for how long.
“You can give it no input, and you can just start watching videos, scrolling. I think there's sometimes this disconnect between who we say we are, and who we are,” Kantayya says. “It's paying attention to who you really are, by how many seconds you're watching each video. … It's sorting videos into groups and sorting audiences into groups.”
Kantayya adds that, like other social media platforms, TikTok’s algorithm gathers or infers data about its users, including IP addresses, gender, and other characteristics.
An economic lifeline — at what cost?
While TikTok provides opportunities for creators to make a living from their videos, it is also home to harassment. For example, Columbia University student Deja Foxx, who has more than 130,000 followers and more than 3 million likes on her videos, has been a target of hate.
She says in the film, “I was seeing a therapist at Columbia [University] for the first time ever. And I went into her office, and I was shaking and crying. And she couldn't understand. And she was telling me, ‘Why don't you just delete your social media?’ I was like, ‘What you don't understand is that I can't delete these accounts, because they are what keeps me financially stable.’”
Kantayya says Foxx is an example of just one young person who’s faced the harmful side of social media platforms.
“There was reporting that Instagram was causing teenagers to have eating disorders and to [have] high rates of anxiety and depression, and they hid the data. And so I am very concerned about the use of social media and these fast, very powerful recommendation algorithms and their long-term impact on mental health,” she explains. “Is there some way that we can get the benefits without the harms? And how can we create a relationship with social media that really prioritizes our mental health and our wellness, especially of our young people?”
What TikTok’s content moderation policy says about free speech
In July, FCC Commissioner Brendan Carr urged Apple and Google to remove TikTok from their platforms’ online app stores due to national security concerns.
“There is an example in the film where people in the military are using this and showing some assets. And there's ways in which data can be weaponized. ... Part of it was this idea that a social media platform that had so much power and influence to spread propaganda or express some cultural influence that way,” Kantayya says.
Kantayya adds that while TikTok’s U.S. offices have tried to distance themselves from the company’s Chinese headquarters, concerns still swirl over where the platform’s data goes.
“If you are a Chinese company, and they request data, you have to give them that data. … I also want to say that this was [exacerbated] because of the anti-Chinese xenophobia that was also taking place because of COVID.”
TikTok’s moderation practices include so-called “shadowbanning,” which blocks an account’s videos from receiving views.
The documentary highlights a video from Feroza Aziz, an Afghan-American creator who made videos calling attention to the imprisonment of Uyghurs in China. Many of her videos were taken down until she found a way around the censors: burying a political message inside a makeup tutorial.
“She starts and ends her TikTok with an eyelash curler lesson, and then uses that to talk about the Uyghurs. So the post stayed on [the platform] a number of hours before it got flagged and taken down. At one point, she gets thrown off the platform, she couldn't even open TikTok,” Kantayya explains. “TikTok says that they did not censor the content, that it was a moderation problem. But what is key are these larger issues around content moderation.”
Kantayya says TikTok has since announced it’s reexamined its content moderation policies, but this raises questions about what free speech means and what it looks like online.
“What ‘TikTok, Boom.’ highlights is that across social media companies … content moderation is happening in the shadows. The truth is that we don't know what's happening with these algorithms or how content is moderated. And I think that we need more transparency and accountability when it comes to how content is moderated on social media, especially when it's becoming the public square of our democracy.”