Extensive Report Reveals TikTok Has Major Child Pornography Problem


TikTok has a major problem with child pornography.

An extensive report from Forbes chronicles a terrifying reality: child pornography, also known as child sexual abuse material (CSAM), is easy to come by on TikTok, the short-form video-sharing app with one billion monthly active users, making it the sixth most popular social media platform in the world.

To most users, Forbes writer Alexandra Levine reported, the posts tied to these accounts "typically read like advertisements and come from seemingly innocuous accounts."

“But often,” she continued, “they’re portals to illegal child sexual abuse material quite literally hidden in plain sight—posted in private accounts using a setting that makes it visible only to the person logged in.”

The holders of these CSAM-filled accounts reportedly share illicit content using "post-in-private" settings, meaning anyone accessing the photos and videos must have the account's login information or use specified phrases. This allows the content to slip past moderation algorithms that might otherwise flag it as a violation of the app's terms of use.

Seara Adair, a survivor of child sexual abuse and an advocate for children's safety, told Forbes she has reached out to TikTok employees to alert them to this trend, but to no avail. She believes users have discovered ways to bypass the platform's automated and human moderation by posting black-screen videos that last only a few seconds and contain brief instructions for predators.

“There’s quite literally accounts that are full of child abuse and exploitation material on their platform,” she told the outlet. “Not only does it happen on their platform, but quite often, it leads to other platforms—where it becomes even more dangerous.”

Adair said she has seen videos depicting “a child completely naked and doing indecent things.”

For her part, Levine corroborated Adair's account, reporting that some "post-in-private" accounts could be accessed with no hurdles at all, while others required would-be members to contribute their own images before being given the login information. Some account operators were reportedly recruiting girls as young as 13 years old.

The issue is hardly unique to TikTok, according to Haley McNamara, director of the International Centre on Sexual Exploitation. She told Forbes all social media platforms are plagued with CSAM.

For the rest of this article, visit our content partners at cbnnews.com.

Reprinted with permission from cbn.com. Copyright © 2022 The Christian Broadcasting Network Inc. All rights reserved.
