Extensive Report Reveals TikTok Has Major Child Pornography Problem


TikTok has a major problem with child pornography.

An extensive report from Forbes chronicles a terrifying reality: child pornography—legally known as child sexual abuse material (CSAM)—is easy to come by on TikTok, the short-form video-sharing app whose one billion monthly active users make it the sixth most popular social media platform in the world.

To most, Forbes writer Alexandra Levine reported, the posts tied to the criminal handles “typically read like advertisements and come from seemingly innocuous accounts.”

“But often,” she continued, “they’re portals to illegal child sexual abuse material quite literally hidden in plain sight—posted in private accounts using a setting that makes it visible only to the person logged in.”

Holders of these CSAM-filled accounts purportedly share illicit content using "post-in-private" settings: anyone accessing the photos and videos must have the account's login information or use specified phrases, bypassing algorithms that might otherwise flag violations of the app's terms of use.

Seara Adair, a survivor of child sexual abuse and an advocate for children's safety, told Forbes she has tried to alert TikTok employees to this trend, but to no avail. She believes users have discovered ways to bypass computer-operated and human-monitored moderation by posting black-screen videos that last only a few seconds and contain brief instructions for predators.

“There’s quite literally accounts that are full of child abuse and exploitation material on their platform,” she told the outlet. “Not only does it happen on their platform, but quite often, it leads to other platforms—where it becomes even more dangerous.”

Adair said she has seen videos depicting “a child completely naked and doing indecent things.”

For her part, Levine corroborated Adair's comments, reporting that some "post-in-private" accounts could be accessed without any hurdles, while others required potential predators to contribute their own images before being granted access. Some account users were reportedly recruiting girls as young as 13 years old.

The issue is hardly unique to TikTok, according to Haley McNamara, director of the International Centre on Sexual Exploitation. She told Forbes all social media platforms are plagued with CSAM.

For the rest of this article, visit our content partners at cbnnews.com.

Reprinted with permission from cbn.com. Copyright © 2022 The Christian Broadcasting Network Inc. All rights reserved.

