TikTok has been making headlines lately, and not for good reasons. Law-enforcement officials and child-exploitation experts have raised concerns that the platform’s algorithm may facilitate the sexual exploitation of minors. These concerns stem from how the algorithm suggests content and users to viewers.
One of the most downloaded social media apps ever, TikTok has become a magnet for children and teens, who spend more time there each day than on any other social media platform. Since its North American launch several years ago, the app has gained immense popularity among American teenagers: 67% of teens report using it, and 16% use it frequently. Facebook, the dominant teen platform in 2014–15, has seen a steep decline, with only 32% of teens using it now.
The platform’s recommendation system is designed to learn what type of content each user likes, then feed them a steady stream of it. It’s highly addictive. This keeps kids glued to the app and makes it easier for pedophiles to seek them out. All social media platforms are designed to be engaging and rewarding, delivering a steady flow of notifications, messages, and alerts that trigger the release of dopamine, a neurotransmitter associated with pleasure and reward. TikTok, however, is particularly effective in this regard.
While users can restrict their TikTok posts so that only family and friends can see them, many teenagers make their posts public to accumulate “likes” and to take advantage of trendy features, such as the ability to create a split-screen video with a stranger’s clip. Unfortunately, this functionality has opened a new avenue for sexual exploitation.
These concerns are growing as rival companies adopt the short-video format in an effort to replicate TikTok’s success. Although TikTok says it has safeguards in place for younger users, child-exploitation experts doubt the platform can effectively monitor the enormous volume of videos uploaded for inappropriate behavior. The videos children post often reveal personal information, including visual details of their surroundings and location, giving predators the means to easily locate and groom potential victims.
It’s critical that parents understand the potential hazards of their children using TikTok and other social media platforms. They should take the initiative to educate themselves and their children on staying safe when using these apps. But in the final analysis, TikTok and other social media platforms should bear a greater burden for safeguarding young users from sexual predators, and they should be held to that responsibility now.
(This post summarizes two articles on this topic – linked – and was primarily generated by ChatGPT. It was grammar-checked using Grammarly, then edited, expanded, and validated by a real human. The image was generated by Midjourney using the prompt: “Social media platforms are designed to be engaging and rewarding, providing a steady flow of notifications, messages, and alerts that trigger the release of dopamine in the child’s brain, an abstract illustration.”)