In a cyber world that’s hard for any of us to fully navigate, the youngest media users are the most vulnerable. Recently, The Wall Street Journal went on a cyber-undercover mission to see what was really happening on the TikTok video-sharing platform. Kids and teens love TikTok’s quick video reels; they can be funny, exciting, and, to young viewers, thoroughly entertaining.
TikTok users must be at least 13 years old and have parental permission to use the app. But there aren’t adequate verification measures in place to ensure this, which is how The Wall Street Journal investigators were able to set up 31 fake TikTok accounts for “teens” ages 13 to 15.
TikTok offers users “For You” feeds of videos the platform recommends. As users spend time on the app, they are sent more of whatever they like to watch: algorithms track viewing time and video content and quickly assess what should come next.
The investigators noted what was offered to these 31 “teens,” and it was deplorable.
Their TikTok accounts received hundreds of drug-use videos, some featuring meth and cocaine. Other fake accounts were sent videos about sexual role-playing, bondage, sadomasochism, and other sexual practices. Many of the accounts were given links to sign up for OnlyFans.com. Be careful if you check this one out; it’s a website promoting subscriptions to pornography.
Because app users are tracked, it’s easy to send them recommendations for similar TikTok videos. The idea is to keep consumers engaged with the app. But what if the app user is underage and the content is totally inappropriate? TikTok has been reluctant to respond to requests for comment about the Wall Street Journal investigation, though a spokeswoman did indicate that the company is looking into technology to filter out adult content for underage users.
This isn’t a new problem in the cyber world; YouTube faced the same issues. It happens because these apps use engagement-based algorithms: if a user seeks out porn, the app learns what the user likes and sends more. The difference with TikTok is that its algorithm is far more advanced, so young teens, driven by natural curiosity, are getting far more than they bargained for. Beware: pornographers know that tweens and teens are ripe for the taking. Get them hooked, and they become new consumers.
According to “Fight the New Drug,” pornography consumption is like getting high on drugs. It hits the pleasure center of the brain, and, like drugs, it takes more to achieve the same high with repeated use. With pornography, a better high requires more extreme porn: the same scenes won’t elicit the pleasure response the brain seeks, so to get the jolt of dopamine it wants, the brain needs raunchier porn. And then more. This is porn addiction, plain and simple, and it brings subsequent problems of meeting those needs in unsafe and unhealthy ways. Imagine this happening to kids.
Pornography has also fed a human-trafficking monster, with many underage kids as its victims. Here is how it often happens: a child or teen is coerced or intimidated into performing sexual acts against his or her will. Drugs are often involved. Teen runaways are frequently victimized, and it becomes a cycle of despair and hopelessness for these kids. The porn industry is fueled by consumers; every click on a porn site fuels new demand for sex traffickers. According to “Fight the New Drug,” 63% of underage sex-trafficking victims said they had been advertised or sold online.
So when parents hand a smartphone to their child, they may inadvertently be leaving that child susceptible to pornography. Talking with kids about the dangers is vital, and app providers need to put more filters in place, but even that will never be enough.
It’s time to criminalize the porn purveyors, including those who send it via their apps. We don’t have to wait and see whether laws can be changed; we can look to Minnesota, which has already taken a huge step toward halting human trafficking. Minnesota passed legislation recognizing that porn and human trafficking are connected. While the legislation doesn’t restrict First Amendment rights, make porn illegal, or impose censorship, it does give law enforcement the tools they need to hunt down child sex traffickers, and it imposes fines on those caught in possession of child porn. Even in times when everything seems polarized and politicized, this legislation passed unanimously in both houses of the Minnesota legislature.
This is really who we are, America. We don’t want our kids succumbing to porn addiction through phone apps. We don’t want our kids sex trafficked. We can be like Minnesota, and we must. It’s time for 49 more states to do the same.
Karen Farris saw the need to help underserved kids while serving in a youth ministry that gave her the opportunity to visit rural schools on the Olympic Peninsula. She now volunteers her time grant writing to bring resources to kids in need. She also shares stories of faith in action for those needing a dose of hope on her weekly blog, Friday Tidings: www.fridaytidings.com.