March 24, 2022
Ever feel like technology can read your mind? Maybe you were just talking about something with a friend, and then presto! There it is on an ad in the middle of your Instagram feed.
During the Super Bowl this year, Amazon ran an ad (surprisingly) poking fun at this idea, featuring celebrity couple Scarlett Johansson and her husband Colin Jost, a comedian from Saturday Night Live. In the ad, Amazon’s AI assistant, Alexa, reads each spouse’s mind and confirms item orders in humorously awkward moments. What was strange about the ad is that most people do, in fact, feel like Amazon can read their mind - and they don’t like it.
But now we’re learning the inside story of an entirely new level of algorithm-driven mind reading: TikTok. While the platform itself isn’t new, its explosion of popularity, and unprecedented ability to keep users engaged is prompting new study and concerns.
Now a new investigation from the Wall Street Journal is pulling back the curtain on how the social media app works, with implications for all of us - especially kids.
If you have somehow managed to avoid hearing about TikTok in recent years - first, congratulations! For a bit of history, the short-form video app first debuted in 2016, created by a Chinese company called ByteDance. The platform is built on short user-uploaded videos, and in its early days it grew popular among kids through lip-syncing and dance trends.
It wasn’t long before TikTok was popular far beyond youth. Experts were quickly noting the level of sophistication the app’s algorithm possessed and its uncanny ability to predict what users want. The app also raised concerns among privacy experts for its origins in China and potential data gathering on users worldwide by the Chinese government.
For a picture of just how quickly the app has grown in users as well as usage, consider the following stats:
To understand the TikTok story, we have to understand just a bit about algorithms.
What is an algorithm, exactly? Simply put, an algorithm is a set of rules that turns data inputs into optimized outputs. In the case of social media, inputs are generally things like “shares”, “likes”, “interests”, ”friends”, “location” and so forth. It’s not hard to see how platforms like Facebook or Snapchat quickly create incredibly detailed profiles of us given the data we actively and passively provide.
Just think of all the ways you engage with a platform. Maybe you “like” a picture, or “share” a funny post with a friend. Maybe you search for a person or term with some frequency or follow a particular hashtag. All of these are algorithmic inputs that provide social media platforms - Facebook, Instagram, Snapchat, TikTok and others - with information to be able to better serve you content that you’ll find interesting and engaging.
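To make that concrete, here is a toy sketch in Python of how a platform might fold those engagement signals into an interest profile. The signal names and weights are invented for illustration - real recommendation systems are vastly more complex - but the basic idea of weighting active signals (shares, searches) more heavily than passive ones (views) is the same:

```python
from collections import defaultdict

# Hypothetical weights: an active "share" or "search" says more about
# interest than a passive "view". These numbers are made up.
SIGNAL_WEIGHTS = {"share": 3.0, "like": 2.0, "search": 2.5, "view": 0.5}

def build_interest_profile(events):
    """Fold a stream of (signal, topic) engagement events into a topic -> score profile."""
    profile = defaultdict(float)
    for signal, topic in events:
        profile[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return dict(profile)

events = [("like", "dance"), ("share", "comedy"), ("view", "dance"), ("search", "dance")]
profile = build_interest_profile(events)
# "dance" (2.0 + 0.5 + 2.5 = 5.0) now outscores "comedy" (3.0),
# so the feed would lean toward dance content.
```

Every tap and scroll nudges a profile like this one, which is why the content you see can feel eerily well targeted.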
The 2020 Netflix documentary The Social Dilemma highlighted clearly how the business model of social media companies makes this necessary.
“We’ve created a world in which online connection has become primary. Especially for younger generations,” says computer scientist Jaron Lanier in the film. “In that world, anytime two people connect, the only way it’s financed is through a sneaky third person who’s paying to manipulate those two people.”
“We’re the product,” warns Justin Rosenstein, a former engineer at Google and Facebook, in the film. “Our attention is the product being sold to advertisers.”
In recent months, the Wall Street Journal set up an investigation to break down how TikTok’s secret algorithm really works. Using “bots” - digitally operated profiles - the team prepared fake users to engage with the platform in certain predetermined ways, allowing them to study how TikTok’s algorithm responded. Each of the bots was given a few undisclosed interests and programmed to engage slightly more with content reflecting those interests.
It didn’t take long for the rabbit holes to open. Any bot profile with a proclivity toward something like “sadness” would quickly end up down a tunnel of suicidal content. And if a bot lingered on more sexually suggestive or violent content, the tunnel would open wide in those directions, feeding more and more extreme versions to the user.
Unlike most social media algorithms, which are driven by proactive actions like shares and likes, the Wall Street Journal investigation found that on TikTok the most important input is attention. Put another way, it’s simply the time (measured down to tiny fractions of a second) that a viewer lingers over a piece of content.
Why does this matter? Well, basing an algorithm on tiny increments of attention changes everything. While hitting “like” on a post is more of a proactive, conscious action, spending an extra .02 seconds on an explicit video is less avoidable and very subconscious. It’s not hard to see how something like this rapidly takes users and flings them toward the most far-reaching, extreme edges.
“The algorithm is pushing people toward more and more extreme content so it can push them toward more and more watch time,” says Guillaume Chaslot, a former Google engineer and algorithm expert, in an interview with the Wall Street Journal.
It’s no surprise that more and more parents are concerned. Pressure has been mounting on social media giants like TikTok and Meta, owner of Facebook and Instagram, to become more transparent and change their algorithms to be less addictive.
During the State of the Union address earlier this month, President Biden put an unprecedented spotlight on Big Tech, vowing to hold social media platforms accountable for “the national experiment they’re conducting on our children for profit.”
While new legislation and more pressure may help eventually, parents need a proactive plan in the meantime. Let’s be clear: there is no silver bullet or one-size-fits-all answer when it comes to something like TikTok. But parents need to understand the uphill battle they face and being proactive means everything. Here are some ideas to get you started:
For parents of young kids, the best advice of all is to start smart. Many parents end up inadvertently opening the door to apps like TikTok simply because their kids need some means of connection and communication. But just because they need a phone doesn’t mean a kid is ready for an app like TikTok.
Think of it like a pool: we wouldn’t send a kid into the deep end without making sure they know how to swim first. And yet, often we simply hand young, growing minds powerful devices designed for engagement, extreme content, and manipulation.
The good news is that better, safer, simpler options exist! Kids’ smartwatches and kids’ phones offer parents and caregivers options that open the door to safety and connection without all the other concerns. Devices like these can help delay kids engaging with apps like TikTok until they’re older and better equipped.
Sometimes simply saying “not yet” is the right approach. While parental controls can be faulty and easily hacked (even on iPhones), settings and third-party services do offer some level of protection and the ability to prevent kids from downloading particular apps like TikTok.
Along the way, be sure to make saying “no” part of a larger conversation around tech, health, and safety. Kids need to understand why they’re not yet allowed. We love the idea of making tech “maturity-dependent” rather than age-dependent. Help your kids understand the kind of responsibility you expect them to demonstrate to show they’re ready for the next tech step.
For kids who might be a bit older, be sure to stay engaged and encourage limits. Knowing how TikTok’s algorithm pulls kids in should make adults more conscious of setting clear boundaries and helping kids own those limits for themselves. Try talking with your kids about how much time they think they should have on TikTok each day. Their suggestion may be a bit much, but engaging them in the dialogue will help them feel a critical sense of ownership.
“Self-started behavioral change refers to the internal ability of a child to set boundaries for themselves,” explains Katherine Winter-Sellery, parenting expert and 3-time TEDx speaker. “It’s their ability to say ‘this is enough for today, I’m going to do something else’ without any external pressure or input.”
Kids need coaching and training when it comes to content online. TikTok is full of inappropriate content that kids will likely come across if they spend long enough on the app. One way to engage kids in their learning journey is to spend time on the app with them and make the content they see part of an ongoing conversation.
Having these kinds of conversations goes a long way to building bridges of trust between parents and kids as they engage with the digital world.
“True consideration is currency.” says Winter-Sellery. “When we consider our children’s opinions and views, they are far more likely to consider what we are saying in return.”
Have thoughts on TikTok? Drop a comment below or send us a message on social media and tell us what you think.
You can watch the full Wall Street Journal investigation summary HERE