In 2022, Mashable published an article called “TikTok’s Algorithms Knew I Was Bi Before I Did.” We won’t link to it here, but the article stands as an example of why media literacy with algorithms is so important.
The writer asserts that upon opening TikTok, the algorithm “begins reading your soul like some sort of divine digital oracle, prying open layers of your being never before known to your own conscious mind.” She concluded that the algorithm knows us better than we know ourselves—and that we should take whatever appears in our feed to be some sort of sign. In her case, posts about being bisexual helped persuade her that she was, in fact, bisexual.
Social psychologist Jonathan Haidt argues that all social media platforms are made up of four defining features: user profiles, networking, user-generated content, and interactivity. And as Haidt points out, algorithms are driven primarily by liking, favoriting, viewing, and commenting (interactivity) on photos, videos, and even DMs (user-generated content).
In other words, the algorithm is a mirror—not an oracle. It reflects back to us what it already sees us engaging with. And even though many social media platforms use “variable-ratio schedules” to give users what they really want only some of the time (which keeps us coming back), this dynamic can still create personal echo chambers, where a user mostly sees things that confirm their pre-existing view of the world.
In a letter to his protégé Timothy, the Apostle Paul warns, “For the time will come when people will not put up with sound doctrine. Instead, to suit their own desires, they will gather around them a great number of teachers to say what their itching ears want to hear” (2 Timothy 4:3). And in a way, this is exactly what algorithms are designed to do.
A core part of the Christian faith is being willing to admit that the “echo chamber” of our own hearts can be deceitful, and that we need someone to step in and redeem us. But social media and algorithms can feed the destructive perspective that our way is always best (or “normal”).
In a recent interview, Andy Crouch relayed a conversation with his daughter about her decision not to use any app that filters her experience through an algorithm. Her rationale was that, “The algorithm does not catch you at your best. The algorithm catches you at your worst. It catches you at your most impulsive, your most reactive.” Indeed, Christians who believe that we have a sin nature can understand that our automatic responses to what we come across won’t always represent the best versions of ourselves.
Some platforms have “Not Interested” or “Don’t Show Me This” options that can help users reshape what their algorithms serve up. But a broader way to curb the effect of these would-be echo chambers is simply to limit the amount of time we spend engaging with them. Parents might ask themselves, “Do I know how much time my family is spending on social media?” and, “Do I know how much time I’m spending on social media?” At least on Apple devices, Screen Time can answer these questions and also makes it possible to set intentional time limits.