1. Personal Jesus

What it is: The Twitch stream ask_jesus offers an AI-generated version of Jesus, providing 24/7 Q&A for gamers, spiritual wayfarers, and, unfortunately, online trolls.
Why it’s so unnerving: The livestream depicts Jesus as a bearded White man, its eyes looking off into the distance while its mouth moves in a pantomime of the words coming from its disembodied voice. The nonreligious tech nonprofit that funds the stream claims that the bot has been “trained” on the teachings of Jesus, and its stated purpose is to offer “spiritual guidance” and invite users on the journey of “faith, hope, and love.” No matter what users ask the bot, a connection—however tenuous—to the teachings of love and compassion is present in the answer. Some users approach the stream sincerely, asking for prayer or for insight on interpreting the Bible. But for the most part, those asking questions seem to be using the stream for entertainment in a way that many Christians may find disrespectful.
Start the conversation: If you had the chance to ask the real Jesus a question, what would you ask?

2. Elemental

What it is: Pixar’s new movie “Elemental” hit theaters last week. The film uses Fire, Water, Air, and Earth elementals as a metaphor to explore themes of immigration, racism, and interracial relationships.
What it’s about: When Ember was young, her parents left everything behind to pursue a new life in Element City, and the burden of trying to repay her parents’ sacrifices overwhelms her. Her parents (who speak “Firish”) are concerned with what is essentially racial purity. Ember’s father, Bernie, hopes that one day Ember will take over the family business, selling fire snacks to other fire elementals. Their family even prays to a “blue flame,” which they keep as a kind of household god. But things get complicated when Ember starts to develop feelings for an emotional water-man named Wade. Ultimately, the movie is about mustering the courage to do something with your life that’s different from what your family expects. In addition to the themes mentioned above, parents may wish to discuss Ember’s mother’s fortune-telling rituals for “detecting love,” the brief inclusion of a same-sex water couple, and the idea that anger is just “you, trying to tell you something you’re not ready to hear.”
Start the conversation: Do you think movies like this help to create more empathy across cultural barriers? Why or why not?

3. #!@*&%-Gender

What it is: Elon Musk announced that the terms “cis” and “cisgender” will be considered slurs and could result in bans when they are used on Twitter to harass users.
Why it’s happening: The word “cisgender” has undergone a shift among certain online communities, to the point where it now often carries a negative connotation. In the specific instance that Musk reacted to, a Twitter user said that he rejects gender ideology and that the term “cis” makes him feel unsafe. Twitter users critical of this stance responded by calling the owner of the account a “cissy,” a derogatory term that some trans people use for those whose gender identity and biological sex align. Musk has called himself a “free speech absolutist,” which has led to criticism over any decision he makes to censor or suspend Twitter accounts. To be clear, though, “cisgender” and associated words will only be considered slurs when used as terms of provocation and abuse; it doesn’t appear the word “cisgender” will be banned from Twitter altogether.
Start the conversation: Do you think it was a good idea for Musk to classify “cisgender” as a slur? Why or why not?

Slang of the Week

Delulu: Shorthand for the word “delusional,” this term is used to refer jokingly to the decision to believe something—a relationship, a dream job, a perfect life—is real and possible despite there being no evidence to support that belief. The word is most often used when people reinterpret insignificant interactions as romantic in nature, and has also given rise to the word “delusionship,” referring to a whole imagined relationship. (Ex: “I think Jungkook and I might end up together.” “I think you’re delulu.”)

Translation: Personal Jesus

In the movie Jurassic Park, Jeff Goldblum’s character Dr. Malcolm has a prescient line about the morality of bringing dinosaurs back to life: “Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.” Perhaps this same line of logic should be (or should have been) applied to the creation of AI Jesus.

Consider for a moment where we are: there is now a piece of artificial intelligence, available to interact with human beings 24/7, which in some ways claims to speak for God. Sometimes this AI directly refers to itself as Jesus. Other times, as when one user asked if it was the False Christ, it says things like, “It’s important to recognize that I am not the real Jesus, but an AI inspired by His teachings, to serve and assist.”

Reese Leysen is a co-founder of The Singularity Group, which created AI Jesus. Leysen says that his organization’s goal is to help create AI that can “reason.” The creation of AI Jesus gave the group a chance to demonstrate their progress toward this goal, given that this AI can remember previous interactions with users. In other words, to The Singularity Group, this project was basically just a means to the end of showing off their technical abilities.

We encourage parents and caring adults to have intentional conversations about the trustworthiness of AI responses, particularly around moral, spiritual, and relational issues. Whether the conversation is with ask_jesus, ChatGPT, or My AI on Snapchat, ethics and morality stem naturally from worldview, and it isn’t clear what kind of worldview these tools have. AI systems and the scientists who create them are also still learning, and the technology can misinterpret important cues in conversation. This sort of artificial intelligence has the potential to fundamentally alter the fabric of reality, and it deserves serious attention.

Here are some questions to spark conversation about it:

  • Have you ever asked an AI a question about morality, faith, or relationships? If so, how did it respond?
  • If an AI gave you advice, how likely would you be to trust it? Why?
  • Have you ever had an interaction with AI where it encouraged or normalized something that seemed wrong?