Religious institutions are increasingly turning to artificial intelligence chatbots to provide spiritual guidance and engage with congregants in unprecedented ways. Cathy, an AI chatbot developed by the ecumenical group TryTank Research Institute, represents one of the most ambitious attempts to merge faith with technology. Built on the large language models that power OpenAI’s ChatGPT and trained on the Book of Common Prayer and the entire Episcopal Church website, Cathy aims to serve as a virtual guide for existing and potential church members.
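The article does not describe Cathy’s internals beyond saying it is built on ChatGPT’s models and trained on Episcopal source texts. As a rough illustration of how such a grounded chatbot is commonly wired up, here is a minimal sketch using the OpenAI Python client; the function names, system prompt, and model choice are illustrative assumptions, not TryTank’s actual implementation.

```python
# Minimal sketch of a source-grounded chatbot in the style the article
# describes for Cathy. Everything below is an illustrative assumption:
# the article does not disclose TryTank's prompts, models, or code.

def build_messages(question: str, documents: list[str]) -> list[dict]:
    """Assemble a chat payload that grounds the model in supplied source texts."""
    context = "\n\n".join(documents)
    system = (
        "You are a guide for the Episcopal Church. Answer only from the "
        "excerpts below, and say 'I don't know' when they are silent.\n\n"
        + context
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]


def ask(question: str, documents: list[str], model: str = "gpt-4o") -> str:
    """Send the grounded question to OpenAI's chat API (needs OPENAI_API_KEY)."""
    from openai import OpenAI  # lazy import: requires the openai package

    client = OpenAI()
    response = client.chat.completions.create(
        model=model,
        messages=build_messages(question, documents),
    )
    return response.choices[0].message.content
```

In practice a production system would retrieve only the most relevant passages per question rather than stuffing whole documents into the prompt, but the grounding pattern is the same.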
Rev. Lorenzo Lebrija, founding director of TryTank and an Episcopal priest, describes Cathy as an “innovative approach to leveraging technology in support of spiritual exploration.” The bot is designed to translate biblical concepts into relatable language for younger audiences and assist priests with tasks like building sermon outlines. Rev. C. Andrew Doyle of the Episcopal Diocese of Texas sees this as “an opportunity for the church to engage in ways it never has engaged before.”
Cathy is part of a broader trend of faith-based AI chatbots across multiple denominations, including Text With Jesus, Buddhabot, Chatbot Eli, Gita GPT, and QuranGPT. While AI has proven valuable for religious scholarship, particularly in accelerating translations of ancient texts, using it for ministerial work presents significant challenges. The Catholic evangelization group Catholic Answers faced backlash when its chatbot “Father Justin” provided nonsensical answers, such as suggesting Gatorade could be used in baptismal fonts, leading to its rebranding as a lay theologian rather than a priest.
When tested with real-world scenarios, Cathy demonstrated both capabilities and limitations. The bot handled doctrinal questions adequately and maintained appropriate boundaries when asked controversial questions. However, when confronted with deeply personal matters such as social anxiety and grief, Cathy’s responses felt sterile and impersonal, lacking the empathy and presence that human clergy provide.
Experts raise concerns about the fundamental limitations of AI in spiritual care. Thomas Telving, a robot ethicist, argues that “chat is a very poor replacement for a real priest” because people in crisis need not just answers but “a presence.” Richard Zhang from Google DeepMind suggests that effective religious AI would need to acknowledge uncertainty by saying “I don’t know” occasionally. Baptist theologian Joshua K. Smith warns that “we will always see ourselves inside the machine” and questions whether technology should be expected to make spiritual life easier. The consensus among theologians is that while AI chatbots can serve as knowledge repositories, they cannot replace the human connection essential to spiritual nourishment.
Key Quotes
Cathy represents our innovative approach to leveraging technology in support of spiritual exploration
Rev. Lorenzo Lebrija, founding director of TryTank Research Institute and Episcopal priest, explains the vision behind creating an AI chatbot for spiritual guidance, positioning it as a new frontier for religious engagement.
This is an opportunity for the church to engage in ways it never has engaged before
Rev. C. Andrew Doyle of the Episcopal Diocese of Texas, who was not involved in creating Cathy, expresses optimism about AI’s potential to expand how religious institutions connect with people seeking spiritual guidance.
Chat is a very poor replacement for a real priest. Technically, it may be able to answer correctly, but if you need to talk to a priest you are likely to be in a sort of crisis or spiritual need, and if so, you do not only seek answers but also a presence
Thomas Telving, a technologist and robot ethicist, articulates the fundamental limitation of AI chatbots in spiritual care—their inability to provide the human presence that people in crisis genuinely need.
We will always see ourselves inside the machine. It is not the tech that leads us astray, it is the desires behind why we create said technology and what hopes we put upon its synthetic shoulders
Joshua K. Smith, a Baptist theologian, offers a philosophical perspective on religious AI, suggesting that the technology reflects human desires and expectations rather than possessing inherent spiritual capabilities.
Our Take
The experiment with religious AI chatbots reveals a crucial insight about artificial intelligence: technical competence doesn’t equal emotional intelligence. While Cathy can access theological databases and generate grammatically correct responses, it cannot witness suffering or offer genuine comfort—qualities that define effective spiritual care. This case study should serve as a cautionary tale for industries rushing to deploy AI in sensitive contexts. The technology works well for information retrieval and routine tasks, but struggles profoundly when authentic human connection matters most. As Sam Altman noted, we shouldn’t anthropomorphize AI systems, yet religious chatbots inherently require human-like qualities to be effective. This paradox suggests that some domains—particularly those involving grief, crisis, and existential questions—may be fundamentally incompatible with AI automation. The future likely lies not in replacing clergy with chatbots, but in using AI to handle administrative tasks while preserving human interaction for meaningful spiritual engagement.
Why This Matters
The emergence of AI-powered spiritual guidance tools represents a significant expansion of artificial intelligence into one of humanity’s most intimate domains: religious faith and spiritual counseling. This development matters because it tests the boundaries of what AI can and should do, raising fundamental questions about empathy, consciousness, and human connection in an increasingly automated world.
For the AI industry, religious chatbots demonstrate both the technology’s versatility and its limitations. While large language models can process vast amounts of theological text and provide factually accurate responses, they struggle with the emotional intelligence and authentic presence that define effective pastoral care. This highlights a critical challenge as AI systems become more prevalent: the difference between simulated empathy and genuine human understanding.
The broader implications extend to healthcare, mental health services, education, and other fields where human connection is paramount. As organizations across sectors consider deploying AI for sensitive interactions, the religious chatbot experiment offers valuable lessons about where automation enhances human capability versus where it falls short. The technology’s inability to provide meaningful comfort in moments of grief or crisis underscores that some human needs cannot be met by algorithms alone, regardless of how sophisticated they become.
Recommended Reading
For those interested in learning more about artificial intelligence, machine learning, and effective AI communication, here are some excellent resources: