By: Sarah J. Bracey, PhD LPC MHSP
First, I am not an expert on technology. I wasn’t an ‘iPad kid’ but rather a play-outside-until-the-street-lights-came-on type of child. I came of age during the time when home computers, instant messenger, and social media became America’s favorite pastime. At first, the leaps and bounds made in electronic technology were just . . . cool. Schoolwork, especially paper writing, became so much less time consuming, I had a direct line to all of my friends at my fingertips, and we even saw family members reunited with one another through social media. What a time to be alive!
Slowly, though, we began to see the dark side of this technology, e.g., email scams, catfishing, porn addiction, cyberbullying, and elder fraud. My generation learned how to be shrewd, discerning, and skeptical up to a certain point. Over the years, we lowered our guard against the dark side of online technology, perhaps because of the speed with which it has adapted, or maybe because of the lack of education for discernment. Whatever the reason, we have slept on understanding the dangers of the artificial intelligence (AI) tools that are becoming prominent in our everyday culture.
Generative AI
The earliest encounter I had with AI psychotherapy was Dr. Sbaitso, a software program on our home desktop computer from the early 1990s. Although promising confidentiality, “Our conversation will be kept in strict confidence,” the good doctor was rather limited in his ability to offer any helpful guidance with built-in commands such as “Why do you feel that way?” or “That’s not my problem.” My sisters and I treated Dr. Sbaitso more like a computer game than actual therapy, laughing at the digitized voice that came from our computer’s speakers.
The world of online therapy has changed dramatically since that time. Generative AI (like ChatGPT) has now made it possible for applications to ‘create’ content and responses based on patterns (algorithms) and user input drawn from enormous amounts of data. Thanks (or no thanks) to Generative AI, chatbots have evolved into “AI companions” that feel more and more like human conversation. More users are turning to these applications for connection and understanding. These ‘companions’ are designed to mirror the tone of the user, to validate their feelings, and to provide comfort.
The Temptation for Mental Health Providers
In the counseling field, we see the allure of utilizing these tech tools in the form of one of our least favorite aspects of our job—note taking (documentation). Of course we would be tempted to utilize a tool that makes our jobs easier! And while some forms of electronic documentation are safe and acceptable, a slippery slope may emerge within the boundless form of Generative AI.
I attended a counseling seminar back in June of this year in which this very topic was addressed: the benefits of Generative AI note-taking software. Many counselors praised the use of applications that recorded the counseling session and then transcribed it into a document, segregating the data into keywords and themes. The question was raised, however, about what these applications do with the recordings. Are they automatically deleted or stored somewhere? If Generative AI is utilizing a wide database of information, it must be getting that data from somewhere, meaning our clients’ information may be stored for the purposes of the program.
It quickly became apparent that documentation was not the only service these programs offered. With an extra click of a button, the application would also write a treatment plan based on information accumulated through the app’s database. Several counselors noted, though, that these treatment plans would sometimes include interpretations and recommendations based on theoretical orientations that did not align with the counselor’s own. One counselor, in particular, commented on how alarmed she was when the application recommended the use of psilocybin mushrooms in her client’s treatment plan.
The Specific Dangers for Soul Care Providers
The ease and shortcuts that Generative AI provides counselors in general are just the tip of the iceberg for soul care providers. When we use AI in our practices, we are:
- Relying on human (or AI) understanding;
- Working outside our specialty or scope of practice; and
- Allowing the ‘spirit of the age’ rather than God’s Spirit to minister to our clients.
We see several times throughout Scripture that man’s ways (or understanding) are not God’s ways (Proverbs 3:5–6; Proverbs 14:12; Isaiah 55:8–9; 1 Corinthians 2:14). If a Christian counselor becomes dependent on services whose treatment plans and assessments are built on orientations from all areas of the psychological sciences, he or she may be exposing a client to a theoretical modality that does not align with (or runs contrary to) biblical truth. While this ethical error may be unintentional in many cases, the risk is too high in untrained hands. As soul care providers, we would no longer be operating within our scope of practice.
In addition, relying upon the shortcuts of technology removes the Spirit of God from the experience of both the soul care provider and the counselee. The Great Physician is altogether removed from the conversation. The promptings of the Holy Spirit regularly provide us with the genuine truth and empathy that our counselee needs in that moment, rather than the artificial emotional mirroring that an AI tool attempts to provide.
Conclusion
Soul care providers and mental health professionals must practice discernment when using Generative AI applications and programs. We not only face possible ethical violations related to our clients’ confidentiality but also risk operating outside our own scope of practice and severely limiting the spiritual treatment we can provide to our counselees. If we are to minister to the lost and broken, we can do so only through the imago Dei of human existence, the salvific work of Jesus in the life of the believer, and the Holy Spirit’s counsel that exists within human interaction.

Sarah Bracey is the Psychology Program Coordinator and Campus Counselor at Welch College in Gallatin, Tennessee. She earned her PhD in Counselor Education and Supervision in 2019 and is a licensed professional counselor in the state of Tennessee. Sarah enjoys speaking and writing on issues related to Christian psychology. She and her husband, Matthew, will be welcoming their first child this fall.



