ChatGPT was initially released for public use on November 30, 2022. Five days later, over one million users had already explored the bounds of this new and fascinating tool.[1] The release of ChatGPT was a major milestone in the development of artificial intelligence (AI), and particularly of what have been labeled "large language models" (LLMs), a type of AI that interacts through human language in "chatbot" formats.

[1] Bernard Marr, "A Short History of ChatGPT: How We Got to Where We Are Today," Forbes.com, May 19, 2023, https://www.forbes.com/sites/bernardmarr/2023/05/19/a-short-history-of-chatgpt-how-we-got-to-where-we-are-today/.
These AI chatbots are powerful tools. Using one is like conversing with someone who can mine the depths of the internet in a matter of seconds. If you have a question that you want to research by exploring what is publicly available online, you no longer need to skim through endless pages of content. Simply ask ChatGPT for a summary of this information, and it will be collated for you instantly. Not only that, but AI chatbots can also generate new content, such as writing computer code, essays, and counseling case plans, and they do so with striking consistency, clarity, and cogency. It’s no wonder that so many people are turning to these tools to expedite their work in both the gathering and synthesizing of information.
However, it’s no secret that AI also comes with many concerns across a broad spectrum, ranging from ethical issues like plagiarism to fears of job losses to automation. Somewhere on this continuum lies the relevance of AI to biblical counseling.[2] Although some are more optimistic about the use of AI in therapy, there are also very serious concerns. Biblical counselors ought to be wary of the possible uses and misuses of this technology, not shying away from dealing with such pressing topics.

[2] AI use for therapy is already a concern in the secular world. For instance, there have been multiple lawsuits against Character.AI alleging that the chatbot encouraged its users to take their own lives, and in at least one case the user did die by suicide: https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists#:~:text=APA%20met%20with%20federal%20regulators,director%20of%20health%20care%20innovation. In addition, there have been disturbing reports of chatbots glamorizing self-harm and encouraging minors to kill their parents: https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit.
The question can be stated simply, and you may have already considered it: should I, as a biblical counselor, use ChatGPT in my preparation for counseling, or even for quick searches during a session? In this article, I identify three concerns to weigh before using ChatGPT (or similar AI platforms) in your biblical counseling ministry: authority, authenticity, and application.
Caveat
A word of clarification is in order up front. It is not my position that the use of AI tools is always wrong. In fact, tools like ChatGPT may be particularly helpful for information gathering and research, acting as a kind of research assistant. This article focuses primarily on the use of AI tools in counseling content creation, such as relying on ChatGPT to create case plans, session agendas, and homework assignments. It should also be noted that as AI use increases, there is a real possibility of AI tools serving as functional co-counselors during a session. If you get stuck on an issue while counseling, what’s wrong with turning to a chatbot mid-session for some additional pointers? This article seeks to start a conversation on such questions.
Consideration #1: Authority
The first concern that warrants our consideration is the issue of authority. Since biblical counseling is the personal ministry of the Word, we must ask who is responsible for ministering God’s Word. Scripture lays out two distinct groups of people who are responsible for this task. First, God has given pastors/elders to the church to teach and preach the Word (Ephesians 4:11; 1 Timothy 3:1-2). These are men who are uniquely gifted, called, qualified, and affirmed by the church as capable of serving in the position of teacher and overseer. This position comes with the sober responsibility of caring for the souls in their church (Hebrews 13:17), and such men will be held to a higher standard as teachers (James 3:1).
Second, Scripture speaks of the responsibility of all believers to care for one another and minister God’s Word to one another. In Romans 15:14, Paul commends the Roman believers as “filled with all knowledge and able to instruct one another.” Elsewhere, Paul instructs believers to “admonish the idle, encourage the fainthearted, help the weak, be patient with them all” (1 Thessalonians 5:14). Many other “one another” verses likewise demonstrate how all believers ought to care for and minister the Word to one another.[3]

[3] For more, see Stuart Scott and Andrew Jin, “31 Ways to Be a ‘One-Another’ Christian: Loving Others with the Love of Jesus.”
It is not enough simply to have someone teach God’s Word—Scripture lays out who should teach in the church. This means we should be thoughtful about where people go to receive counsel from God’s Word. Is it wise for AI chatbots to be the source of our session agendas and case plans when the responsibility for the ministry of God’s Word rests on pastors and the saints? Tools like ChatGPT are essentially an accumulation of information (and misinformation) from the internet, filtered through the biases of their very human programmers, and we have no reason to believe those programmers intentionally built a biblical bias into them. So it is reasonable to conclude that ChatGPT falls outside the bounds of authority for the ministry of God’s Word: it is neither a pastor nor a regenerate believer (nor a person at all!), and it is shaped by a decidedly secular bias. Therefore, biblical counselors ought to be wary, on the basis of authority, about using AI to create their counseling content.
Consideration #2: Authenticity
If a biblical counselor relies on ChatGPT to help create case plans and session agendas, the issue of authority is not the only concern that needs attention. The second consideration is authenticity: is using AI to create your own content simply plagiarism? This is a fair question, and one often asked in educational institutions. It is difficult to give a straightforward answer, and different institutions have come to different conclusions. For example, Princeton University permits AI use in students’ coursework under certain circumstances, such as when the professor allows it and the student discloses that AI was used in the assignment.[4]

[4] “Rights, Rules, Responsibilities,” Princeton.edu, https://rrr.princeton.edu/students-and-university/24-academic-regulations.
A biblical counseling ministry, however, is not an institution of higher education. Even so, honesty and integrity remain of utmost importance. If a pastor stands up in the pulpit on a Sunday morning and begins reading a sermon manuscript written by an AI chatbot, his congregation should get up and walk out, because it is the pastor who bears the responsibility for teaching and shepherding the flock (John 21:15; 2 Timothy 4:2; 1 Peter 5:1-2). We should not hold the personal ministry of God’s Word in counseling to a lower standard of integrity, settling for words of counsel typed up by a chatbot, even if they sound consistent with Scripture. The counselor must not only be qualified to teach from God’s Word but also be trustworthy, maintaining integrity about the source of the counsel offered.
In Proverbs 25:11-13, Solomon describes a timely word and a wise reproof as blessings to the hearer, like “apples of gold in a setting of silver” (verse 11). But in verse 14, he warns that a man who boasts of a gift he does not give is like clouds and wind that yield no rain. A biblical counselor who has a timely (and biblical!) word blesses the counselee. However, one who depends on an AI chatbot to generate counsel will inevitably be like rain clouds that fail to water the earth. The counselee believes he is coming to the counselor to hear the counsel of God’s Word, but what he receives may in fact be untrustworthy and unbiblical information generated by AI. This exposes the concern of authenticity: does the biblical counselor’s use of AI inadvertently lead to mixed streams of counsel, not the pure water of the Word, and does it do so without the counselee’s awareness?
Consideration #3: Application
The concern of authenticity, specifically honesty and integrity, is central to the discussion of using AI in counseling. And for the biblical counselor, the issue of authority is of utmost importance in establishing the legitimacy of every counseling practice. But the third consideration requiring our attention is application. I want to offer two practical thoughts to set our trajectory as we discern how, or whether, to use AI in biblical counseling.
The first point of application is keeping the aim of biblical counseling before us. Our aim is not to provide symptom relief or a therapeutic experience; ultimately, the goal of biblical counseling is sanctification. In 1 Thessalonians 4:3a, Paul succinctly writes, “This is the will of God, your sanctification.” To Timothy, Paul said, “All Scripture is breathed out by God and profitable for teaching, for reproof, for correction, and for training in righteousness, that the man of God may be complete, equipped for every good work” (2 Timothy 3:16-17). The goal for a biblical counseling ministry is always the salvation of souls—which means sanctification in the life of the believer. And according to 2 Timothy 3:16-17, sanctification comes by the sufficient Word of God. Therefore, the biblical counselor’s unwavering focus must be on listening to what God has said, not what ChatGPT says. It requires more effort to digest God’s Word than to ingest a dialogue with a chatbot, but the difference is that only one of them is the bread of life leading to godliness (see Matthew 4:4; 2 Peter 1:3).
Second, it is incumbent upon the biblical counselor to rightly handle the Word of truth (2 Timothy 2:15). Yes, using ChatGPT saves time, enables us to find information we otherwise would not have found, and challenges us to think more logically; for these reasons, we flock to AI. However, being able to use ChatGPT is not what qualifies someone to minister God’s Word. If you are relying on AI to create case plans or session agendas, or to find quick-fix answers in a session, honestly evaluate whether it is because you lack the ability to handle the Word of truth. All of us are imperfect and must grow in our abilities as counselors, but turning to AI does not aid in this pursuit; in fact, it may be a detriment.
Using AI gives our final product the illusion that we are more capable than we truly are; yet it may actually be hampering us. One study found that participants who used chatbots like ChatGPT to write essays completed the task more easily, but their memory, learning, and critical thinking were negatively impacted.[5] This study corroborates what many of us have already suspected: AI tools like ChatGPT serve as a crutch. They make tasks easier, but they do not help us develop the skills to handle those tasks ourselves. We should be wary of taking shortcuts in the counseling process. Yes, AI may be the fastest way to create a case plan or find the right words to explain a biblical concept, but it may also become a crutch we lean on instead of exercising our skills in discernment and rightly handling God’s Word, distracting us from the goal of sanctification.

[5] Peter Simons, “ChatGPT Weakens Your Ability to Think, MIT Study Finds,” Madinamerica.com, July 5, 2025, https://www.madinamerica.com/2025/07/chatgpt-weakens-your-ability-to-think-mit-study-finds/.
Should I Use AI in Biblical Counseling?
“‘All things are lawful,’ but not all things are helpful. ‘All things are lawful,’ but not all things build up. Let no one seek his own good, but the good of his neighbor” (1 Corinthians 10:23-24). The availability and prevalence of AI tools such as ChatGPT require careful analysis. At the end of the day, these chatbots are tools that can be used for harm or for good. There are clearly unethical uses of AI (such as using these tools to plagiarize), but there are many possible uses of these tools that require us to apply wisdom and discernment. However, remember that “all things are lawful” is not our standard. Instead, we strive to build up, seeking the good of our neighbor. But a real threat remains in the undiscerning use of AI tools for biblical counseling as they can short-circuit the biblical role of the counselor and shortchange the counselee in the process.
The appeal of these AI tools is that they can increase productivity, efficiency, and quality in our work; and since we want to excel as counselors, we can be tempted to use AI to achieve those ends. But true power and wisdom for the cure and care of souls are found only in the person of Christ (1 Corinthians 1:24). Believers depend on God’s power, the Spirit, not human invention, for the spiritual work of soul care. So, Christian, be faithful to proclaim Christ in the counseling room, that we may present everyone mature in Christ (Colossians 1:28-29), striving for this end, the sanctification of our counselees, with God’s energy that He powerfully works in us.