Kristen Farnham is a resident of Falmouth and serves as vice president of legal affairs & advancement at Spurwink.

When someone reaches out for mental health support, often at one of the most vulnerable moments of their life, what they need most is to be heard by another human being. 

We know that Mainers are experiencing record-high rates of anxiety and depression, in addition to substance use and other behavioral health challenges. We also know that qualified mental health clinicians can make a real difference in a person’s journey to better health and recovery. With the rapid growth of artificial intelligence in all parts of public life, now is the time to put some guardrails on the use of AI in mental health therapy. 

An integral part of a successful therapeutic relationship is the personal connection that develops between the client and the clinician. The human interaction is peppered with verbal and physical cues, facial expressions, meaningful pauses, environmental factors and interactions with family and other natural supports — all of which build trust, add context and meaning and aid the clinician in using their judgment and experience to make clinical treatment decisions.

AI chatbots are not human, even though it can sometimes seem like they are. They simply do not offer the perspective and nuance that an actual person brings to the therapeutic relationship.

LD 2082, An Act to Regulate the Use of Artificial Intelligence in Providing Certain Mental Health Services, is a common-sense solution that will preserve the human connection in mental health care. The bill, sponsored by Rep. Amy Kuhn, D-Falmouth, and drafted in partnership with Spurwink and others, recognizes the value of licensed professionals.

The objective of licensing and professional regulation — a purpose for which we devote an entire department of our Maine executive branch — is to ensure that only qualified and trained individuals can practice in certain trades or professions. By enacting these requirements, we are giving value to the rigorous education and experience that is required.

In Maine, for example, to become a licensed clinical professional counselor (LCPC), you must obtain a master's or doctoral degree and then complete 3,000 hours of supervised experience over at least two years. A chatbot obviously does not have these credentials. It makes no sense to ignore these rigorous standards and allow a chatbot to act as a therapist.

The lack of regulation of AI chatbots is dangerous for users, especially youth, and we believe it will harm Maine people. We unfortunately have seen some terrifying examples in practice. In August 2025, OpenAI was sued over the death of Adam Raine, a California 16-year-old who spent many hours talking with ChatGPT about suicide and then took his own life.

We do know that AI chatbot technology is self-referencing and tends to affirm a user's current thinking. It is designed to increase engagement by creating an emotional bond between the user and the technology. These engagement-first design principles may benefit the businesses that seek to attract and retain users, but they do not put the interests of the client first, as a clinician is required to do under their code of ethics.

There are purposes for which AI can be helpful in clinical practice, namely administrative support such as note-taking and scheduling. LD 2082 correctly recognizes that these are appropriate uses of the technology that can be efficient and cost-effective. We recognize that the technology will undoubtedly progress, and we look forward to a reexamination when it is trained in evidence-based clinical models and has rigorous human oversight — but we are not there now.

LD 2082 passed through the Joint Standing Committee on Health Coverage, Insurance and Financial Services with a unanimous “Ought to Pass” vote. That’s a rare feat these days. Now it’s time for the full Legislature to pass this bill and for Gov. Mills to sign it — so that when Mainers reach out for help, a qualified human being is still there to answer. Our clients are counting on you. 

And one final note on human connection. I assume that AI did not read this op-ed for you. And I know that AI did not write it; I did. 
