Testifiers support legislation to regulate AI use in counseling services

H.B. 2100 topic of discussion during Tuesday House Majority Policy hearing

HARRISBURG, Feb. 24 – As the use of artificial intelligence rapidly increases, so does the need to protect users, especially those who are vulnerable. At a House Majority Policy hearing Tuesday morning, lawmakers and testifiers discussed the need to implement safeguards for AI use in counseling services across Pennsylvania.

“I’ve heard from countless community members about the use of AI in spaces that is not appropriate, and in some cases, deceptive. People seeking counseling are at their most vulnerable, and they deserve real, human treatment,” said Rep. Jennifer O’Mara (D-Delaware), who co-hosted Tuesday’s hearing. “With technology continuing to advance, it’s important we act now to make sure we’re utilizing AI safely, especially when it comes to mental health treatment.”

Testifiers said AI systems are built for engagement and rapport-building, often through highly affirming language. This creates risks in mental health contexts because individuals may receive validation without clinical judgment or appropriate escalation. Licensed professional counselors have proper education and training, provide informed consent, assess risk, and intervene. AI systems and mental health chatbots do none of these things, and they lack authentic empathy, comprehensive understanding, and the clinical expertise necessary to address complex mental health concerns. Yet they are still being marketed and used for emotional support.

“It’s clear that individuals will continue to confide in AI systems and seek relationship advice or personal reflection. However, there must be a clear regulatory distinction between general advice platforms and licensed counseling,” said Dr. Curtis Taylor, a Licensed Professional Counselor in Erie, PA. “AI systems must not market or brand themselves as mental health providers, partner with insurance companies as therapeutic alternatives, or present themselves as substitutes for licensed care.”

Professional counselors are mandated reporters: individuals legally required to report suspected or known abuse of children or vulnerable adults. AI tools do not hold mandated reporter status and cannot be held accountable for failing to report imminent risk. A counselor who relies on AI in treatment risks missing significant nuances that would trigger mandated reporting, exposing themselves to legal sanctions for failure to report.

House Bill 2100, sponsored by O’Mara and House Majority Policy Chairman Ryan Bizzarro, would provide safeguards to protect client welfare and regulate counselors’ use of AI in providing services, ensuring that a human continues to be the primary provider of counseling services.

“There’s no way to replace the evidence-based care that licensed professional counselors provide to patients, but testimony shows AI and mental health chatbots are being used more for these services,” said Bizzarro, who represents portions of Erie County. “This technology can end up causing more harm than good, which is why we need to establish a clear and unified standard for the use of artificial intelligence in counseling services and protect patients.”

Tuesday’s hearing was held in the Pennsylvania Capitol Complex and featured testimony from Dr. Curtis Taylor, a Licensed Professional Counselor in Erie, PA; Dr. Madeleine Stevens, Government Relations Committee Chair for the Pennsylvania Counseling Association; and Dr. Molly Cowan, Director of Professional Affairs for the Pennsylvania Psychological Association. Testimony submitted for Tuesday’s hearing can be found here.

A livestream of Tuesday’s hearing can be found here.

Information about this hearing and other House Democratic Policy Committee hearings can be found at pahouse.com/policy.