The missing piece: Technology and mental health

In the mental health field, divisions are growing over a hot topic: the role of technology and artificial intelligence in treatment.
There is a common perception in the tech world that therapists are resistant to change or anti-innovation. We are here to set the record straight.
If you're building these tools, investing in them, or want to create the next great thing that will change your industry, we've got some inside information for you.
Therapists and mental health professionals are not afraid of technological advancement, but we have good reason to worry about the consequences when that advancement happens without us.
Therapy is not just another industry in need of modernization.
Therapists face a crossroads, often divided over whether artificial intelligence in mental health will exacerbate burnout and workforce turnover issues or ultimately support clinicians and enhance care.
In our experience, the greatest indicator of a new therapy tool's success or safety is whether experienced therapists were involved in its development, ensuring that the compliance and safety standards we are accountable for are built in from the product's conception, not introduced when we are recruited as testers before release. We need tools that match who we are: independent, autonomous, and diverse.
If technology reshapes treatments without understanding them, we risk solving the wrong problems or, even worse, creating new ones.
The hardest part of therapy isn't what you think
When we started our practice, we did everything right.
We researched the best tools, interviewed experts, hired exceptional therapists, and built a strong administrative support team. We should have thrived, but we got stuck.
We kept hitting the same problem, one that threatened to shut us down and disrupt our clients' care. It wasn't clinician burnout, client trauma, or overwhelming emotional work. It was unpaid administrative labor.
It turns out this isn't just our problem; it's a problem across the healthcare industry. Approximately 40% of the tasks required to ethically and safely care for clients are unpaid, non-reimbursable, and deemed non-essential by the industry.
However, the current market views the ongoing mental health crisis across the country as a productivity or workflow issue.
According to the Bureau of Labor Statistics, 54 percent of graduates with a master's degree in mental health counseling never obtain a license, despite spending up to seven years in school and completing several years of graduate study under rigorous supervision.
We sincerely believe that many of the people building mental health technology have good intentions. They want to help.
However, most technology tools are designed with the payer or consumer in mind, which creates a gap that is not only inconvenient, but also very unsafe.
We’re seeing a pattern in the industry where tools either fail to roll out, frustrate clinicians hoping to adopt them, or end up disappointing the customers they’re trying to reach. What many of these teams have in common is that experienced professionals are not really involved in building the solution.
Look closely: how many of the companies promising to solve the mental health crisis actually involve therapists who have worked with clients on the front lines and can say what they need to treat clients successfully?
Some may have a provider or two in leadership, but many, if not most, have decided they know what professionals want and need without talking to professionals themselves. Others hire therapists for feedback, not development. Experts are brought in last, not first.
In other words, the experience and training of experts in the field are treated as optional to solving the problem.
Risk, or the things we can't get wrong
If there's one thing we can guarantee happens in every therapy session, every day, across the country, it's this: Clinicians are constantly assessing the safety of their clients.
Above all, our responsibility and mission is to keep our clients safe. That means not only safety in the moment, but also safe physical and emotional relationships, safe boundaries, and safe experiences.
Protecting client information matters just as much. No company and no technology can absorb the consequences of a single breach or a single mistake. It is our license, our profession, and ultimately our responsibility.
Therapists are held responsible for what we can and cannot control, for what we know and what we should have known to prevent harm, even when we cannot control the technology we use.
Do you know of any other industries that adhere to this standard?
Other professional fields, such as medicine or aviation, also carry serious personal liability, but those professions generally have complete systems in place to spread the risk. In mental health, the therapist often is the system: clinician, risk manager, compliance officer, and confidentiality gatekeeper, all while taking personal responsibility for tools they did not build or control.
When technology fails, the responsibility does not fall on the software, the company, or the people who built it.
In mental health, the therapist alone holds this responsibility.
What kind of future are we creating without the therapist's voice?
Ultimately, what keeps us up at night is this: Once these tools are refined to reduce costs, could they ultimately be used to determine that some people simply don't deserve treatment?
Somewhere in a boardroom, technology leaders are discussing whether chatbots can replace human therapists. This is not new; most systems already have processes in place to identify individuals who are "not symptomatic enough" to justify the cost of care. Framing this as cost savings rather than prioritizing patient care doesn't just ask the wrong questions; it advances solutions based on flawed risk assessments that carry significant real-life consequences.
By involving experienced therapists and mental health management experts from the outset of new technology development, we not only build more effective tools but also build trust and confidence among clinicians. This early investment allows us to proactively address issues around safety, compliance, and liability, ensuring technology meets real-world needs.
By doing so, we can both equip therapists with tools that support their practice and prevent dangerous errors in otherwise well-intentioned technologies.
When our therapists say this is safe, this is trustworthy, this is useful, then we have marketable technology that therapists want to implement into their practice.
We’re not saying don’t build it, we’re saying build it with us.
Therapists are not resistant to modernization; we are guardians of safety and trust. We want tools that move us forward, and we want to shape them. If the future of therapy is technology that protects clinicians' time, voice, and autonomy, then we all win.
Photo credit: Olga Strelnikova, Getty Images
Kira Torre, LMFT, and Emily Daubenmire, CPC, co-founders of a mental health group practice, have a simple mission: Prioritizing therapists means prioritizing patient care. Working at the intersection of clinical practice, operational leadership and digital health innovation, they bring a unique perspective to the next generation of mental health care and work together to advocate for ethical technology in behavioral health.
This article appeared through the MedCity Influencers program. Anyone can share their thoughts on healthcare business and innovation on MedCity News through MedCity Influencers. Click here to learn how.