Dystopian science fiction movies are always fun, in part because they exist just on the edge of what is possible. Movies like Blade Runner, Ex Machina, and Her portray a future that seems not so far from our own, a future where people are bossing realistic AI robots around and falling in love with them. They make AI look dangerous but enticing—as though if society could only thread the needle and avoid total annihilation, then our true potential as a species would be unlocked.
This was the kind of thing we could daydream about until November 2022, when OpenAI launched ChatGPT, and it started to become true. But AI didn’t just take over our dirty jobs, the way the replicants do in Blade Runner or Samantha does in Her; it took over the arts. The Writers Guild of America and SAG-AFTRA both went on strike in 2023 in part to keep AI from taking over their jobs. And now, in another twist of fate that no one saw coming (or ever really wanted), developers are creating AI psychotherapists.
I sat down with Jordan Conrad, a psychotherapist and the founder and clinical director of Madison Park Psychotherapy in New York City, to discuss whether this is just hype or whether AI therapists are really something to look forward to.

Dr. Conrad is just the right person to talk to. “My PhD is actually in philosophy,” he tells me as we sit down. He has those friendly-but-serious eyes that therapists sometimes have and talks in a calming tone that makes it easy to listen to him, even as our conversation moves to harder topics. “The word ‘crisis’ gets thrown around a lot these days, but there is a real problem in mental health worldwide,” he says. “AI has a lot of promise: to reach people in geographically isolated places or those where there just aren’t many therapists.”
The problem is whether AI provides a therapy worth wanting. Many AI psychotherapy developers tout high-tech features like mood trackers that can supposedly predict psychotic episodes before they even happen, but none of these features are actually on the market yet. “Worse yet,” he explains, “most mental health apps have shoddy safety information or, in some cases, none at all.” It is almost hard to believe, after decades of science fiction, that safety wasn’t a top priority, but then I remember: greed. Ah, right…
I ask whether the safety problems can be managed by using one of these apps alongside a normal human therapist. “That might help reduce some of the problems, but professional organizations haven’t provided much guidance on how to incorporate these into our normal practice.”
Dr. Conrad is right. BetterHelp was fined $7.8 million by the Federal Trade Commission for disclosing users’ therapy histories to Facebook, revealing the email addresses of former users to Snapchat, and disclosing visitors’ email addresses to Criteo and Pinterest. Dr. Conrad points out that “the FTC penalized BetterHelp not because it violated HIPAA” (the Health Insurance Portability and Accountability Act, which is supposed to protect patients from exactly this kind of disclosure). “BetterHelp isn’t even covered by HIPAA. It was fined because of false advertising.”
“The fact that the FTC is in charge of this just shows that we don’t yet have the kind of legal apparatus we need in place to regulate AI psychotherapy,” he says.
I confess to him that this seems worrisome… for now. We’re talking about billions of dollars and some very smart people—won’t they figure out the regulatory oversight stuff at some point? “Of course. The practical issues will get ironed out. The problems with using AI for psychotherapy are much deeper.”
It is hard not to feel like we are in The Pelican Brief, discussing important state secrets, as we huddle over the table at a café in midtown. “Philosophers, computer scientists, and neuroscientists have come up against what appears to be an insurmountable issue—they call it ‘the Hard Problem.’” Dr. Conrad explains that, try as they might, they haven’t been able to program feelings into machines. And it’s not that developers haven’t cracked the code yet—it’s that it might not be possible to crack. “There just is no way to derive qualitative experience from purely physical stuff,” he says. “We know that it happens with human brains somehow, but we don’t know how, so we don’t know how to replicate it in our machines. What we do know for certain, though, is that LLMs [large language models] don’t have experiences; they aren’t designed for that kind of thing.”
This creates two problems: first, most psychotherapy apps use some form of cognitive behavioral therapy (CBT)—an evidence-based model shown to be highly effective… some of the time. “There is no inherent problem with CBT; it’s just that all outcome studies show some sizeable proportion of people who aren’t helped by the treatment. This is true for CBT just the same as it is for any medication.” For Dr. Conrad, who combines CBT and psychodynamic psychotherapy, programming these apps with only one psychotherapeutic modality is worrisome. “If the advantage of AI psychotherapists is that they can reach treatment deserts, then the fact that they predominantly use CBT means that those conditions CBT cannot treat will disproportionately occur in those areas. That means that wealthy people with access to other treatments won’t have certain conditions that reliably show up in economically disadvantaged places—if anything could widen the already too-large wealth inequality gap, this could.”
The other problem comes from the user side of things. “Who would want to see a therapist who isn’t able to think or feel anything at all?” he asks me. “Imagine going to a trauma support group and finding out an hour into the session that everyone else there is AI!” I look around the café with a newfound suspicion. Dr. Conrad laughs. “We’re not there yet.” I’m not reassured.
Jordan Conrad, PhD, LCSW, is a researcher and psychotherapist in NYC, as well as the founder and clinical director of Madison Park Psychotherapy in Manhattan. His research has appeared in the Journal of Medical Ethics, the Journal of Contemporary Psychotherapy, the American Journal of Psychotherapy, and many other leading journals in the field, as well as in several Oxford Handbooks. You can find Madison Park Psychotherapy here.
The Paste editorial staff was not involved in the creation of this content.