By Dr. Madeline Goodman, Psychiatrist & Psychotherapist
In the past few years, artificial intelligence (AI) has moved from a buzzword to a powerful presence in healthcare—and psychiatry is no exception. From chatbots offering mental health support to algorithms screening for depression, AI tools are being hailed as the future. But as someone who practices both psychiatry and psychotherapy, I see both the potential and the limits of AI in mental health care.
What AI Can Do in Psychiatry
AI can be remarkably helpful in specific ways:
- Screening and early detection: Algorithms can flag symptoms of anxiety, depression, or even psychosis by analyzing speech patterns, facial expressions, or written language.
- Support tools: Chatbots and apps like Woebot offer basic cognitive behavioral techniques, which some people find helpful between therapy sessions.
- Administrative support: AI can assist clinicians by summarizing notes, organizing data, and predicting medication responses—reducing our time spent on paperwork.
- Access expansion: In underserved areas, AI-powered platforms can provide some mental health support where human clinicians are scarce.
These are all meaningful contributions, especially in a system where access and time are limited. But psychiatry is more than pattern recognition.
What AI Can’t Replace
Efficient as AI may be, it lacks emotional depth. Psychiatry is not just about diagnosing and prescribing; it is about understanding the human story:
- The therapeutic relationship: Healing often happens in the relationship itself—through being seen, heard, and understood by another person. This is something no algorithm can replicate.
- Complex emotions and nuance: Human distress is rarely neat. A person may present with anxiety that is also grief, that is also trauma, that is also existential. Understanding that layering takes attunement, not coding.
- Unconscious processes: As someone trained in psychodynamic therapy and dream work, I believe that unconscious patterns shape behavior in ways AI can’t access. Therapy often means sitting with uncertainty and ambiguity—territory machines aren’t built for.
How I Use (and Don’t Use) Technology in My Practice
I value useful tools. I offer telehealth, I keep up with emerging technologies, and I understand the appeal of convenience. But I’ve also seen what’s lost when care becomes too transactional.
My practice offers something different: personalized, attentive, and deeply human care. Whether you’re navigating midlife transitions, relationship changes, or long-standing patterns, I create a space for exploration that goes beyond symptoms.
The Future: Human + AI, Not Human vs. AI
I’m not anti-AI—I think it has a role. But I believe the future of psychiatry is not about replacing clinicians. It’s about supporting them, so we can focus on what machines can’t do: build relationships, tolerate uncertainty, and sit with pain compassionately.
If you’re looking for mental health care that combines depth, professionalism, and a human touch, you’re in the right place.