Embracing a Human-Centered Future with AI in Mental Health
- Dec 16, 2025
- 4 min read
Updated: Feb 3
The Journey of AI in Education and Mental Health
It's like a recurring dream. Or like being Bill Murray in Groundhog Day! I've definitely been here before. It was before transformers were dinner-table conversation, before people casually mentioned their ChatGPT or personal AI. But the questions I had were exactly the same.
When I worked on IBM Watson Education, we were building what we called a teacher companion. The idea was simple: help teachers understand whether students were actually learning. We wanted to know whether students were truly getting it, not just passing tests or complying. What happened in those boardrooms, teacher trainings, and offices is something I still think about. Why weren't we able to solve the puzzle of personalized learning?
For better or worse, systems require standardization to see progress. Data needs structure. Patterns need baselines. Without some shared grammar, machines can’t tell whether something is improving, stagnating, or quietly falling apart. The problem is that no two humans are standard.
The Complexity of Human Experience
Every learner carries a different story. Each has different fears and different ways of expressing confusion or curiosity. Education forced us to confront this paradox early on: how do you measure growth without flattening the human being you’re meant to serve? Now that same paradox has arrived at my doorstep again, this time wearing the face of mental health.
The world feels heavier than it did a decade ago. Job loss is a lived reality. Purpose, once braided tightly to work and identity, is unraveling for millions of people at once. Loneliness is so prevalent that it almost feels like a marketing buzzword. Mental health professionals are exhausted, overwhelmed, and too small a cavalry to meet the scale of collective grief we're carrying.
So when I was recently asked to advise a startup building a therapist companion, I felt it in my soul. I didn't say it couldn't be done, but I did pause.
The Role of Technology in Mental Health
As a mom of young men, I know that technology has already been intervening in our mental health for years, mostly in ways designed to exploit rather than help. By treating attention as "all we need" to extract dollars or data, we reduce ourselves to personas that can be sold a bunch of meaningless stuff. We erode our communities and personal meaning. We are already living inside the psychological consequences of poorly imagined systems.
For me, the real question isn’t whether AI belongs anywhere near mental health. It’s whether we are brave enough to imagine it differently.
Rethinking the Therapist Companion
A therapist companion should never diagnose. It should never prescribe. And it should certainly never pretend to be human. What it can do, and what machines are uniquely capable of doing, is noticing patterns over time.
Just as Watson once traced learning trajectories, an AI companion could trace emotional ones, acting as a witness and repository of remembrance. The AI shouldn't be asking “Are you normal?” but helping the therapist consider whether you are changing. Instead of focusing on the binary of “Are you healthy?” what if it prompted the therapist to consider if you are moving toward yourself or away?
Mental health is rhythmic, cyclical, contextual, and it is never static. It is rare for a human to break all at once. We usually drift, constrict, and lose color slowly—then all at once. Machines could, when designed with care, notice those shifts long before a crisis erupts. They could hold memory without fatigue. They could see subtle changes in language, tone, cadence, and imagination that even the most attentive human might miss.
Instead of diagnosis, it’s pattern recognition in service of care. The model we actually need doesn't compare people to population averages. It compares a person to themselves, over time. It asks whether curiosity is returning or receding and whether language is growing more spacious or more brittle. And importantly, whether the future is being spoken about at all.
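A toy sketch of what that self-referential pattern recognition could look like, assuming journal-style text entries: each entry is scored against the writer's own rolling baseline, never a population norm, and a deviation is surfaced only when a feature drifts sharply from that personal history. The feature names, future-tense markers, window size, and z-score threshold here are all illustrative assumptions, not a validated clinical instrument.

```python
# Minimal sketch: flag entries that deviate from the person's OWN recent
# baseline (rolling window), rather than comparing them to anyone else.
# Features and thresholds are illustrative assumptions only.
from statistics import mean, stdev

# Hypothetical markers for "is the future being spoken about at all?"
FUTURE_MARKERS = {"will", "tomorrow", "plan", "hope", "next"}

def features(entry: str) -> dict:
    """Extract two crude per-entry signals: expressiveness and future-talk."""
    words = entry.lower().split()
    return {
        "length": len(words),  # proxy for how spacious the language is
        "future": sum(w.strip(".,!?") in FUTURE_MARKERS for w in words),
    }

def drift_flags(entries: list[str], window: int = 5, z_cut: float = 2.0) -> list[dict]:
    """For each entry, return {feature: z-score} for features that drift
    sharply from this person's own recent baseline. Empty dict = no drift."""
    history: list[dict] = []
    flags: list[dict] = []
    for entry in entries:
        f = features(entry)
        recent = history[-window:]
        flagged = {}
        for name, value in f.items():
            vals = [h[name] for h in recent]
            # Need a few prior entries, with some variation, before judging.
            if len(vals) >= 3 and stdev(vals) > 0:
                z = (value - mean(vals)) / stdev(vals)
                if abs(z) >= z_cut:
                    flagged[name] = round(z, 2)
        flags.append(flagged)
        history.append(f)
    return flags
```

The design choice that matters is in `drift_flags`: the baseline is rebuilt from the writer's own trailing window each time, so "normal" always means "normal for this person, lately," and a sudden shrinking of language or a disappearance of future-tense talk shows up as a negative z-score against their own history.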
The Dignity of Individual Growth
In education, we learned that the system didn’t need to know what a “good student” was. It needed to know whether this student was moving forward on their path. Mental health deserves the same dignity.
Yes, data still needs structure. But structure doesn’t have to mean erasure. We can standardize change detection without standardizing what it means to be human. Progress can be personal and still legible. Subjective and still meaningful.
I love a good challenge. Like many humans, I’m pretty good at pattern recognition. But humans burn out and miss things. We carry our own projections and limits. AI doesn’t get tired of noticing. And that is useful.
Expanding the Role of Care
If designed with intention, AI doesn’t replace care. It expands it. It gives human therapists, coaches, and community caregivers more room to do what only humans can do: presence, intuition, ethical discernment, and compassion.
But this only works if we resist the urge to optimize the soul. A therapist companion should feel less like surveillance and more like a witness. It should feel less like performance tracking and more like being gently remembered. It should ask, again and again, “Does this feel true to you?” and then listen for the answer.
Can Technology Be Used for Good?
Yes, but not accidentally. And not by startups chasing engagement metrics disguised as empathy. This work requires restraint, consent, and cultural humility. It requires humans in the loop and a deep reverence for interior life.
We are living through the consequences of technologies built without asking what they were doing to our nervous systems. Now we have a chance and a responsibility to build something different.
AI cannot give us purpose. But it can help us notice when we’re losing it. It can help us see when we’re finding it again. And it can help us do that together, at a scale our fragile systems desperately need. The future of mental health will not be human or AI.
It will be human with AI.