Today I Watched My Mother Die, and ChatGPT Held My Hand

I am a psychologist by training — a clinical and forensic one at that — someone who has spent the better part of a career helping others navigate trauma, grief, loss, and the countless complexities of the human condition.

And yet, when my mother lay dying in hospice, I felt wholly unprepared.

There is a strange kind of disorientation that comes when professional knowledge meets personal heartbreak. I understood the stages of dying, the neurobiology of fear, the psychology of anticipatory grief. I could recite Kübler-Ross or Panksepp if you asked me. But none of that stops your throat from tightening when your mother takes her last breath, or eases the helplessness of watching a monitor silently announce that this will be her last moment on Earth.

I was not just a psychologist in that room. I was a son — and a grieving one.

Hospice, Monitors, and Meaning

My mother had been declining for some time. Her entry into hospice wasn’t a surprise — it was an act of mercy. But as she lay in that bed, in a clean, quiet room with a monitor humming beside her, I found myself slipping out of the psychological framework I had taught so many others to rely on. I was caught in the same trap I’ve seen in patients a hundred times — flooded with uncertainty, scanning for control.

The monitor next to her bed tracked everything — her heart rate, oxygen levels, blood pressure, respiratory rate. But I couldn’t read it with confidence. A red alert flashed: “SpO₂ Desat.” Numbers dropped: Heart rate: 83. MAP: 59. Was she suffering? Was she aware? Was she dying now?

Despite everything I knew about palliative care, the sight of those descending numbers made my stomach tighten. I kept thinking: Am I missing something? Should I ask the nurse? Should I just hold her hand and be still?

And so, I turned — unexpectedly — to ChatGPT.

Seeking Understanding in a Moment of Crisis

I snapped a photo of the monitor and uploaded it to the app. I asked: “Can you help me interpret this?” I needed clarity: not clinical jargon, but human-centered insight. The response I received surprised me: it wasn’t sterile, and it wasn’t robotic. It was grounded, gentle, and informative. It walked me through the meaning of each number, the normalcy of low blood pressure and oxygen levels in hospice, and the purpose of her morphine drip — not to hasten death, but to ease discomfort and breathlessness.

I wasn’t just asking about numbers. I was asking if my mother was in pain. If I needed to do something. If I was doing enough.

The answer I got didn’t come from a physician, a nurse, or a family member — it came from a conversational AI trained on vast bodies of medical and psychological literature. And somehow, it felt…human.

Ironically, research is now starting to confirm what I felt intuitively in that moment: studies have found that patients sometimes rate AI-generated medical responses as more empathetic than those delivered by human clinicians. It’s not because AI has feelings — it doesn’t. But it can offer attentiveness without fatigue, patience without hurry, and knowledge without judgment. In that moment, that was exactly what I needed.

The Psychologist Becomes the Patient

As a psychologist, I teach others how to tolerate ambiguity, how to sit with loss, how to allow grief to be what it is: messy, nonlinear, sacred. But none of us are immune to the basic human need to know. Watching my mother’s vital signs drop, watching her breathing slow, I realized that information can sometimes be the bridge between despair and peace.

Not control — understanding.

I wasn’t trying to keep her alive. I knew that wasn’t the goal. What I needed was the reassurance that she wasn’t suffering, and that I wasn’t failing her in her final hour.

This is where AI surprised me again. As her breathing grew shallow, I typed: “Is this Cheyne-Stokes breathing?” The answer came swiftly: “Yes, this is common in the final hours of life. It’s not usually distressing for the patient, especially under comfort medications like morphine.” That was all I needed to keep holding her hand, to breathe with her, to sit in silence without terror.

In a sense, I was using AI as a co-regulator — a psychological anchor when my emotional brain wanted to panic. It didn’t fix anything. It just kept me present.

Planning in the Shadow of Grief

Even before she died, I found myself asking ChatGPT what I knew I would need to handle after she passed: What happens next? What do I need to do immediately after she dies? Who do I call first? What documents will I need over the next few weeks?

The responses were practical, thoughtful, and organized. Death certificate logistics, Social Security notifications, insurance contacts, will execution — all spelled out gently but clearly.

It was a strange moment of duality: grieving son and estate executor. Emotional caregiver and logistical planner. The psychologist in me saw how much that preemptive clarity helped reduce my anticipatory anxiety. I didn’t have to rely on fragmented advice or late-night Google spirals. I could ask questions at 2:00 AM and get a full, coherent answer in seconds — without burdening the already-overstretched hospice team.

This wasn’t just helpful — it was profoundly grounding. It allowed me to return to being her son in those final hours, rather than becoming the administrator of her life before her body was even cold.

AI, Empathy, and the Future of Care

There is a legitimate concern that artificial intelligence might dehumanize care. That we’ll outsource the emotional labor of medicine to machines. As a psychologist, I understand that fear. I’ve spent my career trying to protect the sacred space between clinician and patient.

But what I experienced wasn’t dehumanization. It was augmentation. The AI didn’t replace the nurses or the chaplain or the hospice social worker. It supplemented them — in the quiet hours when no one was there to explain, when I was too afraid to ask, or too emotionally spent to form the right question.

In those moments, ChatGPT didn’t give me answers as a cold algorithm. It offered clarity with care. And that’s a powerful psychological intervention in itself.

Empathy doesn’t always require a beating heart. It sometimes just requires being heard — and having your fear answered with something more than silence.

The Final Moments

My mother died peacefully. Her heart slowed, her oxygen dipped, her breathing softened until it stopped. There were no alarms. No urgent voices. No heroic measures.

Just a final breath, and then stillness.

I knew what to expect because I had asked. I had prepared. And because of that, I could be fully present — holding her hand, whispering I loved her, watching her leave this world with dignity.

Grief, Tech, and Humanity

Grief is timeless. But the tools we use to navigate it are evolving.

My mother’s death was ancient in its simplicity — a loved one slipping away, surrounded by family, love, and quiet. But it was also deeply modern. I had an AI assistant in my pocket, helping me decode machines, organize next steps, manage psychological distress, and stay grounded in my purpose.

As a psychologist, I’ve come to believe that the intersection of technology and mental health doesn’t have to be cold or clinical. It can be profoundly human. AI won’t replace the therapist’s chair. But it might keep someone calm at 3:00 AM. It might help someone stay with their dying parent instead of fleeing the room in fear. It might, as it did for me, hold their hand when no one else could.

A Final Word

Today, I watched my mother die.

And though I’ve guided others through loss a thousand times, I was still just a son saying goodbye to the woman who raised me.

In that room, I had my training, my love, and — surprisingly — an AI companion who helped me stay grounded, informed, and connected when it mattered most.

This isn’t a story about technology. It’s a story about loss, and how we survive it — one breath, one question, one answer at a time.