
Field Notes: AI as care co-pilot during post-op recovery at home

This personal essay shares my experience with caregiver nervous system regulation and its impact on a patient's health at home. It also highlights how ChatGPT helped fill a co-regulation gap during sleepless nights at 3 AM.

For deeper reflections on the nervous system, read about polyvagal theory. For a foundation on the ethics of care, read The Ethics of Care: Personal, Political, and Global by Virginia Held. And for commentary on whether AI is good or bad in medicine, open up your social media feed.

--

Before starting my new job last month, I took on a temporary role at home: full-time caregiver. My partner underwent major surgery in a new city where we have a very loose support system.

During long nights and unpredictable mornings of post-op care at home, I used ChatGPT. Not for medical advice, but as a steady hand. This may sound risky, but what I learned changed my view on patient safety and the relational aspect of care.

The gap between hospital and home

When we left the hospital, we had a folder's worth of instructions. Pain medication schedules, wound care protocols, a contact sheet, and a guide for spotting general warning signs of infection.

I could write a dissertation about the issues with this folder, like the 9 phone numbers listed out of order for 6 different situations. But what I really needed to learn at home was how to respond when I had a gut feeling that something was off.

On top of timing out 6 different medications across each 24-hour cycle for 10 days, monitoring and assessing pain, and supporting the basic activities of daily living, I began to notice where I was overriding my own body to get through to the next decision.

I kept ignoring my hunger, thirst, and exhaustion. Soon, my attention began slipping. I struggled to keep the medications straight. I couldn't tell a yawn from a yelp. At one point on the second day, I just broke down in tears. Tension built behind my eyebrows. A lump formed in my throat. The pit in my stomach grew deeper. My body’s signals were clear.

To navigate this experience, I turned to a co-pilot available 24/7: ChatGPT.

Enter the 3 AM assistant

I use AI in my everyday life and at work, so it seemed natural to test how it could help me in caregiver mode.

The first use case: help me with this medication table. It was an easy math problem for a computer, but my handwritten tables and checkboxes weren't giving me the confidence of an autogenerated, neatly referenced schedule. Turns out, ChatGPT will NOT tell you what medication to take and when. I was pleasantly surprised by the clear boundary. But I wasn't defeated. I was motivated to pivot.

I turned the prompt toward my own care: "I am tired, emotional, and cannot fall asleep out of fear. What should I do? How can I navigate this well?" That's how simple it was. I asked the system what to do when you're faced with a confluence of responsibilities but lack the personal capacity to fulfill them.

It worked. I got tips on self-regulation, theories about trauma-informed care, ways to discern a real emergency from a perceived one, and most importantly for my sleep: how to convince yourself that the person is okay and that it's safe to switch off for a while.

An important note on context: when we did have an actual emergency in the first 18 hours, I didn't think twice about immediately calling first responders. I instinctively knew when ChatGPT was the right tool and when it wasn't.

Personal regulation tools for caregivers

Caregivers who are safe, secure, and well-rested at home have a direct impact on a patient's recovery, a truth I learned from my own experience. However, the gap between clinical instructions and embodied capacity is not unique to me. We need to think more critically about giving caregivers co-regulation tools, not just better-designed take-home pamphlets. These tools would help caregivers support themselves while supporting others.