Experienced Nurses Can Be Disciplined If They Use Hunches from Clinical Observations to Override AI Protocols

(p. A1) Melissa Beebe, an oncology nurse, relies on her observation skills to make life-or-death decisions. A sleepy patient with dilated pupils could have had a hemorrhagic stroke. An elderly patient with foul-smelling breath could have an abdominal obstruction.

So when an alert said her patient in the oncology unit of UC Davis Medical Center had sepsis, she was sure it was wrong. “I’ve been working with cancer patients for 15 years so I know a septic patient when I see one,” she said. “I knew this patient wasn’t septic.”

The alert correlates an elevated white blood cell count with septic infection. It didn’t take into account that this particular patient had leukemia, which can cause similarly elevated counts. The algorithm, which is based on artificial intelligence, triggers the alert when it detects patterns that match those of previous patients with sepsis. The algorithm didn’t explain (p. A9) its decision.
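The article does not publish the model’s actual logic, but the flaw it describes is easy to illustrate. Below is a minimal, hypothetical Python sketch (the `sepsis_alert` function, the reference range, and the example count are my assumptions, not UC Davis’s system) of how a rule keyed to white blood cell count alone would flag a leukemia patient as septic.

```python
# Hypothetical sketch of a rule-based sepsis screen (not UC Davis's actual
# algorithm): flag any patient whose white blood cell count falls outside a
# normal reference range, without checking for conditions such as leukemia
# that can produce similar counts.

NORMAL_WBC_RANGE = (4_000, 11_000)  # cells per microliter, typical adult reference range

def sepsis_alert(wbc_count: float, diagnoses: list[str]) -> bool:
    """Return True if the patient should be flagged for the sepsis protocol."""
    low, high = NORMAL_WBC_RANGE
    abnormal_wbc = wbc_count < low or wbc_count > high
    # The flaw the article describes: the rule never consults the patient's
    # diagnoses, so a leukemia patient whose count is high because of the
    # cancer itself is flagged exactly like a septic patient.
    return abnormal_wbc

# A leukemia patient with an elevated count triggers the same alert as a
# genuinely septic patient -- the false positive the nurse could see but
# the rule could not.
print(sepsis_alert(45_000, ["acute myeloid leukemia"]))  # True
```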

Hospital rules require nurses to follow protocols when a patient is flagged for sepsis. While Beebe can override the AI model if she gets doctor approval, she said she faces disciplinary action if she’s wrong. So she followed orders and drew blood from the patient, even though that could expose him to infection and run up his bill. “When an algorithm says, ‘Your patient looks septic,’ I can’t know why. I just have to do it,” said Beebe, who is a representative of the California Nurses Association union at the hospital.

As she suspected, the algorithm was wrong. “I’m not demonizing technology,” she said. “But I feel moral distress when I know the right thing to do and I can’t do it.”

. . .

In a survey of 1,042 registered nurses published this month by National Nurses United, a union, 24% of respondents said they had been prompted by a clinical algorithm to make choices they believed “were not in the best interest of patients based on their clinical judgment and scope of practice” about issues such as patient care and staffing. Of those, 17% said they were permitted to override the decision, while 31% weren’t allowed and 34% said they needed a doctor’s or supervisor’s permission.

. . .

Jeff Breslin, a registered nurse at Sparrow Hospital in Lansing, Mich., has been working at the Level 1 trauma center since 1995. He helps train new nurses and students on what signs to look for to assess and treat a critically ill or severely injured patient quickly.

“You get to a point in the profession where you can walk into a patient’s room, look at them and know this patient is in trouble,” he said. While their vital signs might be normal, “there are thousands of things we need to take into account,” he said. “Does he exhibit signs of confusion, difficulty breathing, a feeling of impending doom, or that something isn’t right?”

. . .

Nurses often describe their ability to sense a patient’s deterioration in emotional terms. “Nurses call it a ‘hunch,’ ” said Cato, the University of Pennsylvania professor who is also a data scientist and former nurse. “It’s something that causes them to increase surveillance of the patient.”

. . .

At UC Davis earlier this spring, Beebe, the oncology nurse, was treating a patient suffering from myeloid leukemia, a cancer of the blood and bone marrow. The condition fills the bones with cancer cells (“they’re almost swelling with cancer,” she said), causing excruciating pain. Seeing the patient wince, Beebe called his doctor to lobby for a stronger, longer-lasting painkiller. The doctor agreed and prescribed one, which was scheduled to begin five hours later.

To bridge the gap, Beebe wanted to give the patient oxycodone. “I tell them, ‘Anytime you’re in pain, don’t keep quiet. I want to know.’ There’s a trust that builds,” she said.

When she started in oncology, nurses could give patients pain medication at their discretion, based on patient symptoms, within a doctor’s parameters. They gave up that authority a few years ago, when the hospital changed its policies and adopted a tool that automates medication administration with bar-code scanners.

In its statement, UC Davis said the medication tool exists as a second-check to help prevent human error. “Any nurse who doesn’t believe they are acting in the patient’s best interests…has an ethical and professional obligation to escalate those concerns immediately,” the hospital said.

Before giving the oxycodone, Beebe scanned the bar code. The system denied permission, adhering to the doctor’s earlier instructions to begin the longer-acting pain meds five hours later. “The computer doesn’t know the patient is in out-of-control pain,” she said.
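The article describes only the outcome of the scan, not the system’s rules. The following is a hypothetical sketch (the `may_administer` function and the times are assumptions, not the hospital’s actual bar-code software) of a scheduled-order check that denies a scanned dose because the active order has not yet started, regardless of the patient’s reported pain.

```python
# Hypothetical sketch of a scheduled-order check behind a bar-code
# medication scanner: approve a scanned dose only once the ordered start
# time has passed; the patient's current pain level is never an input.
from datetime import datetime, timedelta

def may_administer(order_start: datetime, now: datetime) -> bool:
    """Approve the scanned dose only if the order's start time has arrived."""
    return now >= order_start

now = datetime(2023, 4, 1, 9, 0)
order_start = now + timedelta(hours=5)  # longer-acting drug ordered to begin five hours later
print(may_administer(order_start, now))  # False: the system denies the dose
```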

Still, she didn’t act. “I know if I give the medication, I’m technically giving medication without an order and I can be disciplined,” she said. She watched her patient grimace in pain while she held the pain pill in her hand.

For the full story, see:

Lisa Bannon. “Nurses Clash With AI Over Patient Care.” The Wall Street Journal (Friday, June 16, 2023): A1 & A9.

(Note: ellipses added.)

(Note: the online version of the story has the date June 15, 2023, and has the title “When AI Overrules the Nurses Caring for You.”)
