If you want to understand the complexities and pitfalls of automating medicine (and professional work in general), please read Bob Wachter’s story, adapted from his new book The Digital Doctor, of how Pablo Garcia, a 16-year-old patient at the University of California’s San Francisco Medical Center, came to be given a dose of 38 ½ antibiotic pills rather than the single pill he should have been given. (Part 1, part 2, part 3; part 4 will appear tomorrow.) Pretty much every problem with computer automation that I write about in The Glass Cage — automation complacency, automation bias, alert fatigue, overcomplexity, distraction, miscommunication, workload spikes, etc. — is on display in the chain of events that Wachter, himself a physician, describes.
It’s a complicated story, with many players and many moving parts, but I’ll just highlight one crucial episode. After the erroneous drug order enters the hospital’s computerized prescription system, the result of (among other things) a poorly designed software template, the order is transmitted to the hospital’s pill-packaging robot. Whereas a pharmacist or a pharmacy technician would almost certainly have noticed that something was amiss with the order, the robot dutifully packages up the 38 ½ pills as a single dose without a second’s hesitation:
The robot, installed in 2010 at a cost of $7 million, is programmed to pull medications off stocked shelves; to insert the pills into shrink-wrapped, bar-coded packages; to bind these packages together with little plastic rings; and then to send them by van to locked cabinets on the patient floors. “It gives us the first important step in eliminating the potential for human error,” said UCSF Medical Center CEO Mark Laret when the robot was introduced.
Like most robots, UCSF’s can work around the clock, never needing a break and never succumbing to a distraction.
In the blink of an eye, the order for Pablo Garcia’s Septra tablets zipped from the hospital’s computer to the robot, which dutifully collected the 38 ½ Septra tablets, placed them on a half-dozen rings, and sent them to Pablo’s floor, where they came to rest in a small bin waiting for the nurse to administer them at the appointed time. “If the order goes to the robot, the techs just sort it by location and put it in a bin, and that’s it,” [hospital pharmacist] Chan told me. “They eliminated the step of the pharmacist checking on the robot, because the idea is you’re paying so much money because it’s so accurate.”
Far from eliminating human error, the replacement of an experienced professional with a robot ensured that a major error went unnoticed. Indeed, by giving the mistaken dose the imprimatur of a computer, in the form of an official, sealed, bar-coded package, the robot pretty much guaranteed that the dispensing nurse, falling victim to automation bias, would reject her own doubts and give the child all the pills.
The problems with handwritten prescriptions — it’s all too easy to misinterpret doctors’ scribbles, sometimes to fatal effect — are legendary. But solving that very real problem with layers of computers, software templates, and robots introduces a whole new set of problems, most of which are never foreseen by the system’s designers. As is often the case in automating complex processes, the computers and their human partners end up working at cross-purposes, each operating under a different set of assumptions. Wachter explains:
As Pablo Garcia’s case illustrates, many of the new holes in the Swiss cheese weren’t caused by the computer doing something wrong, per se. They were caused by the complex, and under-appreciated, challenges that can arise when real humans — busy, stressed humans with all of our cognitive biases — come up against new technologies that alter the work in subtle ways that can create new hazards.
The lesson isn’t that computers and robots don’t have an important role to play in medicine. The lesson is that automated systems are also human systems. They work best when designed with a painstaking attentiveness to the skills and foibles of human beings. When people, particularly skilled, experienced professionals, are pushed to the sidelines, in the blind pursuit of efficiency, bad things happen.
Pablo Garcia survived the overdose, though not without a struggle.
Mr. Carr, just had a blast watching you from The Old South Church on C-Span…first time I’ve seen you and the other 2 gentlemen.
I really loved the panel.
Did, or does, it pop into your mind that we don’t ALL have to have the same motive for our ‘Commercial Flight’, for example? Those of us who just want the ‘efficiency’ of the non-piloted trip – as the gentleman on your far right wants – can have it. And fellas like you and me can have the more ‘personal’/’natural’ flight with a human pilot, right?
It seems to me discussions like this great one y’all just had are always done with the backdrop, and argument about, what is ‘right’. I would sure think you’d agree that we can each decide for ourselves what is ‘right’…especially having to do with the technology question y’all discussed so well on C-Span this evening. I’ve grown to simply love ignoring technologies that, indeed, make me, let’s say, ‘less human’ in my own mind. I’d think you’ve probably already written something like this outlook, eh? Thanks, again, for your great thoughts – and cogent expression of them.
How many times have humans made a similar mistake? How do you think individual pills, tablets, and capsules are manufactured? Hint: it is not normally a manual process.
Doctors write the prescriptions. What does a pharmacist do? Take pills from a big bottle and put them in a smaller bottle. Drug interactions and patient history are in a database. The pharmacist doesn’t add any value aside from charging ridiculous things like dispensing fees.
This seems like a good function to automate.
I can see getting my prescription from my physician and picking it up from a machine immediately on the way out of the building. Bring it on!
(Who paid you to write this article? A pharmacist’s association?)
CraigS,
I’m curious as to how often you deal with the Medical Industry, as regards your health. I’m commenting as a cancer patient, fairly recently diagnosed with one of the deadliest cancers among women, who has been utterly traumatized by the stunning downsizing of KNOWLEDGEABLE (i.e., non-1-800/Call Center!) contactable human beings at all key points in the process. I’m tempted to say you either don’t have a clue what you are talking about, or you are one of the very, very few who make enough money (and specifically one of the Elite Few who are actually able to contact someone with Experience, Knowledge, and The Ability To Correct Blatant Techno Utopia Errors Before They Become Deadly) to not have to worry about it when things go awry.
Software and Hardware couldn’t give a rat’s ass as to harm done. Not only that, Software and Hardware are currently being created predominantly for obscene profit (NOT FOR THE BETTERMENT OF HUMANS): created by human beings whose often stunningly self-serving ideologies cannot help but be imparted into those softwares and hardwares; created by human beings who are fully aware they are obsolescing one-on-one human contact (and livings, in a world where one must “make” money in order to live) via their techno utopia ‘product.’ On the other hand, humans – whose vocation is one-on-one contact with their fellow humans – mostly do care if they do harm.
(Not to even mention the fact that I ‘read’ you as hideously telling thousands (or is it millions?) of Pharmacists that they have now been obsolesced, so they better go look for a retail clerk/server job at minimum wage, and the coziest spot they can find under a bridge, or behind some vast elite shrubbery, to be homeless.)
Contrary to what you imply: counting and distributing pills is not rocket science. Paying a human to do a menial job like pill distribution does not mean you will get quality. The chance of a human beating a machine over time is almost nil. The only reason there is a story here: a few errors were reported across a very small sample. Wanna find the number of errors made daily by humans doing the same repetitive job? Good luck.
Give me a choice. I’ll take a monitored machine any time over a human with almost zero oversight, counting pills day in, day out.
http://www.cbc.ca/news/health/pharmacy-errors-how-often-do-they-happen-nobody-knows-1.2920158
Craig,
You might want to actually read Wachter’s article and my post before jumping to silly conclusions. No one’s suggesting that the automated systems be unplugged. What’s under discussion is how these systems are designed and how roles and responsibilities are divided between people and software. As Wachter’s piece makes clear, the systems and related processes need to be designed with great care to minimize errors.
In complicated processes like hospital medication prescribing and delivery, the question isn’t a simple either/or: either a machine does it or a human does it. The processes always involve both machines and humans, and the question is, how do we get the best out of both?
Nick
I read through the article. At first, I thought the nurse who administered the 39 pills was primarily to blame, followed very closely by the admitting physician. On second thought, I would lay more blame on the admitting physician. Unlike the nurse, the doctor who placed the order ought to have had a much better concept of what a proper dosage was… and didn’t actually read the printout. However, I would lay the brunt of the blame on the software designers.
MYCIN was created about *forty* years ago, so I figured domain experts had long understood the problem of dosing antibiotics according to weight. … Yes, the UI could have been better designed. However, I feel the software designers made the near-fatal error of failing to add a simple sanity check. (After all, I’ll assume it was the computer system that would flag the 5% problem?) Regardless of the units entered, the requested dosage ought to be checked against what is acceptable for a person of Pablo’s size. If the dosage wasn’t acceptable, an ominous dialogue box should have appeared, saying something like the following:
WARNING!
Recommended dose for a 38.5 kg person: *1* tablet
reason: int((5 mg/kg * 38.5 kg) / 160 mg per tablet) = int(1.2) = 1
note: one tablet = 32 kg * 5 mg/kg = 160 mg
-------------
Requested dosage: *39* tablets: THIS IS PROBABLY LETHAL!!!
Do you really mean this?
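In code, the check could be as simple as the sketch below (Python; the constants, the names, and the 2x safety margin are my own illustrative assumptions, not anything from UCSF’s actual system):

# Sketch of a weight-based dose sanity check. All constants are
# illustrative; a real system would pull them from a formulary database.
MG_PER_TABLET = 160      # trimethoprim in one single-strength Septra tablet
MG_PER_KG_DOSE = 5       # typical weight-based dose per administration

def expected_tablets(weight_kg):
    """Whole tablets expected for one weight-based dose."""
    return max(1, round(MG_PER_KG_DOSE * weight_kg / MG_PER_TABLET))

def check_order(weight_kg, ordered_tablets):
    """Warn when an order far exceeds the weight-based expectation."""
    expected = expected_tablets(weight_kg)
    if ordered_tablets > 2 * expected:   # 2x margin, arbitrary for this sketch
        return ("WARNING: %s tablets ordered, but a %s kg patient should "
                "get about %s. Do you really mean this?"
                % (ordered_tablets, weight_kg, expected))
    return "OK"

print(check_order(38.5, 39))   # Pablo's order: triggers the warning

A real implementation would look up the dose limits per drug rather than hard-coding them, but even this crude check would have stopped a 39-tablet order cold.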
As for the constant wave of alerts and notifications: I am new to the field of home automation, and it is only recently that I learned about the field of alarm management that comes out of industrial process control.
Cheers,
Andrew
Oops, sent off that last message too quickly. I would round the number to the closest integer…. Still, an important lesson from the Therac-25 radiation therapy accidents is that it is important to give meaningful messages to the operator. It also helps to do what the Japanese manufacturers call poka-yoke: mistake-proofing a system.