This weekend I was at a dinner party and the discussion, as it usually does when you get a bunch of foodies in a room, turned to cooking. Mostly to how bundt cakes are delicious but a pain in the butt because they take forever to make. The eggs and butter have to be room temperature. The cake has to cool before it can be glazed. One of the guests admitted it once took her seven hours to make a single bundt cake. But despite the bundt cake being so high maintenance, we agreed that if you follow the recipe, the cake comes out tasty, even if it's seven hours later.
On the other hand, if you don't follow the directions, which I have been known to do on occasion, you end up with something inedible, like the time I forgot to put eggs in the packaged brownie mix. Or take my sister: the first time I visited her after she moved to Colorado, she was lamenting that her rice never turned out. We're talking Rice-A-Roni, which is pretty foolproof. I turned over the package and asked what altitude we were at. The reason the rice never turned out? As a transplant from the Midwest, she didn't realize there were different cooking directions for rice at high altitude.
I'm a big fan of cookbooks and following recipe instructions to ensure quality outcomes, aka brownies you can eat rather than build a house out of. Yet not everyone is so fond of step-by-step guides, and many of them are healthcare professionals. I'm sure you've heard the term "cookbook medicine" used to describe the use of clinical guidelines or algorithms in clinical decision making. It's not usually a complimentary term. Those who call guidelines and algorithms cookbook medicine feel they take away professional authority, critical thinking skills, and the ability to make decisions when a variance occurs.
I, on the other hand, have recently developed a deep respect—dare I say love—for what algorithms can do. I’ve transitioned to (another) new job that involves telephonic triage nursing. What makes this job so amazing to me, besides the incredible corporate culture, friendly co-workers and the fascinating calls we get, are the algorithms.
I do not feel like I'm working from a cookbook. I choose to look at the algorithms as a map. Just to be clear, maps were these pieces of paper that told us what roads to take before GPS was in practically every car or smartphone. With a map you had a pretty good idea of where you wanted to go, say out West if you live in Chicago. However, the maps I was raised with didn't tell you about road closures or Starbucks locations. If you saw a Road Closed sign, you pulled into a gas station to ask for an alternative route. My point is that even when using algorithms, you have to ask; you can't assume.
Your patient has a headache. You still have to ask: How long has the headache been present? Did anything, like a falling Starbucks sign, hit them in the head to start it? What does the headache feel like? You have to use your clinical training to ask the right questions so you can get to the right algorithm.
Before you can put a good algorithm in place, you need to understand your workflow and process. If your staff can't articulate the workflow and procedures they follow to provide care, no algorithm is going to help you. You have to sit down with all the parties involved in patient care and pick apart the process. Only then can you begin to build an algorithm.
Algorithms also need to be evidence-based. The builders of the algorithms need to scour the literature for evidence-based practice and patient outcomes (this would be a good time to enlist your friendly hospital medical librarian or cybrarian). If the literature says it's a bad idea to put butter on a burn, include that in the algorithm (e.g., "Has patient self-treated the burn?"). The caller answers yes or no. If the answer is yes, you use your clinical judgment and ask, "With what?" When they answer "butter," you tell them to wash it off and then explain the proper way to care for a burn.
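The burn example above is really a tiny decision tree. Here's a minimal sketch of that one branch in Python; the question wording, advice strings, and function name are illustrative assumptions, not a real clinical protocol or any vendor's triage software.

```python
# A sketch of the "Has patient self-treated the burn?" branch.
# All text here is illustrative, not clinical guidance.

def burn_branch(self_treated, treatment=None):
    """Return triage advice for the self-treated-burn node.

    self_treated: did the caller treat the burn themselves? (bool)
    treatment:    what they used, asked as a follow-up (str or None)
    """
    if not self_treated:
        # No home treatment: go straight to standard instructions.
        return "Proceed to standard burn-care instructions."
    # Clinical-judgment step: ask *with what* they treated it.
    if treatment and treatment.lower() == "butter":
        return ("Wash the butter off under cool running water, "
                "then follow standard burn-care instructions.")
    return ("Assess the home treatment, then follow standard "
            "burn-care instructions.")
```

The point of the sketch is that the algorithm doesn't replace the clinician: the branch explicitly pauses for a judgment call ("With what?") before it can continue.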
For algorithms to be successful, the writers of the pathways need to be committed to continually reviewing the evidence and updating the algorithms as needed. They must train staff on changes and give solid explanations for them. They must also commit to performing quality assurance on the algorithms. Data is a powerful thing: it may show that an algorithm is contributing to successful outcomes, or it may show that it's having detrimental effects on them. Those in charge of the algorithms have to investigate why and update the algorithm to improve the outcomes.
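That quality-assurance loop can itself be automated in a crude way. Below is a hedged sketch, assuming outcome data is recorded per algorithm as a list of success/failure flags; the 90% review threshold and the data shape are made-up assumptions for illustration, not a published standard.

```python
# Illustrative QA check: flag algorithms whose success rate has
# fallen below a review threshold. Threshold and data are assumptions.

def flag_for_review(outcomes, threshold=0.90):
    """outcomes: dict mapping algorithm name -> list of True/False
    outcome flags. Returns names whose success rate < threshold."""
    flagged = []
    for name, results in outcomes.items():
        if results and sum(results) / len(results) < threshold:
            flagged.append(name)
    return flagged
```

A flagged algorithm isn't automatically "bad"; it's the signal for the humans in charge to investigate why and decide whether the pathway needs updating.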
After using algorithms for the past few weeks, I firmly believe they let healthcare providers focus more on patients, not less. Something is always there to remind you to ask a question and to suggest the appropriate action to take. You don't have to remember all the little nuances of policies and procedures because they can be built into the algorithm. And for the small percentage of folks who don't fit into an algorithm, like the 92-year-old who takes an Ambien and, instead of falling asleep, starts doing backflips down the hall, there should always be the option for a clinician to override the algorithm.
If you’re skeptical about the validity of algorithms just remember the inventor of the Internet used one…the Al Gore Rhythm. (rim shot here)
Jennifer Thew, RN, MSJ