Systems Thinking

Publication: Critical Care Nurse


Recently I caught myself, after preparing a bowl of raisin bran, about to put the milk in the cupboard and the cereal box in the refrigerator. Another time, I remember my mother trying to unlock a car that was not hers in a busy parking lot. Researcher and author Don Norman calls these types of errors “slips.” A slip happens when a person intends to do one action and actually does something else. Many times, the right action (putting the milk away) is performed on the wrong object (the cereal box). In health care, a similar example would be a nurse hanging a bag of heparin when intending to hang a bag of norepinephrine. Health care slips may have more significant consequences than spoiled milk.

When slips happen, does it mean that we are not intelligent, competent, or responsible? Not at all, according to Norman. He states, “Humans err continually; it is an intrinsic part of our nature.”1(p66) People are not machines; slips are a part of normal human behavior. In fact, slips tend to happen more often to experts than to novices. With experience, the subconscious may perform routine tasks automatically, whereas a novice needs to pay significant attention because each task is new.

Part of the solution is to design systems that prevent, catch, or correct these natural human slips before they become errors. Recognizing that it is easy for a nurse to pick up the wrong medication when infusion bags look alike, many institutions have implemented barcode scanning to catch slips. Another ingenious example of a system safeguard is an enteral feeding connector (ENFit, GEDSA) that creates a physical barrier to intravenous administration of enteral formulas. A low-technology example of safer system design is an interruption-free zone for medication preparation. Let us use the power of systems thinking to make health care safer for all.
