Contrary to what you might have guessed, the “Defeat of the Situation Awareness Demons” is not a new video game for Xbox or PlayStation. It is a set of eight factors that undermine effective Situation Awareness, and it can be applied to operators in process plants to characterize human error in responding to alarms.

But first, a bit about Situation Awareness. The concept originates from the study of human factors in the airline industry (how pilots respond to flight emergencies and make sense of all the gauges, knobs, and switches in the cockpit). Situation Awareness (SA) can be defined as “being aware of what is happening around you, and understanding what that information means to you now and in the future.”[1] So it is relevant not only to airline pilots but also to process operators (who can be thought of as the “pilots of the process”). Situation Awareness drives effective decision making and performance; SA Demons erode both.

If the SA Demons described below are present in your control room, then you might be in store for some failures in how operators respond to alarms:

  • Attentional Tunneling – Focusing on one area or issue to the extent that alarms from another area or issue are excluded.
  • Requisite Memory Trap – Over-dependence on operator memory for effective response to alarms, perhaps under different operating conditions.
  • Workload, Anxiety, Fatigue, and Other Stressors (WAFOS) – Physical or psychological stress that can degrade performance.
  • Data Overload – Over-dependence on the operator to make sense of excessive amounts of alarm information.
  • Misplaced Salience – Incorrect alarm priority, or HMI representation that misrepresents the importance of alarms and other status information.
  • Complexity Creep – Complex response requirements or HMI features that make responding more complicated.
  • Errant Mental Models – A thought process that incorrectly interprets alarms or mistakenly discounts relevant ones.
  • Out-of-the-Loop Syndrome – Process or response automation that leaves the operator unsure what to do when the automation fails. [2]
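Of these demons, Data Overload is the easiest to measure directly from alarm system records. As a minimal sketch, the snippet below flags 10-minute windows whose alarm count exceeds a flood threshold; the threshold of 10 alarms per 10 minutes reflects commonly cited alarm-management guidance (e.g., ISA-18.2 and EEMUA 191), but the function name and structure here are purely illustrative, not taken from any particular tool.

```python
from collections import Counter
from datetime import datetime, timedelta

# Commonly cited guidance (ISA-18.2 / EEMUA 191): roughly 1 alarm per
# 10 minutes is manageable on average, and more than 10 alarms in a
# 10-minute window is often treated as an alarm flood.
FLOOD_THRESHOLD = 10  # alarms per 10-minute window

def flood_windows(alarm_times):
    """Return the 10-minute windows whose alarm count exceeds the flood threshold.

    alarm_times: iterable of datetime objects, one per alarm annunciation.
    """
    counts = Counter()
    for t in alarm_times:
        # Bucket each alarm into the 10-minute window it falls in.
        bucket = t.replace(minute=(t.minute // 10) * 10, second=0, microsecond=0)
        counts[bucket] += 1
    return {window: n for window, n in counts.items() if n > FLOOD_THRESHOLD}

# Example: a burst of 12 alarms within one minute constitutes a flood.
base = datetime(2015, 6, 1, 8, 3)
burst = [base + timedelta(seconds=5 * i) for i in range(12)]
print(flood_windows(burst))  # one window (08:00–08:10) with 12 alarms
```

A report like this, run against alarm history, gives a concrete starting point for deciding whether Data Overload is present in a given control room.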


1. Endsley, M., “Designing for Situation Awareness.”

2. Dunn, D., Sands, N. and Stauffer, T., “When Good Alarms Go Bad: Learnings from Incidents,” Texas A&M Instrumentation Symposium (2015).

Tagged as: Todd Stauffer, Alarm Management
