Decision Theory for the Fire Service

The blind leading the blind

By Marc Coenen

Photos by author except where noted

In the early hours of Sunday, August 11, 2019, a fire erupted in an abandoned supermarket in Beringen, Belgium. Two firefighters lost their lives and a third was severely burned. That fateful night, I was the fire officer on duty. For me, a long search for answers began.

Incident reports combined with the writings of philosophers of science, neurologists, and psychologists provide an interesting insight into the decision-making process in the run-up to incidents with a fatal outcome. The starting point is the way our mind filters out sensory cues for further processing and the unconscious mechanisms that determine the cognitive framework or mental model wherein all those bits of information will be interpreted. When you intuitively end up in a wrong cognitive framework, the result is a false hypothesis. If that hypothesis deviates too much from what is really going on at the incident ground, tragedy looms.


Anatomy of a Fatal Fire: What Have I Done Wrong?

Marc Coenen offers profound insights into the conscious and unconscious processes that shape our actions on the fireground.


Modern life is played out in dynamic environments bursting with stimuli that scream for attention. Our brain simply cannot process the huge volumes of information we are bombarded with all day long. The sensory cues we take in from the world around us must be sifted. All the sensations that enter the sensory memory through the sense organs are filtered for the first time based on attention. Only those things to which you devote attention gain access to the working memory, or short-term memory, where they are processed into perceptions.

A person, or more correctly a person’s brain, does this automatically and instantaneously. For the mind is the place where past, present, and future meet. Your handling of current situations is affected by experiences from the recent past and by predictions or expectations about the near future. Acquired knowledge not only interferes with your perception of the situation you are in but is also integrated into your judgment about how the situation will develop in the near term. On the basis of experience and expectation, you separate important from unimportant information automatically and effortlessly.

The working memory’s limit on how much information it can handle at any given time is the next filter: 60 bits per second, equal to the “magic number” of seven letters or numbers, plus or minus two. That is all. How much of this processing capacity is already in use for other conscious thought processes at a given moment limits the amount of additional input you can handle. When workload demands are high, the remaining capacity is so small, or even zero when you exceed it, that nothing can enter the mind for further processing. The entrance is simply blocked. Taking in information and processing it demands a lot from our brains. We do not go through all that trouble simply to ignore that input afterward.

Wreckage on the runway of Los Rodeos after the Tenerife airport disaster of March 27, 1977. (Public domain photo)

Sunday, March 27, 1977. Los Rodeos

The interplay of attention and workload on the processing capacity of our brain is known as the filter effect. Not only on a fireground but also in day-to-day life, the filter effect can result in a false hypothesis. It can easily mislead us, letting us see and hear only what we want to see and hear, or making us disregard information to which we ought to have paid attention. This happened on Sunday, March 27, 1977, when two Boeing 747 airliners collided on a runway in the Canary Islands at takeoff. With 583 lives lost, the infamous Tenerife airport disaster remains the deadliest accident in civil aviation history. The flight data recorders and the cockpit voice recorders shed a surprising light on the causes of that accident.

The day of the collision, it was extremely busy at Los Rodeos Airport (now Tenerife North Airport). At 12:30 p.m. a bomb exploded at a flower shop in the departure lounge of Las Palmas Airport (now Gran Canaria International Airport), and the terrorist claimed to have hidden a second explosive device. Dozens of flights en route to Las Palmas Airport were diverted to Los Rodeos, as were flights KLM 4805 and Pan Am Clipper 1736.

The regional airport of Los Rodeos is not big. It possesses only a single runway to handle both arriving and departing flights. One taxiway runs parallel to the runway. Four exits connect the two, but they are not numbered or clearly marked. There was hardly enough space to park the diverted planes on the tarmac. The end of the taxiway was full, and planes also blocked exits one and two. At 3:00 p.m. the eagerly awaited message was received that Las Palmas Airport was finally to be reopened. It was then up to air traffic control to get the aircraft to the departure point at the end of the runway. Air traffic control let the planes taxi in pairs down the runway, where they were subsequently to take exit three or four and continue to the departure point along the taxiway. Two aircraft had already taken off when it was the next pair’s turn. Flight KLM 4805 was requested to taxi down the runway and make a 180-degree turn at the end. At the departure point, the pilots were to await further instructions. Three minutes later, Pan Am Clipper 1736 also began its taxi down the runway, but its pilots were instructed to leave the runway at the third exit and use the parallel taxiway for the remainder of their taxi.

Los Rodeos Airport is situated near the coast at an altitude of 631 meters. At any moment, cloud banks can drift in from the sea, and perfect visibility can suddenly turn poor. While Pan Am Clipper 1736 was taxiing down the runway, cloud banks floated onto it. Because of the poor visibility, the pilots missed the third exit and continued their taxi on the runway. The air traffic controller could not see the planes through the thick fog, nor was there visual contact between the jumbo jets. Furthermore, because the airport was not equipped with ground radar, the position of the aircraft could only be established by radio transmission. Let us take a closer look at the exchanged radio messages. For convenience, I have divided that communication into two phases.

Phase One: The Alleged Permission

The world around us influences you and me via the sense organs. We figure out what we are going to do in reaction to those sensory cues, and we start to interact with the world. Our brain simply cannot resist interpreting the words and gestures of fellow human beings so as to make sense of them. For this task, it can draw on a rich arsenal of tools: it can jump to a more suitable mental model, fill in blanks at will, interpret ambiguities as a function of our own expectations or desires, and so on. Unfortunately, the radio messages in Tenerife that day left plenty of room for interpretation. The filter effect also came into play. “The sooner we get out of here, the better.” After an unexpected delay of several hours, that was the cognitive framework of the KLM captain. His focus was on leaving the airport as soon as possible, and he interpreted all communication in that light.

As instructed by the air traffic controller, the Dutch Boeing 747 turned 180° at the end of the runway. Then the captain slightly increased the engine power to check the spin-up before departure. The first officer responded and said: “Wait a moment, we don’t have ATC clearance yet.” The captain replied: “No, I know. Go ahead and ask.” As instructed, the first officer contacted the air traffic controller via the onboard radio: “The KLM 4805 is now ready for takeoff and we are waiting our ATC clearance.” The aircraft stood still at the departure point on the runway and waited for clearance.

For takeoff, a commercial airliner must get two clearances:

  • Air traffic control (ATC) clearance
  • Takeoff clearance

Takeoff clearance is the permission to take off from the runway. The ATC clearance contains the route the aircraft must follow within controlled airspace in order to avoid a collision with other aircraft. The air traffic controller replied: “KLM eight seven zero five, you are cleared to the Papa Beacon and maintain flight level niner zero, right turn after takeoff. Proceed with heading zero four zero until intercepting the three two five radial from Las Palmas VOR.” Toward the end of this ATC clearance, the captain said: “Yes.” Apparently, he heard the air traffic controller saying something he wanted to hear. As standard procedure prescribes, the first officer read back the ATC clearance to the control tower. Readback is the technical term for repeating the message by the receiver. It allows the sender to check whether the receiver understood it correctly. But during this readback, the captain increased the throttle, disengaged the brakes, and said to his fellow crew members: “We are going.” All his actions spoke for themselves. The captain thought he had permission to depart and thus they left. Meanwhile, the first officer was still reading back the ATC clearance to the control tower. He ended that transmission with the somewhat unusual message: “We are now at takeoff” or “We are uh takin’ off.” What exactly the first officer said is not clear on the tape.

All these communications between the KLM aircraft and the control tower took only one minute and 16 seconds, but they were decisive. What could have given the captain the impression that he had permission to take off? To be absolutely clear, he was not authorized to depart. It started to go wrong when the first officer informed him that they did not have ATC clearance. With the captain’s mind set on leaving, that sentence sufficed to create the impression that ATC clearance was the only permission still needed. Then the first officer informed the air traffic controller that they were ready for takeoff and awaited ATC clearance. This radio message was nothing less than an implicit request for two permissions. It consolidated the false hypothesis that one permission sufficed for both clearances. The captain’s “Yes” on the cockpit voice recorder and the takeoff procedure he subsequently initiated make it abundantly clear that he had interpreted the ATC clearance in this manner. The use of the word “takeoff” in the ATC clearance enlarged the misconception even further. Within your cognitive framework, the filter effect lets you hear (or see) what you want to hear (or see). Here the captain wanted to hear that he was allowed to take off, so he interpreted all ambiguities in that sense.

Phase Two: Missed Ways Out

The unusual sentence “We are now at takeoff” or “We are uh takin’ off,” spoken by the first officer, was interpreted in different ways by the air traffic controller and the crew of the Pan Am Boeing. The air traffic controller replied: “Okay. . . . Stand by for takeoff. . . . I will call you.” Everything after “Okay” was barely audible in the KLM cockpit, as it was overlaid with a high-pitched squeal; I will come back to this later. In the mind of the air traffic controller, the Dutch plane was waiting on the runway. The crew of the Pan Am aircraft had another point of view: according to them, the KLM Boeing was already initiating takeoff. Immediately after the “okay” of the air traffic controller, they broke into his transmission and said: “No, uh . . . and we are still taxiing down the runway, the Clipper one seven three six.” The air traffic controller requested that they inform him when they were off the runway. The Pan Am crew replied: “Okay, we’ll report when we’re clear.” Back in the cockpit of the KLM jumbo jet, the crew could hear the messages exchanged between the American Boeing and the air traffic controller. While they were picking up speed on the runway, the flight engineer hesitantly asked his colleagues: “Is he not clear then?” The captain responded: “What do you say?” The flight engineer repeated his question: “Is he not clear then, that Pan American?” The captain and the first officer responded almost simultaneously: “Sure!” A few seconds later, the recording stops with the sound of the crash.

At a certain point, the air traffic controller and the Pan Am crew were transmitting simultaneously on the same frequency. As a result, crucial information was missed by the Dutch pilots: the “Stand by for takeoff. . . . I will call you” portion of the air traffic controller’s message. The air traffic controller started his message with “Okay,” followed by a pause of two seconds. “Okay” was his catch phrase, indicating only that he had heard the message; English was not his mother tongue. With the subsequent silence, he bought himself a few seconds to formulate an answer: “Stand by for takeoff. . . . I will call you.” In the cognitive framework of the Dutch crew, his “okay” was the confirmation that they were indeed cleared for takeoff. During the short silence of two seconds, the Pan Am Clipper 1736 broke in with the message that they were still taxiing down the runway. The whole time, however, the air traffic controller had been holding down the transmit button, resulting in two transmissions on the same frequency at the same time. The radio interference after the “okay” caused a high-pitched squeal in the KLM cockpit. Only if you listen carefully can you hear on the tape, in a somewhat distorted though understandable voice, the air traffic controller saying: “Stand by for takeoff . . . I will call you.” But after his “okay,” the attention of the Dutch captain and his first officer was no longer on the radio traffic. It had shifted completely to the takeoff procedure. During takeoff, their workload was so high that no processing capacity remained in their working memory for new input. Everything was filtered away.

The control tower’s response to the Pan Am crew’s message also got lost, although by then the squeal was gone. The answer failed to capture the attention of the Dutch crew because the air traffic controller used, for the first and only time, the call sign Papa Alpha 1736 instead of Clipper 1736. The mental filter of the Dutch crew stood open for Clipper, but not for Papa Alpha. That supposedly other airplane was the least of their worries, so why bother paying attention to that message? When the first officer of the Pan Am Boeing read back the air traffic controller’s message and said “okay, we’ll report when we’re clear,” the flight engineer of the KLM plane must have recognized his voice. That is no coincidence: during takeoff, his workload was the lowest of the flight crew, and he still had some processing capacity available. Hesitantly he asked his colleagues whether the American aircraft had left the runway. The quasi-simultaneous “Sure!” of the captain and the first officer left no room for doubt. Only a few seconds later, when the Pan Am plane suddenly emerged from the thick fog, was the false hypothesis shattered. By that time and at that speed, the collision was inevitable.

After more than 45 years, this accident still offers food for thought, including for the fire service. During fires, the incident commander (IC) usually stands outside while fellow firefighters are working inside the building. They do not see one another; information can only be shared over the radio. The view the IC has of the incident from the outside can differ substantially from that of the colleagues busy inside the building. This can lead the various parties involved to different cognitive frameworks, which in turn can influence how a radio message is interpreted: a single transmission can be understood in a variety of ways by its hearers, and the interpretations can diverge to a dangerously wide extent. And that assumes the radio message is even picked up by the crew inside. In the heat of the moment, many radio messages from the outside may simply not register with the teams working inside, for they are focusing their attention (visual and auditory!) on one task: extinguishing the fire. Moreover, the workload they face is at an all-time high. At such moments, the crew members inside are unlikely to notice something unexpected, for example a general call via the radio or the evacuation horn of their fire truck. These warnings fall on literally deaf ears, as the crew members suffer from inattentional deafness.

View on the incident ground from the block of flats (building B) behind the supermarket.

Sunday, August 11, 2019. Beringen

Radio messages that were misinterpreted led to the biggest disaster in the history of civil aviation. What made things go so wrong that Sunday night in Beringen?

Upon arrival, I found a blazing fire at the front of a derelict supermarket in Coal Mine Lane. Luckily, the fire was not located deep inside the building; the greatest danger could be dealt with by an exterior attack. I assumed we were dealing with arson and that there might therefore be multiple seats of fire inside the building. All the cues of the fire (smoke, temperature, flames, and so on) were a perfect fit for a ventilated fire development. While I conducted my 360-degree size-up around the supermarket, I noticed that at several places light black-grey fumes slowly flowed from the building. I could explain perfectly what I saw: the raging fire at the front of the shop placed the whole structure in overpressure. The building was leaking like a sieve, and through every possible opening the smoke produced by the blazing fire was gently pushed out.

After the fire at the front of the supermarket had been extinguished, the pattern of the smoke did not change. The fumes just kept flowing slowly from the building. There had to be another seat of fire inside. I sent in another team of firefighters along the Saint Barbara Street side. Shortly after entering the building, they reported finding a kitchen fire. My assumption was confirmed: this second fire was the reason smoke kept coming out of the building. Then the scenario changed and, from one moment to the next, nothing made sense anymore. Pitch-black smoke dropped out of the ceiling. Visibility inside was reduced to zero. Immediately, the hot smoke ignited, and the whole building lit up.

Sunday night, August 11, 2019. The fire apparatus of Heusden-Zolder and the fire tanker truck of Beringen at the back of the supermarket. To the left, the façade of building B. (Photo by Hans Put)

According to the Israeli-American psychologist Daniel Kahneman, fire officers decide upon their strategy on the basis of a mixture of automatic and conscious mental processes that he coined WYSIATI, which stands for What You See Is All There Is. Your strategy is basically an automatic or unconscious thought process: a lightning-fast, intuitive conclusion based upon your first impression. Your subjective confidence in that judgment depends almost entirely on the internal consistency of the narrative that your associative brain has spun quickly and automatically from the information at hand at that time, while neither the amount nor the quality of that information counts for much.

In everyday life, too, we come to conclusions quickly and intuitively on the basis of limited information or flimsy evidence. For example, in casual encounters on the street, your first impression of a person has already been formed on the basis of perceived facial expressions and attitude long before you have talked to that person. How does this work? In the higher, associative areas of the cerebral cortex, previously acquired knowledge and prior experiences are stored in the form of latent, potential patterns of neural activity. A certain cluster of neurons or brain cells begins to send tiny electrochemical signals to another cluster with a certain pattern, during a certain time and at a certain frequency. This lets you unconsciously recognize the situation you are dealing with and allows you to instantaneously decide upon which strategy you will deploy.

The original seat of fire is demarcated by barrier tape in the right back corner of building 3.
The second team of firefighters entered the building via the white door of the former cold storage of the supermarket (building 8).

At the start of an incident, you always have much less information at your disposal than at the end. This affects you as an IC more than you might think. The scarcer the information, the faster you jump to conclusions. At that moment, your intuition provides you the certainty that facts cannot offer. All uncertainties and doubts are suppressed. This isn’t helped by the fact that our conscious thought processes barely check the very logical-sounding answers immediately provided by intuitive thinking.

Your first impression at the incident ground can be riddled with fallacies and stereotypes. There is no guarantee that you will succeed in unmasking each of these mental pitfalls during the 360° size-up. On the contrary, due to confirmation bias you may begin to look for those bits of information that are consistent with your first impression. You will focus on the details and aspects that make sense to you and assign too much weight to them, however few those elements might be. Crucial pieces of information that conflict with your strategy you will fail to notice, fail to question, or reason away. Basically, your brain lets you read into the situation what you want to read. As you gain the impression that your assessment of the situation is correct, your subjective confidence in the strategy grows, but this does not make the scenario you have put together any more likely.

There is no problem when your assessment of the situation corresponds sufficiently with what is really going on. If not, the consequences can be dramatic. Like that night in Beringen. No one noticed that the raging fire and the kitchen fire were in fact masking a structure fire in the flat roof of the dilapidated supermarket. Only at the moment the scenario changed and hot smoke came crashing down out of the ceiling did the false hypothesis burst like a soap bubble.

Once you have accepted your own theory (or strategy) and applied it in your reasoning, you become blinded by it, and coming up with new or alternative solutions is hardly possible. Your own theory or mental model is extremely important and remains most influential: without that framework, you have nothing to stand on as an IC and are bound to be indecisive. You obviously, or better, intuitively, trust your own theory. Precisely herein lies the danger. You would do better not to take your own assessment of the situation or strategy at face value. During the incident, you must put it to the test, not once but several times.

That is easier said than done. Once you are in the flow of an incident, it is anything but easy to come to an entirely different insight based on a new analysis of the same situation. Do not confuse this new analysis with analyzing new information or thinking one or two hours ahead in the fire scenario. It is not about that at all. We are dealing with a fundamentally more difficult proposition: you must question the sense of reality you are experiencing at that moment. Is your interpretation of events in tune with what is really going on? Are no other interpretations possible? Colored or clouded by your first impression, your interpretation might not agree to a sufficient degree with what is actually happening. Los Rodeos demonstrates that questioning our sense of reality is extremely difficult. Even the flight engineer’s hesitant questions were not enough to raise doubts. When you are so focused on takeoff that you have interpreted everything in that sense, it is simply inconceivable that the runway might not be clear as you reach liftoff speed.

By definition, we human beings are blind to all pitfalls in our own perception of the situation. We are only dimly aware of the endless stream of assumptions, misconceptions, and first impressions that are irrevocably entwined with our conscious decisions. We hardly question our sense of reality, not only in the run-up to accidents but also throughout daily life. We adhere to one interpretation and neglect to come up with plausible alternatives. Or as Kahneman and colleagues write: “We hold a single interpretation of the world around us at any one time, and we normally invest little effort in generating plausible alternatives to it. One interpretation is enough, and we experience it as true. We do not go through life imagining alternative ways of seeing what we see.”

Buildings 4 and 5 after the fire. (Photo by Hans Put)

‘What Is Your Impression?’

Los Rodeos and the Beringen fire demonstrate that false hypotheses arise easily. But are there ways to puncture those cognitive illusions? Let us return to the incident in Beringen. That night, the fire truck of Heusden-Zolder arrived at the fireground around 2:50 a.m. My colleagues got out, and I led them to the narrow opening in the back wall of the supermarket. That was where I wanted them to enter the building, for they had to locate and extinguish any remaining seats of fire. I told them what we had done so far: “On arrival we found a blazing fire at the front side. That’s extinguished. An additional team was sent into the shop via the right side. They encountered a kitchen fire and have extinguished it. They are now damping down.”

That night, I did what every fireground commander does at each incident ground: I briefed my colleagues and gave a situation report. We do this to create a shared mental model or a shared representation and understanding of the situation. This all seems more obvious than it really is. As the first IC, you lay the foundation for the entire operation. All other decisions will be based on your assessment. You establish the framework wherein the incident will be handled. Little bits of information that you have missed or misinterpreted at the beginning can have a big impact later on. Only after the incident has been dealt with can you know for sure whether your perception of the situation or your sense of reality was sufficiently in accordance with what was actually happening.

But there might be a lot more going on than first meets the eye. What if you are off the mark from the start and slip into a wrong mental model? After all, your first impression might be wrong. Without noticing, you roll into a false hypothesis. Just as at any other incident, you brief the colleagues who arrive later at the incident ground. Before you know it, you end up like The Parable of the Blind, as put on canvas by Pieter Bruegel the Elder in 1568. In this painting, six blind, disfigured men walk hand in hand. The leader of the group has fallen on his back into a ditch, and the others will follow suit.

Instead of immediately briefing your fellow officers or warrant officers, I think it is better to first ask them for their assessment of the situation. Just like the first IC, they form a first impression upon arrival at the incident ground. Their impression can differ substantially from yours, for a number of reasons. The main source of variability in decisions is selective attention. Because of differences in training or experience, your colleagues can hold another opinion about the incident. Elements that have fallen into your mental blind spot might well have caught their eye. Alternatively, they might be able to puncture those sensory cues to which you have assigned too much weight.

The incident itself is another source of variability, especially when your colleagues arrive at the fireground later. Because of the work done by the teams on scene, the situation that later-arriving colleagues encounter will differ profoundly from the one you found. As the work progresses, more and more information becomes available, and all that extra input greatly augments the possibilities for differing points of view. In the case of a false hypothesis, these are all opportunities to burst the bubble.

Since the Los Rodeos accident, international regulations regarding civil aviation have been stepped up. A new model for cockpit interaction known as crew resource management has been implemented worldwide. To reduce human error in the cockpit and improve airline safety, civil aviation goes to great lengths to create an environment in which the captain’s authority can be challenged by colleagues in a respectful manner. Crew members are encouraged to voice their concerns freely to the captain, who must hear them out and reconsider a decision in response to the points raised.

Just like aviation, the fire service has a pronounced hierarchical structure and a culture that emphasizes the chain of command. Encouraging officers and warrant officers to question the understanding of the situation upon arrival at the scene is the best chance we have to puncture a false hypothesis. The first IC must briefly check with them whether the strategy and deployment correspond to their professional judgment and experience. Premature influence must be avoided: people do not like to disagree, which makes it all too easy to reach a quick consensus about a poor judgment, with all the dreadful consequences this entails, for such a false sense of agreement only reinforces the illusion of an alleged accord. Your colleagues must be able to make their own judgments completely independently. In the next phase, their assessments must be aggregated with that of the first IC into a common strategy. The opinion formed independently by a warrant officer or another officer is too valuable to be carelessly neglected. The best we can do is discuss conflicting impressions without getting bogged down in endless talks or, even worse, indecision. The best decisions emerge when teams think together.

As an IC, you must of course be prepared to adopt this way of working and tolerate what it implies for your authority, for only in this way can you take your colleagues’ assessment of the situation into consideration during the decision-making process. You must give your colleagues the space to voice their concerns.

Only when your colleagues feel secure in your team will they be willing to share their ideas and insights with you. In every team, the triangle of trust, honesty, and self-respect is indispensable for open communication. As an IC, you must have confidence in the reports of your colleagues, be willing to believe them, and base your decision-making on them. Each team member must report honestly so that all the different observations are available and usable when the decision is made. Furthermore, everyone must respect themselves and their own opinion, because each person must share it with a superior or the group, even when knowing that the others hold a completely different point of view. The basis for the new consensus or strategy has to be a shared belief, not social influence or pressure.

Can the fire service prevent accidents like the one in Beringen from happening again with a better model of decision-making? Unfortunately not: decisions can, and do, go wrong. Accidents happen, and some incidents are riskier than others. Something that is exceptionally unlikely or exceedingly rare is more likely to remain unrecognized. And first responders are only human. Bias and noise are omnipresent in human physiology and psychology; they are an intrinsic part of human nature. The mind will always select and interpret information based on acquired experience and expectations. Decisions are also made in the context of the real world, and some contexts are more prone to bias than others. The blazing fire at the front of the derelict supermarket that Sunday night in Beringen was, unfortunately, such a context.

Selective bibliography

All information on the Beringen fire and how the mind works is taken from M. Coenen, Anatomy of a Fatal Fire: What Have I Done Wrong? (Tulsa, 2024), complemented with insights from:

  • C. Chabris and D. Simons, The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us (New York, 2010).
  • S. Dekker, The Field Guide to Understanding ‘Human Error’, 3rd ed. (Boca Raton, 2014).
  • D. Kahneman, Thinking, Fast and Slow (New York, 2011).
  • D. Kahneman et al., Noise: A Flaw in Human Judgment (London, 2022).
  • J. Reason, Human Error, 20th edition (Cambridge, 2009).
  • J. Reason, The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries (London and New York, 2016; orig. 2008).
  • P.A. Roitsch et al., Human Factors Report on the Tenerife Accident, Aircraft Accident Report, Air Line Pilots Association, Engineering and Air Safety (Washington, DC, n.d.).

Marc Coenen. (Photo by Hans Put)

Marc Coenen, PhD, an Egyptologist by training, started his career in the fire service as a volunteer officer in the Aarschot Fire Service. Now he is a career officer of the Fire and Rescue Service of South-West Limburg, attached to the fire station of the city of Beringen, Belgium.
