A Multiple Resource Approach to Pilot Safety
- Editorial Team

- Apr 10
- 4 min read

Twenty-first-century pilots are not limited by technology. They are limited by attention.
Despite the sophistication of 21st-century flight decks, pilots are losing the battle against cognitive overload. As modern flight systems demand increasing visual and auditory focus, the brain’s ability to process simultaneous alerts begins to fracture.
When sensory channels become saturated, performance does not decline gradually: it collapses abruptly, and critical information is filtered out at the very moment it matters most.
This challenge is best understood through Multiple Resource Theory, which proposes that the brain does not rely on a single pool of attention, but on several independent resource systems tied to different senses, processing types, and ways of thinking.
The implication is simple but profound.
Safety does not improve by adding more information to an already overloaded channel.
Safety improves by distributing information across channels the brain can use.
To break through, we must look beyond what pilots see and hear, and start focusing on what they feel.
The Problem: When More Information Means Less Awareness
For decades, aviation advancement has prioritized visual and auditory interfaces, from Head-Up Displays (HUDs) to 3D spatial audio.
However, we are reaching a point of diminishing returns. Despite these sophisticated systems, nearly 80% of Class A mishaps are still linked to human factors, specifically failures in situational awareness.
Hypoxia is another factor threatening pilot performance and contributing to Class A mishaps, which we explored in a previous blog post.
The core of this issue lies in Multiple Resource Theory (MRT).
Developed by Christopher Wickens, MRT argues against the idea that human attention is a single, central reservoir.
Instead, it proposes that the brain possesses several independent "pools" shaped by the type of sense used, the kind of processing required, and the information's nature (spatial vs. verbal).

MRT visualises human attention as a structured 3D space.
Each axis represents a different way the brain handles information:
Modalities: Information enters through the eyes (Visual) or ears (Auditory).
Codes: Information is processed as either "Spatial" (where things are) or "Verbal" (what things mean).
Stages: We move from Perception (seeing) to Cognition (thinking) to Responding (acting).
Under this framework, multitasking fails when two tasks compete for the same resource pool, while performance is maintained when tasks are distributed across different, underutilised channels.
Within-Modality Competition
When two tasks require the same sensory channel, they compete for the same resource pool.
Two auditory signals arriving at the same time (e.g., a radio call and a collision warning) fight for the same perceptual pathway.
But competition can be just as strong when tasks share the same type of processing. Two tasks requiring spatial processing, such as monitoring aircraft orientation while tracking a moving target, drain the same mental resource just as quickly.
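As a rough sketch (not an implementation of Wickens' model), the interference logic above can be expressed by describing each task by the MRT dimensions it occupies and counting where two tasks collide. The task definitions below are hypothetical examples:

```python
# Toy sketch of Multiple Resource Theory interference (illustrative only).
# Each task is labelled with the MRT dimensions it occupies:
# modality (visual/auditory/tactile), code (spatial/verbal),
# and stage (perception/cognition/response).

def shared_pools(task_a: dict, task_b: dict) -> list:
    """Return the MRT dimensions on which two tasks compete."""
    return [dim for dim in ("modality", "code", "stage")
            if task_a[dim] == task_b[dim]]

# Hypothetical cockpit tasks with assumed MRT coordinates.
radio_call = {"modality": "auditory", "code": "verbal", "stage": "perception"}
collision_warning = {"modality": "auditory", "code": "verbal", "stage": "perception"}
tactile_threat_cue = {"modality": "tactile", "code": "spatial", "stage": "perception"}

# Two simultaneous auditory signals compete on every dimension -> high interference.
print(shared_pools(radio_call, collision_warning))   # ['modality', 'code', 'stage']

# A tactile cue shares almost no pools with the radio call -> low interference.
print(shared_pools(radio_call, tactile_threat_cue))  # ['stage']
```

The point of the sketch is the asymmetry in the two outputs: redistributing a cue to an unused modality removes most of the overlap, which is exactly the MRT argument for tactile channels.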
The Cognitive Bottleneck
When the visual and auditory channels are saturated, the brain reaches a bottleneck where performance collapses.
Alarms go unheard, and critical displays are "seen" but not consciously processed.
Modality Independence offers an opportunity.
Information arriving via an unused sense (like touch) does not compete for the resources already consumed by sight or sound.
Tactile signals are processed in the somatosensory cortex, a pathway that typically remains far less burdened in aviation, even when visual and auditory channels are saturated.

Cross-Modal Failures: When the Brain Protects Itself
In high-stress environments, the brain's attempt to manage these multiple resources leads to measurable neurological "gating" effects.
When one modality is under extreme demand, the brain suppresses other senses to preserve processing power for the primary task. This often results in two catastrophic phenomena:
Inattentional Blindness: A pilot may fixate on a single display or task (attentional tunneling) and fail to see other critical cues, even though they are within their field of view.
Inattentional Deafness: Under high visual workload, the brain can suppress auditory processing. In one study, 39.3% of pilots in simulated emergencies failed to register critical auditory alarms (Dehais et al., 2013).
Essentially, the brain's processing resources are spread across too many competing demands at once. It is forced to choose what information to prioritize, which is why even elite pilots can experience a total performance collapse despite years of training.
The Operational Implication for Military Aviation
MRT suggests that adding "better" visual alerts to an overloaded cockpit does not solve the problem: it intensifies it.
When a sensory channel is saturated, additional information becomes noise rather than guidance. Effective system design requires redistributing critical information to channels that remain available.
This is not a technology problem. It is an architecture problem.
The solution, therefore, lies in offloading essential cues such as threat direction, aircraft orientation, or system warnings to an independent sensory pathway.
The Somatosensory Solution
Touch provides that pathway.
Unlike visual signals, which require complex optical and cognitive processing, tactile stimuli translate physical pressure directly into neural impulses.
Bottom-Up Processing
Tactile cues are "bottom-up" signals, meaning they draw attention automatically at a pre-conscious level.
They force attention without requiring the effortful, deliberate thinking that stress and oxygen deprivation (hypoxia) tend to destroy.
Sensory Redundancy
By providing a "physical anchor" on the pilot's body, tactile systems create a cross-modal reference frame.
When the visual system and the vestibular system (inner ear) provide conflicting data (causing spatial disorientation), the brain can trust the body-referenced tactile signal as an additional, stable, and instinctive anchor.
Faster Reaction Times
Tactile signals reach the brain roughly 45 ms faster than visual stimuli.
At jet speeds of 270 meters per second, that extra 45 ms provides crucial meters of decision space.
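The back-of-the-envelope arithmetic behind that claim, using only the figures quoted above:

```python
# Distance covered during the 45 ms tactile-vs-visual latency gap,
# at the jet speed quoted in the text.
speed_m_per_s = 270      # aircraft speed, meters per second
latency_gap_s = 0.045    # tactile signals arrive ~45 ms earlier

extra_decision_space_m = speed_m_per_s * latency_gap_s
print(f"{extra_decision_space_m:.2f} m")  # 12.15 m
```

Roughly twelve meters of additional decision space per alert, before the pilot has consciously processed anything.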
Superior Accuracy
In high-noise environments where helicopter pilots lose 3D audio clarity (dropping to 53% accuracy), tactile cueing maintains a staggering 91% localization accuracy.
Conclusion: Respecting the Biological Limit
We cannot redesign human biology, but we can redesign our machines to respect it.
Multiple Resource Theory tells us that visual and auditory channels are operating near their limits in the modern cockpit.
By embracing tactile communication, we aren't just adding a new sensor to the cockpit: we are providing a structural adaptation to the realities of human cognition.