Baffling to anyone other than pilots and aerospace engineers, the rows of displays and controls condense vital information to enable safe operation of the aircraft – and mission completion.
The cockpit of the future will be decidedly more minimalist. Projects such as the Lockheed Martin F-35 have already streamlined controls, introducing large customisable touchscreens, but upcoming fighters will take the approach even further.
The British-led Tempest project, for example, will replace most of the aircraft’s physical controls with augmented-reality and virtual-reality (AR and VR) systems projected directly inside the visor of a pilot’s helmet. Known as the ‘wearable cockpit’, the technology is in development at BAE Systems. Developers hope the system, designed to provide pilots and ground operators with split-second advantage, will also provide ‘instant’ configurability before missions.
The project was inspired by BAE’s previous work on fighter-jet cockpits, where engineers found that space for controls and displays quickly came at a premium whenever aircraft capabilities were upgraded.
“Every time we want to upgrade a sensor or add a new weapon to the airframe, we’re scrabbling around almost for buttons, and for display space, to try and make it still coherent for the operator,” said Suzy Broadbent, human factors manager at BAE Systems Air. “We’re running out of space – the technology is moving so fast, and the weapons and sensors are moving so fast, that there’s more information than we could display.”
To tackle the issue of dwindling cockpit real estate, the Tempest team decided to focus on virtual and augmented displays, and other technology such as gaming controllers. The focus on software instead of hardware is designed to allow quick reconfigurations without costly groundings.
An artist's impression of how the virtual cockpit could look (Credit: BAE Systems)
This ‘common cockpit’ approach linked up with work at BAE Systems Rochester, where engineers were integrating the aircraft head-up display into the new Striker helmet to overlay key information over the pilot’s view of the outside world. The second Striker helmet introduced full colour, making the ‘wearable cockpit’ system more feasible for integration within actual flight hardware – and allowing for a radical reinterpretation of the physical cockpit. “When you look into the cockpit, you’ll see a cockpit there, but it really will just be black in the cockpit itself – no real buttons or displays, but all overlaid there virtually,” said Broadbent.
The aircraft will still include a joystick on one side and a throttle on the other, each with numerous buttons for interacting with the system – needed for safety reasons or during high G-force manoeuvres – but the wearable cockpit will also use eye tracking, hand gestures and voice controls. Eye tracking will enlarge virtual displays when the pilot focuses on them, for example, to prevent information overload. A pilot’s gaze could also change the ‘mode’ of certain buttons, while gestures could ‘pick up’ and move displays.
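The gaze-driven behaviour described above can be pictured with a minimal sketch: the virtual panel nearest the pilot's gaze point enlarges, while the others return to their resting size. Every name, coordinate and threshold here is an illustrative assumption, not BAE Systems code.

```python
# Hypothetical sketch of gaze-driven display scaling in a 'wearable
# cockpit': the panel closest to the gaze point enlarges, the rest reset.
# Panel names, angular coordinates and thresholds are all assumptions.
from dataclasses import dataclass


@dataclass
class VirtualPanel:
    name: str
    x: float          # panel centre in the pilot's field of view (degrees)
    y: float
    scale: float = 1.0


def update_focus(panels, gaze_x, gaze_y, radius=5.0, zoom=1.5):
    """Enlarge any panel within `radius` degrees of the gaze; reset others."""
    for p in panels:
        dist = ((p.x - gaze_x) ** 2 + (p.y - gaze_y) ** 2) ** 0.5
        p.scale = zoom if dist <= radius else 1.0
    return panels


panels = [VirtualPanel("radar", -10, 0), VirtualPanel("fuel", 10, 0)]
update_focus(panels, gaze_x=-9.0, gaze_y=1.0)
print([(p.name, p.scale) for p in panels])  # the radar panel is enlarged
```

A real system would smooth the gaze signal and add a dwell time before zooming, so brief glances do not make displays jump around.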
Perhaps the most futuristic element of the project is the potential inclusion of a virtual co-pilot. BAE is using its experience with autonomous aircraft to introduce adaptive autonomy within Tempest – the system will share tasks between the pilot and computer to best manage a situation.
“We want to end up with this kind of adaptive autonomy, where, depending on how overloaded the pilot is, the information can switch priorities between the human and the machine to make sure that the task is completed,” said Broadbent.
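The task-sharing idea Broadbent describes can be sketched as a simple allocation rule: as the pilot's estimated workload rises, fewer tasks stay with the human and more shift to the machine. The workload scale, capacity, task names and priority scheme below are hypothetical, chosen only to make the switching behaviour concrete.

```python
# Illustrative sketch of 'adaptive autonomy' task sharing: the pilot keeps
# the most urgent tasks up to a capacity that shrinks with workload, and
# the machine takes the rest. All names and numbers are assumptions.
def allocate_tasks(tasks, pilot_workload, capacity=3):
    """Split tasks between pilot and machine based on estimated workload.

    `pilot_workload` is a 0..capacity estimate; priority 1 is most urgent.
    """
    slots = max(0, capacity - pilot_workload)
    ordered = sorted(tasks, key=lambda t: t["priority"])
    pilot = [t["name"] for t in ordered[:slots]]
    machine = [t["name"] for t in ordered[slots:]]
    return pilot, machine


tasks = [
    {"name": "evade threat", "priority": 1},
    {"name": "update route", "priority": 2},
    {"name": "manage sensors", "priority": 3},
]
print(allocate_tasks(tasks, pilot_workload=2))
# a heavily loaded pilot keeps only the most urgent task;
# at low workload the same call leaves every task with the pilot
```

The hard part in practice is the workload estimate itself, which would draw on cues such as gaze, control inputs and mission phase rather than a single number.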
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.