The year is 2035, and an RAF fastjet pilot walks across the pan to her Tempest aircraft.
Externally it looks much like today’s latest stealth combat aircraft, but internally it is a very different machine. It can be operated pilotlessly from the ground and has extensive autonomous operating capability.
So why is the pilot walking out to it?
Anyway, she climbs the ladder to the cockpit, the patient groundcrew perched there ready to help her strap in. But what she steps into is a featureless hole with an ejection seat in it. The seat has the usual harness, a pair of tube connections for oxygen and the G-suit, and cable jacks to connect to her helmet systems. The only signs of anything to interact with are two physical input devices, one for each hand, each carrying just a few mode-selector buttons and a scrolling ball. There are no instrument displays, avionics or switches.
When she connects her helmet to the aircraft, a virtual cockpit suddenly surrounds her, projected on her visor so its components – and the outside world – are delivered in collimated visual form (infinity-focused) wherever she looks.
The mission, from brakes-off to chocks-in, is conducted by gesture control in a synthetic visual world, with 360° directional sound and vision in all three planes, so she can “hear” approaching threat cues as well as see them in the holographic world within her helmet. Directional sound cues enhance her awareness of battlespace threats, which are colour-coded on the visual display as friendly, hostile or unknown. The canopy above her head is Perspex, but the need for eyeball contact with the real airspace environment and its occupants is debatable.
The computing power available to the pilot is formidable. With advanced artificial intelligence (AI) capacity provided by chips that imitate the human brain’s neural processes, the aircraft can take decisions autonomously on behalf of its commander – setting electronic warfare modes, zapping inbound missiles, adjusting mission priorities, selecting targets, arming and firing appropriate weapons, and controlling the aircraft’s trajectory.
Flight trajectory guidance no longer relies on GPS – it’s too easy to jam or corrupt – so it’s directed by quantum sensors, the precision successor to inertial navigation systems.
Back to the original question: what’s the pilot doing there in this deadly environment?
“Team Tempest” is an experimental programme designed to answer questions like that: to push technology to its limits and, above all, to innovate, on the basis that innovation is the most effective way to keep military capability ahead of the enemy.
Team Tempest is led by BAE Systems in partnership with MBDA (weapons systems), Rolls-Royce (propulsion) and Leonardo (sensor systems), backed by the UK Ministry of Defence. It was unveiled on the first day of the Farnborough Air Show last week at BAE’s showroom there, creating considerable industry interest and signalling that BAE intends to be a major player in future combat air systems (FCAS), albeit with partners – almost certainly European ones.
The projected aircraft would have a stealthy design augmented by electronic warfare capability intended to keep it invisible in its airspace while able to track, identify and target traffic, operating as part of a battle group that has surveillance capability.
This presents a formidable task for the human at the centre of it, even one aided by AI.
The fact that AI can control such a powerful weapons system is arguably the very reason for putting a pilot in the decision-making and control loop. The science-fiction nightmare of intelligent robotic systems taking over powerful weaponry means that limits have to be set on what AI-managed systems are capable of doing. Presumably the pilot could elect to remove the limits. Warfare is not just technology.
Prof Nick Colosimo, in charge of “disruptive technologies” at BAE Systems, qualifies the use of AI in future combat air systems, saying it will be “trusted, scalable, human-in-loop”. His approach has a tentative, experimental feel to it.
Meanwhile, if the RAF is going to put its pilots in charge of the Tempest in 2035, whether in the aircraft or remotely controlling the system from the ground, what training will they need, and what tools will enable them to stay in full cognitive control of the aircraft’s mission while allowing AI to direct it? Can the human brain keep up with the AI?
Team Tempest’s task is to provide answers to those questions, to enable significant system upgrades for the Typhoon and Lightning through their service lives and, eventually, to replace them with the next generation of combat air systems.
Keeping humans in effective control is the parallel task.