Honeywell Demonstrates ‘Mind Control’
The engine and avionics manufacturer is experimenting with new ways for pilots to interact with their aircraft.
One of Honeywell's King Airs was modified to accept "brainwave" flight control as the company moved its trans-cranial neural sensing and characterization system from the flight simulator to actual flight.

This month’s long-anticipated release of the most recent chapter in the Star Wars saga will no doubt captivate audiences with scenes of futuristic spacecraft skirmishing in a sub-orbital dogfight, their pilots’ hands firmly on the controls, as they have been since the very first aerial combat. Yet researchers at Honeywell are working on something even diehard science fiction fans would find astonishing: an aircraft control system that responds to mental commands. At the recent NBAA Convention in Las Vegas, guests at the company’s annual Aviation Forecast dinner were able to gaze into the engine and avionics manufacturer’s “crystal ball” and see technologies that it believes are on the horizon, among them trans-cranial neural sensing and characterization, which receives and analyzes faint signals from parts of the brain near the skull, using a specialized headset.


The company initially developed the technology in conjunction with one of the U.S. intelligence agencies in the early 2000s to enhance analysts’ ability to sort through reams of satellite images looking for specific “targets of interest.” The human brain can recognize an image faster than it can consciously process it; the sensors were able to detect a match on images flashed at rates of up to 15 per second. According to Bob Witwer, Honeywell’s vice president of advanced technology, use of the system raised the analysts’ throughput as much as ten-fold.


Human-machine Interface


With that experience in mind, researchers began to look for other applications such as the ways humans interact with machines. “One of the things that we’re really focused on at Honeywell Aerospace, and certainly with my team, is recognizing that we’ve got more human-machine interface modalities available to us than we’ve ever had before,” Witwer told AIN. “People expect to be able to interact with a computer in some cases by touching or by using their voice, so neural technology from that point of view is just another potential modality for interacting with machines.”


The Arizona-based company has applied that technology to aircraft controls for more than a decade, with tests first on a Boeing 737 flight simulator and then, this summer, in actual flight-testing on one of its modified King Airs. The research has been successful, with the company conducting 10 flights thus far. In each case, the pilot was trained to associate patterns of lights flashing on a control panel grid with a desired movement of the airplane. When the pilot's brain recognizes the flash of the lights across the top of the grid, for example, matching his intention to make the aircraft climb, the neural-sensing headset reads that faint recognition signal and translates it into a command to the aircraft controls.


Given that this is just a demonstration of the neural sensing capability, the Honeywell advanced technology staff is still feeling its way toward the best means of interaction. “I don’t know that this would be the specific implementation that we would use,” explained Witwer. “In fact, it wouldn’t be my preference because having lights flash in a cockpit is not something pilots normally like to see.” Another round of experiments, slated to begin this spring, will attempt to sample impulses from a different brain structure, the supplementary motor cortex, where the mind plans movement. “What if I imagine that I’m going to move my left elbow up, and if I move my left elbow up that means I want the airplane to bank to the right?” Witwer said. “You don’t have to move anything; you just imagine the movement. We start that in the supplementary motor cortex and then we can control the airplane.”


Witwer stresses that these modalities are not necessarily the direction Honeywell is taking for primary human-machine interfaces. Rather, they could find application in a supplementary or even emergency capacity, where instantaneous recognition of, and response to, an abnormality could help avert further danger.


“Even though it sounds really cool, like science fiction, the truth is, from our point of view, most of the stuff that we are doing in our industry today was science fiction at some point in the not-so-distant past,” said Witwer, noting that such now-commonplace advances as GPS, cellphones and personal tablet computers did not exist when he joined the industry in 1980.