Honeywell Aerospace is researching a series of advanced technologies, ranging from sensors that tap into brainwaves to control basic aircraft maneuvers, to speech-recognition equipment, to synthetic-vision advancements, to create a cockpit environment in which the pilot could use any number of means to perform a mission. Bob Witwer, v-p of advanced technology for Honeywell Aerospace, said the company’s goal is to “give pilots what they need, only what they need and when they need it.”
The effort is examining a range of input modes, from visual, aural, manual and automatic control to, possibly one day, mind control. These options must be intuitive, unambiguous and easy to understand, Witwer said. The idea is to “design the modality to match the mission,” he said. Witwer cited the proliferation of touchscreens as an example: a welcome advancement, he said, but one that can be difficult to use in turbulence. If the hands are busy, he suggested, speech recognition might be a better option.
Honeywell recently opened some of its advanced technology labs to give reporters a glimpse of the research under way to create this cockpit environment. Some of the projects coming out of the labs are already in service or are mature and close to moving into production and marketing. Others are in the early stages and might not reach the market for years, and those that do might take different forms.
One of the more far-reaching avenues of research is neurotechnology. In this lab Honeywell demonstrated real-time neural control of an aircraft simulator, and the company has also demonstrated neural control in a King Air, using the inputs to command basic pitch and roll functions.
The demonstration uses sensors attached to the inside of a helmet or other head covering that detect electrical impulses in the brain and send them to the avionics. The operator looks at a display of nine arrows, each representing a direction (such as up, down or up-and-to-the-right), and focuses on one; that focus sends a signal for the simulator, or the King Air, to move in that direction.
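As a rough illustration of the concept only, the sketch below shows how a detected “focused arrow” could be translated into a basic pitch-and-roll command. This is not Honeywell’s implementation; the arrow names, command values and the idea of a classifier reporting a single focused arrow are assumptions for the sake of the example.

```python
# Minimal sketch (hypothetical, not Honeywell's implementation) of mapping a
# brain-computer-interface "focused arrow" to a basic pitch/roll command.
# Arrow names and command magnitudes are invented for illustration.

# Each of the nine arrows maps to a (pitch, roll) demand in degrees.
ARROW_TO_COMMAND = {
    "up":         (5.0,   0.0),
    "down":       (-5.0,  0.0),
    "left":       (0.0, -10.0),
    "right":      (0.0,  10.0),
    "up-left":    (5.0, -10.0),
    "up-right":   (5.0,  10.0),
    "down-left":  (-5.0, -10.0),
    "down-right": (-5.0,  10.0),
    "center":     (0.0,   0.0),   # neutral: no maneuver commanded
}

def neural_command(focused_arrow: str) -> tuple[float, float]:
    """Translate the arrow the operator is focusing on into a pitch/roll demand."""
    return ARROW_TO_COMMAND.get(focused_arrow, (0.0, 0.0))

if __name__ == "__main__":
    # Example: the (hypothetical) classifier reports focus on the up-right arrow.
    pitch, roll = neural_command("up-right")
    print(f"Commanded pitch: {pitch} deg, roll: {roll} deg")
```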
Honeywell Aerospace scientist Santosh Mathan said this research could open possibilities ranging from the study of human factors, such as brain response during loss of control, to training, to performing basic cockpit functions such as pulling up a certain map. The research also has applications outside aerospace, such as studying how the brain functions after a stroke or chemotherapy. Honeywell has partnered or is in discussions on the research with a number of entities, from the U.S. Defense Advanced Research Projects Agency to universities.
Another of the company’s labs focuses on speech. The speech-recognition lab is studying not only possibilities for voice command but also transcription of ATC dialogue to augment those communications. The transcriptions would provide a quick reference of commands, particularly in critical phases of flight when workload introduces the potential for distraction. First detailed during NBAA’s annual meeting last fall, the ATC research has been under way for three or four years and must still iron out differences in language, dialects and accents. As for voice command, which is tested in a sound room that can simulate the noise of a cockpit in flight (for demonstration purposes researchers used the typical background noise of a Falcon 900), Honeywell sees opportunities to enable the pilot to issue infrequently used commands or call up deeply buried menus.
Honeywell’s flight simulator lab continues to advance the use of synthetic vision and combined vision to improve situational awareness in poor weather. Some of these technologies will be folded into Honeywell’s SmartView suite. A synthetic visual aid for taxiing is mature to the point that it is nearly ready to move from research to production. The display will depict a “parasail” view of an airport runway, showing important markings and runway identifiers that can be hard to see in poor weather. It will also pop up a virtual barrier when the aircraft taxis to a hold-short point and drop the barrier when the aircraft has been released.
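A minimal sketch of how such a hold-short barrier might be driven, assuming a simple distance threshold and an ATC-release flag; both are hypothetical, and this is not Honeywell’s SmartView logic:

```python
# Hypothetical illustration (not Honeywell's SmartView logic) of raising and
# dropping a virtual hold-short barrier on a taxi display. The distance
# threshold and the notion of an explicit ATC "release" flag are assumptions.

def barrier_visible(distance_to_hold_short_m: float,
                    cleared_by_atc: bool,
                    threshold_m: float = 100.0) -> bool:
    """Show the barrier when the aircraft nears a hold-short point it has not
    been cleared to cross; drop it once the release is received."""
    return distance_to_hold_short_m <= threshold_m and not cleared_by_atc

if __name__ == "__main__":
    print(barrier_visible(80.0, cleared_by_atc=False))  # True: barrier pops up
    print(barrier_visible(80.0, cleared_by_atc=True))   # False: barrier drops
```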
Another lab simulates cockpit motion and has been used to improve touchscreen technology. The lab can test pilot responses and even measure muscle input to track fatigue, and it can track where a pilot looks when confronted with various tasks, aiding human-factors research. Evaluating that range of responses paints a picture of how “everything flows together” in the broader cockpit environment.