Honeywell Advanced Technology Highlighted During Lab Visit

Aviation International News » April 2014
April 2, 2014, 12:45 AM

In January, Honeywell opened the doors of its advanced-technology facility in Deer Valley, Ariz., and shared details of what its engineers and scientists are exploring for possible use in future aircraft programs. The work includes touchscreen controls, gesture-based avionics manipulation, haptic feedback devices, voice control and even transcranial neural sensing.

Few of these human-machine interfaces will appear in any cockpits soon, but Honeywell’s experts are exploring new avenues toward making aircraft safer and more efficient.

As a point of comparison, Jary Engels, Honeywell chief test pilot, flight operations, hosted a ride in the company’s new G650, which provided a look at the current state of the avionics art. Given real-world development and certification constraints, the G650’s Honeywell Primus Epic-based PlaneView flight deck is one of today’s benchmarks. And because Honeywell also supplies many other components, such as the APU, environmental control system, pressurization system and satcom, the G650 incorporates many interconnection features through which these products share information. “Gulfstream is as invested in advanced design as anybody,” said Mike Rowley, Honeywell vice president of worldwide sales.

For pilots who are accustomed to interacting with avionics via knobs and buttons, Primus Epic moves them into the world of cursor control. The fundamental design, Engels explained, is an “object-action user interface”: moving the cursor over an object on a screen and clicking on it reveals what can be done with that object. Clicking a waypoint on a flight plan, for example, might reveal a list of available actions, including setting up a hold or an altitude restriction. This eliminates the need for the pilot to drill down through layers of menus to take those actions.
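
In software terms, the pattern is easy to picture: each on-screen object carries its own menu of valid actions, and a click simply reveals them. Here is a minimal sketch in Python, with hypothetical object and action names, not Honeywell’s implementation:

```python
# Minimal sketch of an object-action user interface: clicking an object
# reveals the actions valid for that object, rather than burying them in
# menu layers. Names and actions here are illustrative, not Honeywell's.

from dataclasses import dataclass, field

@dataclass
class FlightPlanObject:
    name: str
    actions: dict = field(default_factory=dict)  # action label -> handler

    def click(self):
        """Simulate a cursor click: list the actions available for this object."""
        return list(self.actions)

def set_hold(wpt_name):
    print(f"Hold set at {wpt_name}")

def set_altitude_restriction(wpt_name):
    print(f"Altitude restriction set at {wpt_name}")

# A hypothetical waypoint on the flight plan.
waypoint = FlightPlanObject(
    "WPT01",
    actions={"Hold": set_hold, "Altitude restriction": set_altitude_restriction},
)

print(waypoint.click())                   # ['Hold', 'Altitude restriction']
waypoint.actions["Hold"](waypoint.name)   # Hold set at WPT01
```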

Another advantage of this so-called graphical flight planning is that both pilots can easily see these actions on the big displays and verify that the correct action has been selected before implementing the change. This is much more transparent than trying to interpret each other’s finger-pushing on a pedestal-mounted FMS.

Big-display Benefits

Looking at the Honeywell SmartView synthetic vision system (SVS) in the G650 revealed the real benefit of large LCDs in cockpits. Turning the SVS on and off made plain the improvement offered by the animated outside view that the database-driven system provides. The profile view of the planned flight at the bottom of the MFD adds even more situational awareness, especially when flying around the mountainous terrain surrounding Phoenix.

As Bob Witwer, Honeywell’s vice president of advanced technology, explained when asked about SVS, “We’re not going to go back. Nobody wants to view a 2-D display.”

The G650 is equipped with a Rockwell Collins HUD II head-up display in the left-seat position. But Honeywell also replicates HUD symbology on the PFDs, with the flight-path marker showing where the aircraft is going just as the HUD does. When the aircraft is nearing the destination airport, the PFD also shows a runway centerline leading into a runway outline symbol, all superimposed on the SVS view of the terrain and outside world, further reinforcing situational awareness.

Honeywell is working on getting credit for lower landing minimums (down to 100 feet) with synthetic vision displays, and this will be called synthetic vision lower minimums (SVLM). Lower minimums are already available for enhanced flight vision systems (EFVS), with which the G650 is equipped.

“For a normal approach to 200 feet,” Engels explained, “there’s no special training required beyond the instrument rating.” What SVLM technology does, he noted, is give the pilot a more natural way to see an approach. “If you look at the instruments with which we used to fly approaches, we had to interpret the flat [2D] display and follow the needles, and at 200 feet the pilot had to look up, providing only 200 feet to make that transition to the outside world. You need to have this picture in your head,” he explained.

Honeywell’s focus on getting SVLM approved for lower minimums stems from its research into flight operations in poor weather close to the ground, as is the case at the end of an instrument approach. HUD pilots like being able to see the outside world through the HUD combiner glass in front of their faces, while also seeing the HUD symbology showing flight instruments and avionics guidance on the glass.

“We took that [symbology] and put it on a head-down display,” Engels said. “Now when I get to 200 feet I already know what I’m going to be seeing.” What he means is that the SVS shows what the outside world looks like, and if the SVLM has a high degree of fidelity, the runway depicted on the SVS will exactly match the runway that appears out of the mist when the pilot looks up and outside the windshield at 100 feet. “The display puts me in a more accurate position and enables me to go lower with the same level of safety I had before,” he said. “That’s what we’re trying to get credit for. [SVS] now has a tangible value: you can land when you couldn’t before.”

Honeywell is also working on a combination of EFVS and SVS, called a combined vision system (CVS), and has done many hours of testing in simulators and airplanes. In a CVS, the infrared (or other sensor) imagery from the EFVS is superimposed on the SVS view, so the pilot sees not only the SVS animated view but also what is really there: the runway environment as well as hazards that could be on the runway, such as animals, other aircraft or vehicles. Obviously SVS alone doesn’t show these elements, although with ADS-B In, SVS could depict ADS-B Out-equipped vehicles.

Real-life Taxi Display

Honeywell showed AIN a further advancement of these technologies: a new system that takes advantage of SVS when taxiing, whether after landing or before takeoff. Pilots are now well accustomed to flying the magenta line on graphical displays, but what if this concept could be adapted to ground operations? “Once we have ADS-B, we can see all vehicles [on the airport],” said Rakesh Jha, the advanced-technology facility’s director of crew interface and platform systems.

Many aircraft, including the G650, already show a strategic view of the airport surface: a taxi diagram with own-ship position so pilots can see their exact location on the airport. But that provides no information about other aircraft or vehicles, nor any guidance to the destination on the surface.

What Honeywell has developed and hopes to bring to market is a 3-D view of the surface environment, essentially synthetic vision optimized for airport operations. Pilots would input a “taxi plan” that paints a magenta line on the airport chart so they know exactly where to go; the route could be created by the pilots or downloaded from ATC via datalink.

This presents the pilot with two screens of information; on the left is the typical PFD, but with a 3-D SVS view from an exocentric point in space above and behind the airplane. The magenta line shows the route, and the designation of the taxiway the aircraft is on is “embedded” on the line. A red-and-white segmented circle highlights airport hot spots where pilots must pay extra attention. When the aircraft approaches an intersection, the signs labeling the intersecting taxiways fade in, positioned in the middle of the virtual concrete so they are difficult to misinterpret. In the SVS world, signs can be placed in useful spots and aren’t bound by real-world constraints such as size (you wouldn’t want to scrape them with a wing) and location (you don’t want a physical sign standing in the middle of a taxiway). When flying with this system, pilots will see a transition from regular in-flight SVS to the ground system after touchdown, or the reverse for takeoff.
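
The fade-in behavior lends itself to simple distance-based blending: a sign is invisible far from its intersection and fully opaque close to it. A sketch of the idea, with assumed thresholds rather than Honeywell’s actual values:

```python
# Sketch of distance-based fade-in for virtual taxiway signs: invisible far
# from the intersection, fully opaque close to it. Thresholds are assumed
# for illustration, not Honeywell's values.

FADE_START_M = 300.0    # begin fading in at this distance (assumption)
FULL_OPACITY_M = 100.0  # fully opaque from this distance inward (assumption)

def sign_opacity(distance_m: float) -> float:
    """Return sign opacity in [0, 1] given distance to the intersection."""
    if distance_m >= FADE_START_M:
        return 0.0
    if distance_m <= FULL_OPACITY_M:
        return 1.0
    # Linear blend between the two thresholds.
    return (FADE_START_M - distance_m) / (FADE_START_M - FULL_OPACITY_M)

for d in (400, 300, 200, 100, 50):
    print(f"{d:>4} m -> opacity {sign_opacity(d):.2f}")
```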

The opportunities for safety and efficiency offered by this system are tremendous, according to Witwer. “We’ve made huge strides in automated flight, but not on the ground,” he said. Honeywell has developed an electric taxi system that can move an airplane to and from the runway without having to run the airplane’s engines. “Once we have a taxi plan, we could use it to direct an electric taxi system.”
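
The pipeline Witwer describes could be as simple as translating each segment of the taxi plan into speed and heading targets for the electric taxi controller. A rough sketch under assumed interfaces; this is not Honeywell’s electric taxi API:

```python
# Sketch of feeding a taxi plan to an electric taxi system: each route
# segment becomes a speed/heading target, slowing through charted hot
# spots. Interfaces and speeds are assumptions, not Honeywell's design.

from dataclasses import dataclass

@dataclass
class Segment:
    taxiway: str
    heading_deg: float
    length_m: float
    hotspot: bool = False

def commands_for(plan):
    """Yield (speed m/s, heading deg, duration s) targets per segment."""
    for seg in plan:
        speed = 3.0 if seg.hotspot else 8.0  # assumed taxi speeds
        yield speed, seg.heading_deg, seg.length_m / speed

plan = [
    Segment("A", heading_deg=90.0, length_m=500.0),
    Segment("B", heading_deg=180.0, length_m=250.0, hotspot=True),
]
for speed, heading, duration in commands_for(plan):
    print(f"taxi at {speed:.0f} m/s on heading {heading:.0f} for {duration:.0f} s")
```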

Rethinking the Human-machine Interface

Witwer and Jha showed AIN more of Honeywell’s explorations into human-machine interfaces. “What Rakesh’s team is doing is enhancing the pilot experience,” Witwer explained, “making it safer and more intuitive.”

Honeywell scientists have been experimenting with using neural signals to control a Boeing 737 simulator, for instance, using “transcranial neural sensing and characterization.” As Witwer explained, the action required for a pilot to push a button relies on a complicated, well-orchestrated exchange of activity among brain, nerves and muscles. One significant challenge with such an interface is differentiating whether the brain intends to do something or is merely contemplating the action. “What about thinking of the intent to move?” he asked. “We’ve been working on this field for about eight years. But it’s not going into a cockpit soon.”

Another modality the Honeywell scientists are testing, which might find an application sooner than neural sensing, is voice control. This is already available for limited applications, such as frequency selection using Garmin’s Telligence system, but Honeywell sees wider possibilities. One application demonstrated for us was the use of voice commands during taxiing, telling the taxi plan system where to go (“apron 2” or “gate 11”) and so on. This could also be combined with a touchscreen, so the pilot would simply touch the destination on the airport diagram to update the taxi plan. Or it could be a combination of the two, using voice to highlight the apron and touch to select the specific gate. Voice-recognition technology could also help ATC know who is calling from which aircraft. “We’re experimenting,” said Witwer.
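
One way to picture the multimodal combination: either modality, voice or touch, can fill in the taxi destination, and the more specific input wins. The command grammar and names below are assumptions for illustration:

```python
# Sketch of multimodal taxi-destination input: voice can set the area
# ("apron 2"), touch can pick the specific spot (gate 11), and either
# modality alone also works. The grammar is an illustrative assumption.

import re

def parse_voice(utterance: str):
    """Map a spoken phrase to a (kind, identifier) pair, if recognized."""
    m = re.fullmatch(r"(apron|gate|stand)\s+(\w+)", utterance.strip().lower())
    return (m.group(1), m.group(2)) if m else None

class TaxiPlanInput:
    def __init__(self):
        self.area = None  # e.g. ("apron", "2"), typically set by voice
        self.spot = None  # e.g. ("gate", "11"), set by touch or voice

    def on_voice(self, utterance: str):
        dest = parse_voice(utterance)
        if dest is None:
            return
        if dest[0] == "apron":
            self.area = dest
        else:
            self.spot = dest

    def on_touch(self, kind: str, ident: str):
        self.spot = (kind, ident)

    def destination(self):
        return self.spot or self.area  # the more specific input wins

req = TaxiPlanInput()
req.on_voice("apron 2")      # voice highlights the apron
req.on_touch("gate", "11")   # touch picks the specific gate
print(req.destination())     # ('gate', '11')
```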

Haptic feedback, in which a knob, switch or even a throttle gives a physical response to the user, offers potential benefits, too. A haptic throttle, for example, could add resistance if the pilot tried to use too much power while taxiing. “This could be good for training,” Jha said.
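
The taxi-throttle example maps naturally to a resistance curve that stays light below a taxi power limit and ramps up steeply beyond it. A hypothetical sketch; the limit and gains are assumptions, not Honeywell’s tuning:

```python
# Sketch of a haptic taxi throttle: resistance stays light up to an assumed
# taxi power limit, then ramps up sharply to discourage excess power.
# The limit and gains are illustrative, not Honeywell's tuning.

TAXI_POWER_LIMIT = 0.30  # assumed fraction of full throttle for taxi
BASE_RESISTANCE = 0.05   # light friction for normal feel (assumption)
PENALTY_GAIN = 4.0       # how sharply resistance grows past the limit

def throttle_resistance(lever_position: float) -> float:
    """Return normalized resistance (0-1) for a lever position in [0, 1]."""
    excess = max(0.0, lever_position - TAXI_POWER_LIMIT)
    return min(1.0, BASE_RESISTANCE + PENALTY_GAIN * excess)

for pos in (0.1, 0.3, 0.4, 0.6):
    print(f"lever {pos:.1f} -> resistance {throttle_resistance(pos):.2f}")
```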

“A stick shaker is haptics,” Witwer pointed out. “[Haptics] is just another modality.”

After volunteering to try a demonstration of a touchless eye-tracking and gesture control system, I had to remove my glasses so the system could see my eyeballs. The setup involved using my eyes to make selections and my hands, in the air, to swipe between screens. Manipulating the screens without touching anything was interesting, and I can see how there might be practical cockpit applications. Touchscreen displays, for example, aren’t considered useful for large aircraft where pilots sit fairly far back from the instrument panel. Something like Honeywell’s touchless system could be the solution; or it could be a separate touchscreen controller, similar to what Garmin employs in its G2000 through G5000 avionics.

Indeed, the Honeywell lab was in the midst of testing a touchscreen controller during our visit, and we were able to watch a pilot using the controller in turbulence in Honeywell’s crew-interface motion simulator for a real-world workout. “The pilot is flying a mission using various controls and touch,” Jha explained. “As the pilot is doing that, we use standard experiment design [protocols] to measure how comfortable touch is in turbulence, whether it is tiring and so on. We can program the level of turbulence. At the end of the experiment, it gives us a subjective assessment of pilot fatigue and other factors.

“We don’t want to put touch in for touch’s sake,” he said. “All these devices have to earn their way into the cockpit.” The touchscreen controller will likely be one of the first devices to work its way out of the Honeywell lab as it is destined for Embraer’s second-generation E2 regional jets, slated to enter service in 2018.
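
The measurement loop Jha describes can be pictured as running repeated touch tasks at programmed turbulence levels and summarizing the outcomes. A toy sketch with simulated stand-in measurements; the metrics and their behavior are assumptions, not Honeywell’s data:

```python
# Toy sketch of the turbulence/touch experiment: run repeated touch tasks
# at programmed turbulence levels and summarize the results. The metrics
# and their behavior are simulated stand-ins, not Honeywell's data.

import random

def run_trial(turbulence_level: int) -> dict:
    """Simulate one touch task at a given turbulence level (0 = calm)."""
    touch_error_mm = max(0.0, random.gauss(2.0 + 3.0 * turbulence_level, 1.0))
    task_time_s = max(0.0, random.gauss(4.0 + 1.5 * turbulence_level, 0.5))
    return {"level": turbulence_level, "error_mm": touch_error_mm,
            "time_s": task_time_s}

results = [run_trial(level) for level in range(4) for _ in range(10)]
for level in range(4):
    trials = [r for r in results if r["level"] == level]
    mean_err = sum(r["error_mm"] for r in trials) / len(trials)
    print(f"turbulence {level}: mean touch error {mean_err:.1f} mm")
```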

Honeywell isn’t just trying to serve the needs of today’s traditional two-pilot flight crew but is also exploring future flight decks. “The reason you hear me beating the drum about matching the modality to the function is that some are easy and some make you not want to use the modality,” Witwer said. “The really big challenge and the one we’re going to put a lot of attention on is understanding the mission deeply.” However, he added, “The bigger question is what does the cockpit of the future look like?”

Are two pilots really necessary? What about one pilot in the aircraft and one backing up that pilot and others from the ground? With so many sensors installed in modern aircraft, do pilots even need to sit at the front of the airplane behind a vulnerable windshield, or could they fly from a pilot station embedded elsewhere in the aircraft, perhaps in a secure location with a separate entry door? And what about passenger reaction to having only one pilot on board? In parts of the world where people have never flown before, passengers might not have a bias about how many pilots ought to be in the cockpit, Witwer explained. “They may have an easier time getting over that hurdle. It may start with cargo. There are a lot of ways this is likely going to progress.”

“But it’s not going to happen anytime soon,” Jha concluded.
