Simulation World Gets in Sync
L-3 Link Simulation and Training used a National Research Council of Canada Bell 412 as part of its research into improving fidelity-measuring standards for simulators.

Simulators could see a revolution in the quality of visuals as new ray-tracing technology comes to the fore and replaces rasterized images, and new multi-core chips make it possible to use multiple projectors simultaneously. That was one of the conclusions to emerge from a Royal Aeronautical Society conference in London on flight simulation research in late November. Whether the increased cost of greater “fidelity” is necessary to meet training goals was another matter for discussion.

Most of the computing power in full-flight simulators can now be replicated on desktop trainers at low cost, so “the next advancement is moving up the credits you can get on desktop trainers,” said Prof. David White, L-3 Link Simulation and Training chief scientist. “The market for us is being driven by the bottom end. It’s a consumer market and all devices are now multi-core!” He added that Intel chips are now 8-core and PowerPCs (Motorola) have gone to 12, with “hyperthreading” (simultaneous multithreading that, in his words, “allow[s] more power from the same silicon”) becoming more popular. However, he reminded listeners that Amdahl’s Law says parallel processing is never wholly efficient: “with a 32-core system you may only get a 24x speed-up, but only at 99-percent parallelization.” In fact, he added, “if the efficiency is only 60 percent, you can only get two times the speed from 32 cores.”
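White’s figures are consistent with Amdahl’s Law, which caps the speed-up available from N cores according to the fraction of the workload that can actually run in parallel. A minimal sketch of the arithmetic (illustrative only, not taken from the presentation):

    # Amdahl's Law: speed-up from n cores when a fraction p of the work parallelizes.
    def amdahl_speedup(n_cores: int, p: float) -> float:
        return 1.0 / ((1.0 - p) + p / n_cores)

    # White's examples: 32 cores at 99 percent parallelization give roughly 24x,
    # while 60 percent parallelization yields only about 2.4x.
    print(amdahl_speedup(32, 0.99))  # ~24.4
    print(amdahl_speedup(32, 0.60))  # ~2.4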

Thus very fast chips are becoming available for simulation, often “systems on a chip” (SoCs). Processing power alone is not enough, he suggested; what is needed is on-chip supervision software that manages kernel time so the parallel cores are used efficiently. He showed a slide illustrating an electronic flight bag (EFB) application and warned that supervision software written for one particular chip becomes a liability when that chip goes obsolete, leaving training devices that cannot be upgraded.

Right now, White said, system designers are addressing the major disadvantage of asymmetric multiprocessing SoCs–that software in one core can “trample” over the memory of another–by adding supervision software, or a “hypervisor layer allocating rights between the kernels and providing security.” This would allow more stable running of the SoC and could see, for example, “Linux running on one core and ARINC 653 on another–or any operating system, such as VxWorks,” said White. “So for FSTD [flight simulation training device] architectures you could get much higher performance protected from other people [such as third-party vendors] changing software, and with networking over standard protocols.” However, this does mean that “users and maintainers have to be more IT aware” to get the best out of their systems. “Multicore SoCs are the future but they have to be architected,” he said. “You can’t just chuck the software in there…and people have to know how to code to get the power.”

Realistic Views with Ray Tracing

The advances in chip technology make possible a move toward ray tracing because, as Simon Skinner, managing director of XPI Simulation, pointed out, parallelization can be “99.99 percent” efficient for this application. Movie graphics already use ray tracing, which delivers realistic shadows, reflections and refraction because the image is computed by simulating how light actually propagates through the scene, rather than by drawing on vast image databases (which also slow rendering down as more detail is demanded).
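Ray tracing parallelizes so well because every pixel’s ray can be traced independently of every other pixel, leaving almost no serial bookkeeping to drag the Amdahl fraction down. A minimal illustrative sketch (the trace_pixel function here is a stand-in for a real renderer, not XPI’s software):

    from multiprocessing import Pool

    WIDTH, HEIGHT = 320, 240

    def trace_pixel(xy):
        # Stand-in for a real ray tracer: each pixel is computed purely from its
        # own coordinates and the read-only scene, with no dependence on other pixels.
        x, y = xy
        return (x ^ y) & 0xFF  # placeholder shading value

    if __name__ == "__main__":
        pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
        with Pool() as pool:                 # the work spreads across all available cores
            image = pool.map(trace_pixel, pixels)
        print(len(image), "pixels rendered")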

However, ray tracing is processor intensive and movie frames (CGI graphics) take days to render; bringing it to real-time simulation means generating each image in 10 to 12 milliseconds, and the fastest systems at the moment tend to be three times slower than that, said Skinner, whose company is working on a UK military project called SERTE (with Lockheed Martin and CAE). The chip being used, a Kepler GK110 on an NVIDIA card, has seven billion transistors, 1,536 cores and 6 GB of memory, delivering 1.5 to 4.0 teraflops. “But we want 10-30 teraflops,” he said, adding that in practice an effective 1,200 to 1,300 of those cores can be used. This is the power needed to do movie CGI in real time, and the next-generation chip, code-named “Maxwell,” will come closer. Speeds are increasing threefold every couple of years.

For now, the systems are being pioneered for simulation of the land battle, with realistic fire and atmospheric effects such as fog. “There is too much detail to use this technology in [flight simulation] at the moment. The systems just fall over,” said Skinner. The direct rendering is equivalent to rasterizing 3.6 billion polygons per second, but the challenge for flight simulation is to sustain that at 60 frames per second for good fidelity.
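The gap Skinner describes is easy to put in numbers. A back-of-the-envelope check of the figures quoted above (my arithmetic, not his slides):

    # Frame budget at 60 fps versus the ray-tracing times Skinner quotes.
    target_frame_ms = 1000.0 / 60              # ~16.7 ms available per frame at 60 fps
    goal_ms = (10, 12)                         # image must be ready in 10-12 ms to leave headroom
    current_ms = tuple(3 * t for t in goal_ms) # "three times slower" -> 30-36 ms, i.e. ~28-33 fps

    # Equivalent rasterizing throughput of 3.6 billion polygons/second at 60 fps:
    polygons_per_frame = 3.6e9 / 60            # = 60 million polygons in every frame
    print(target_frame_ms, current_ms, polygons_per_frame)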

Nonetheless, Doug Traill, senior solutions architect with NVIDIA, said that next-generation graphics are incorporating ray tracing. The world’s fastest computer, “Titan,” at Oak Ridge National Laboratory, Tenn., is used to simulate real-world physics, he said, using 18,688 NVIDIA Tesla GPUs. “This week 18 Barco projectors are being installed at Oak Ridge so they can visualize Titan’s output, [yet] the code they run on it can run on a PC; the code is portable.” Also, the Silicon Graphics-developed OpenGL graphics standard being used is 20 years old and widely available. Using ray tracing, the simulator will draw more pixels every two seconds than there are people on Earth.

Flight simulators need to run at 60 frames per second minimum, but Traill suggested that this will probably head toward 120 fps over the next five years. The way to get there, he said, is to move away from the traditional image databases and use ray tracing to model real physics.


How Much Fidelity Does Flight Training Need?

Owen Wynn, technical program manager with Rockwell Collins, asked whether all this “realism” is necessary for flight training, and suggested that what is really needed, especially in general aviation, is more training devices at lower cost. Commercial off-the-shelf (COTS) technology is becoming the de facto way to go, he said, because of the pace of change and because it is economically compelling to piggyback on the gaming and other consumer industries. His message was that “simulation is a small industry that can no longer justify the cost of independent R&D. We’re becoming a cost-based industry as COTS is the main driver now,” with through-life costs becoming increasingly important. By way of example, he listed some digital cinema projectors now available to simulator builders.

During a lively Q&A session, one delegate pointed out that “COTS may be driving down the cost of simulators but the cost of data is rocketing.” Wynn commented that a level-D full-flight simulator should cost $100,000, not $20 million. Another delegate noted that the industry tends to accept the current price point because the cost of an accident is so high. Even so, there is still a cost ratio of perhaps 40:1 between training in an airliner and training in a simulator.

Dr. Sunjoo Advani, president of IDT Engineering, said that the top three “wants” in simulation and training are validating fidelity requirements, measuring training effectiveness and upset recovery training. Real aircraft have to be used for upset recovery training, he suggested, as the realism is just not there in simulators, and extending the flight models accurately beyond the normal envelope would be prohibitively expensive (to say nothing of the real problem of finding instructors with the necessary experience).

White and his team at the University of Liverpool conducted a fidelity assessment of flight simulators in an effort to improve the standards by which fidelity is measured. After work that included flight-testing a Bell 412 with several test pilots from Canada’s National Research Council and comparing the results with the simulator in Liverpool, the team proposed a new fidelity rating scale for helicopters. In subsequent questioning, White made it clear that getting consistent subjective feedback from test pilots for such comparisons is almost impossible, which is one of the main reasons for the drive to establish objective standards and avoid repeated assessments. Work is ongoing through the European Garteur research framework, involving AgustaWestland, CAE and various research establishments.

Real-World Feel in the Sim

While visual systems are heading for a revolution, for the pilot there is much more to “fidelity”: the motion system has to be realistic, too, and in sync with the visuals. Back in the world of current-production simulators, Dr. Robert Armstrong, senior engineer with L-3 Link, described the push to create a new simulator qualification methodology initiated by ICAO 9625 Edition 3, the qualification standard now followed by regulators. This is no simple task, since motion cueing has to create the most realistic simulation possible from a six-degrees-of-freedom Stewart platform with limited travel in translation and rotation. Sustained accelerations, for example, are simulated by tilting the cab so that a component of gravity is felt as a longitudinal force, but the tilt must be applied slowly enough that the pilot does not sense the rotation, which is why rejected takeoffs, with their large and sudden deceleration, are the most challenging events to replicate.
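Motion-cueing engineers call this technique tilt coordination. A simplified sketch of the idea, assuming a typical rotational perception threshold from the motion-cueing literature rather than any figure quoted at the conference:

    import math

    G = 9.81                       # m/s^2
    TILT_RATE_LIMIT = 0.05         # rad/s, roughly the pilot's rotational perception threshold (assumed)

    def tilt_angle_for(accel_x: float) -> float:
        """Pitch angle whose gravity component reproduces a sustained longitudinal acceleration."""
        return math.asin(max(-1.0, min(1.0, accel_x / G)))

    # Simulating a sustained 0.3 g deceleration needs about 17.5 degrees of pitch-down...
    angle = tilt_angle_for(-0.3 * G)
    # ...but reaching that angle below the perception threshold takes several seconds,
    # far longer than the onset of a real rejected takeoff.
    time_to_reach = abs(angle) / TILT_RATE_LIMIT
    print(math.degrees(angle), time_to_reach)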

The new objective motion cueing test should allow consistent, repeatable evaluation of the simulator’s entire motion-drive algorithm, giving engineers an objective basis for comparison, although Armstrong acknowledged that subjective input from pilots will still be needed.

Jim Takats, managing director of Opinicus, who led the Motion Task Team of the RAeS International Working Group that created ICAO 9625 Edition 3, said the purpose of the effort is to address the fact that “current regulations are increasingly out of step” with modern simulators. Ruud Hosman of AMS Consult, who was also involved in the task team, explained, “We needed an objective motion cueing test. Since the introduction of six-degrees-of-freedom motion systems, subjective tuning has been used, [which is] not consistently reproducible or reliable. So this test supports MDA [motion-drive algorithm] adjustment. The test was published in 9625 Edition 3 so we can see what the state of the art in motion is from the manufacturers.”
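The article does not spell out the test procedure, but an objective motion cueing test of this kind is essentially a frequency-response measurement: a sinusoidal acceleration command is fed through the motion-drive algorithm and the reproduced motion is compared with it in gain and phase. A hedged sketch of that comparison, with signal names and numbers that are illustrative rather than the 9625 test points:

    import numpy as np

    def gain_and_phase(command, response, freq_hz, sample_rate):
        """Gain and phase of `response` relative to `command` at the test frequency."""
        t = np.arange(len(command)) / sample_rate
        basis = np.exp(-2j * np.pi * freq_hz * t)   # single-frequency Fourier correlation
        ratio = np.dot(response, basis) / np.dot(command, basis)
        return abs(ratio), np.degrees(np.angle(ratio))

    # Illustrative use: a 0.5 Hz surge command and a lagging, attenuated response.
    fs, f = 100.0, 0.5
    t = np.arange(0, 20, 1 / fs)
    cmd = np.sin(2 * np.pi * f * t)
    resp = 0.7 * np.sin(2 * np.pi * f * t - np.radians(25))
    print(gain_and_phase(cmd, resp, f, fs))   # ~ (0.7, -25.0)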

Hosman explained that some results are in–“We are just waiting for the results from two labs and one simulator manufacturer”–but already it appears that “all the partners are saying that it is a valuable and powerful test that will remove the reliance on only subjective judgment.” It will effectively give guidance for tuning that will make the simulator much more like the real thing even before a pilot has tried it.

Takats concluded, “We spend a lot of time tuning simulators, and manufacturers need to prove to regulators that results are the same between devices–so we won’t need to re-tune simulators all the time. We do the first one, and then the rest are the same–like aircraft.” The culmination of this almost five-year effort could result in a real advance for producers and operators of simulators.