Avionics Digital Edition

Avionics Supercomputers of the Future

Three different types of supercomputers, along with machine learning, will impact the next generation of commercial and military aircraft.

Photo courtesy of GE Aviation

The advancement of commercial off-the-shelf (COTS) embedded technologies has created supercomputers with greater processing power, higher data rates and faster speeds in packages with better size, weight and power (SWAP) characteristics. Compared with traditional 19-inch rackmount systems, the integrated approach delivers more powerful functions in a much smaller package.

The ROCK-2 system from Mercury Systems, designed for avionics missions, is a good example. It measures (without connectors) just 4.88 by 7.63 by up to 13.18 inches, yet it accepts multiple plug-in boards: CPU, video capture, Ethernet switch and storage. The chassis weighs 4 to 8 kg, depending on how many boards are installed.

Avionics already supports some fully autonomous flying, but only on a limited basis. As ever more powerful embedded computing technologies are developed, fully autonomous commercial flights, powered by advanced supercomputers, are expected to follow.

“Commercial aircraft are already highly autonomous vehicles, having included autopilot functions for decades,” said Robert Atkinson, director of business development for Mercury Systems’ mission systems group. “We expect that most commercial aircraft will have fully autonomous capability from takeoff to landing, including sense and avoid functions, within the next 10 to 20 years.”

But public perception and regulatory hurdles will keep flights from going completely pilotless in that timeframe, he added.

Areas on a plane that need monitoring. Devices such as the air data computer module (ADCM) can provide real-time monitoring data.

Mercury Systems, a pioneer in sensor processing and embedded computing for the aerospace and defense industry, provides modules, single board computers (SBC) and fully integrated sensor and mission processing subsystems. With the acquisition of parts of Microsemi and other companies, Mercury now also includes components in its product portfolio. Additionally, its product road map aligns with the DO-297 standard for integrated modular avionics (IMA), which focuses on integrating multiple functions into fewer computing units. Its recent introduction of the BuiltSAFE ROCK-2 series of avionics computers is one example of a modern supercomputer supporting the digital convergence the aviation industry is going through. (Much as an iPhone packages a phone, camera, GPS and more in a single unit.)

The rugged, modular subsystem is designed for use in a variety of avionics and display applications, including command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR), and is available with industry-standard DO-178C/DO-254 safety-certification evidence. Based on the industry-standard 3U VPX architecture, the ROCK-2 system can host up to four boards, allowing for customization to the user's unique I/O and graphics requirements.

Multiple avionics I/O interfaces and advanced video and graphics processing boards can be housed on the ROCK-2 system. Mercury's innovative design of placing CPUs on the mezzanine site of the 3U VPX boards allows for up to four CPUs to be used. Placing the CPUs on the XMC site simplifies CPU upgrades, minimizing recertification and total lifecycle system costs. Mercury currently supports a variety of NXP QorIQ and Intel Core i7 processors. The sealed, forced-air, conduction-cooled chassis (-40 to 70 deg C operating range) has no moving parts, such as cooling fans. As a result, the design offers higher reliability and a major improvement in SWAP.

Image Processing and Real-Time Data Capturing

California-based Aitech Defense Systems, Inc. has worked with multiple software partners, including Green Hills, LynuxWorks and Wind River, to provide embedded computing to the industrial, aerospace and defense segments for more than two decades. Both commercial and military planes now require faster image processing as a precursor to the development of machine vision, in hopes that planes will soon be able to recognize images of nearby flying objects to increase safety and collision avoidance.

Based on NVIDIA's Jetson TX2 system-on-module (SoM), which carries a rugged general-purpose graphics processing unit (GPGPU), Aitech's A176 Cyclone is capable of machine learning using the software library furnished by NVIDIA. The new design doubles the performance of the previous version. The unit measures 20 cubic inches (25 cubic inches with the 1 terabyte of solid-state memory), provides 1 teraflop of parallel processing power and supports multiple hardware I/O and software options, including MIL-STD-1553, ARINC 429 and a Camera Link frame grabber.

The BuiltSAFE AVIO-2353 is an avionics communications I/O interface board that supports multiple I/Os such as MIL-STD-1553.

Additionally, the A176 Cyclone includes internal microSD storage to perform data convolutions and transpositions, image and data manipulation, and image and frame object-edge detection and recognition. This small form factor (SFF), high-performance embedded computing (HPEC) design is fanless and conduction-cooled and consumes less than 10 watts. The unit demonstrates how processor performance has improved alongside high-density memory and image-processing capability.
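The object-edge detection mentioned above is, at its core, a 2-D convolution over image pixels. As a rough illustration (pure NumPy, with a standard Sobel kernel and a synthetic test frame — illustrative assumptions, not code from the A176 software stack), a gradient-magnitude edge detector can be sketched as:

```python
import numpy as np

def convolve2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2-D convolution of a grayscale image with a small kernel."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    flipped = kernel[::-1, ::-1]  # convolution flips the kernel
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * flipped)
    return out

def sobel_edges(image: np.ndarray) -> np.ndarray:
    """Gradient magnitude from horizontal and vertical Sobel kernels."""
    gx = convolve2d(image, np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]))
    gy = convolve2d(image, np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]]))
    return np.hypot(gx, gy)

# A synthetic 8x8 frame: dark left half, bright right half.
frame = np.zeros((8, 8))
frame[:, 4:] = 1.0
edges = sobel_edges(frame)
# Nonzero responses appear only along the vertical intensity boundary.
```

A production system would run the same convolution on the GPGPU rather than in Python loops, but the arithmetic is identical.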

Paired with the Jetson's machine-learning capability and an HD camera, the A176 Cyclone can capture images in real time for unmanned aircraft systems. In the future, these units could also be networked together to form a neural network, providing even more supercomputing power.

“Embedded technology is empowering the future development of Avionics. Future machine vision will be much more powerful than what is available today,” said Doug Patterson, VP of the military and aerospace business sector at Aitech. “For example, wide-angle and narrow-angle cameras will be equipped with artificial intelligence to detect moving objects in the field, such as a person crossing the field, and the resolution of the HD-camera-captured pictures can be adjusted down to the pixel level. Additionally, the detection can all be done automatically without human intervention.”
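One of the simplest building blocks for the kind of moving-object detection Patterson describes is frame differencing: compare consecutive camera frames and flag the pixels that changed. The threshold and synthetic frames below are illustrative assumptions for a sketch, not Aitech's algorithm:

```python
import numpy as np

def detect_motion(prev_frame: np.ndarray, curr_frame: np.ndarray,
                  threshold: float = 0.2) -> np.ndarray:
    """Boolean mask of pixels whose intensity changed by more than threshold."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    return diff > threshold

def motion_bounding_box(mask: np.ndarray):
    """Smallest (row_min, col_min, row_max, col_max) box enclosing all
    changed pixels, or None if nothing moved."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (ys.min(), xs.min(), ys.max(), xs.max())

# Two synthetic 10x10 frames: a bright object appears in the second one.
prev = np.zeros((10, 10))
curr = prev.copy()
curr[3:6, 4:7] = 1.0
box = motion_bounding_box(detect_motion(prev, curr))
# box == (3, 4, 5, 6): the region where the object appeared
```

Real systems add stabilization, noise filtering and a classifier on top, but the detect-then-localize pattern is the same.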

Open Flight Deck Enables Future Upgrades

The Open Flight Deck architecture defines the standards and interfaces through which functional apps can be developed and deployed on aircraft such as the Boeing 787 and the Gulfstream G500/600.

The first project, called the Common Core System (CCS), was pioneered on the Boeing 787. Within this architecture, various suppliers can provide plug-in subsystems delivering different functions. GE Aviation continues to develop new modules to support the architecture, including deep learning using graphics processing units (GPUs) or field programmable gate arrays (FPGAs). This plug-and-play approach will continue to improve SWAP and make upgrades more cost-effective.

“In this next phase, we are taking the open platform approach to the avionics system that we pioneered on the Boeing 787 and applying this to the flight deck,” said Alan Caslavka, president of avionics for GE Aviation. “Open Flight Deck will deliver order-of-magnitude reductions in the cost of change by enabling regular upgrades of flight deck applications.”

The Importance of Software

Wind River pointed out a unique convergence between automotive and aircraft systems. Because of its safety requirements, the aviation industry has paved the way for the autonomous driving industry. But aircraft and avionics platforms lack the volumes to independently drive substantial innovation and investment in the massive computer systems required to deploy fully autonomous systems with proven safety and security credentials. So, moving forward, both industries can potentially share knowledge about innovations.

Automakers and technology companies are working hard to develop fully autonomous driving. This requires artificial intelligence, sensors like LIDAR, radar and cameras, and vehicle-to-vehicle communication working together.

It is hoped that the aviation industry will implement similar innovation for planes to detect and communicate with each other and with airports.

“Learning from the autonomous driving industry, avionics will be implementing machine-learning with better sensors such as LIDAR and cameras to get the early warning of nearby objects such as birds approaching,” said Chip Downing, director of aerospace and defense business at Wind River. Downing said those solutions will come from partnerships with hardware and software solution providers.
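The early-warning function Downing describes can be reduced to a toy range-gating check over successive LIDAR sweeps: flag returns that are both close and closing. All thresholds and the per-beam data model below are illustrative assumptions, not Wind River's implementation:

```python
import numpy as np

def early_warning(prev_ranges, curr_ranges, dt: float,
                  warn_range: float = 500.0,
                  warn_closing: float = 20.0) -> np.ndarray:
    """Flag beams whose return is both near and approaching.

    prev_ranges/curr_ranges: per-beam distances in meters from two sweeps.
    dt: time between sweeps in seconds, so closing speed is in m/s.
    warn_range/warn_closing are illustrative thresholds, not from any
    avionics standard.
    """
    prev = np.asarray(prev_ranges, dtype=float)
    curr = np.asarray(curr_ranges, dtype=float)
    closing_speed = (prev - curr) / dt  # positive when the object approaches
    return (curr < warn_range) & (closing_speed > warn_closing)

# Three beams: far and static, near and approaching, near but receding.
alerts = early_warning([2000.0, 450.0, 300.0],
                       [2000.0, 420.0, 310.0], dt=1.0)
# only the near, approaching return trips the alert
```

A machine-learning layer would classify *what* is approaching (a bird versus a drone), but a deterministic gate like this typically remains as the certifiable safety net.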

An Autonomous Future?

We are witnessing the convergence of automotive and aircraft systems, and avionics has much to gain from the development of autonomous driving. Future avionics supercomputers are expected to combine artificial intelligence functions with sensor support in preparation for fully autonomous flight. AVS