Future Military AI/ML: Multi-Plane Collaboration, Precision Targeting, Full Autonomy

As governments and the defense industry project the expansion of artificial intelligence and machine learning (AI/ML) into new use areas over the next decade, future military applications may include multi-aircraft collaboration, precision targeting, and fully autonomous operations in denied communications environments.


“The future for the U.S. and our allies requires moving to a connected battlespace, where information flows between entities and across all domains,” George Hellstern, program manager for artificial intelligence and autonomy at Lockheed Martin Skunk Works in Palmdale, Calif., wrote in an email to Avionics International. “This will create an abundance of data that has to be managed and processed before decisions can be made. AI can sift through the vast amounts of information providing the essentials to pilots and commanders, allowing them to make faster, more informed decisions.”

Last May, Skunk Works and the U.S. Air Force Test Pilot School at Edwards AFB, Calif., demonstrated an autonomous intelligence, surveillance, and reconnaissance (ISR) system designed to work in anti-access/area-denial environments, in which adversaries are likely to mount communications denial attacks on U.S. and allied forces.

An F-16 Fighting Falcon flown by Maj. Jacob Schonig of the 416th Flight Test Squadron at Edwards Air Force Base, California, conducts a captive-carry flight test with a Gray Wolf cruise missile prototype over the Pacific Ocean, June 9, 2020. (Air Force photo by Ethan Wagner)

The autonomous ISR system, integrated on a Lockheed Martin-developed pod on an F-16 fighter, detected and identified the location of the target, automatically routed the aircraft to the target, and provided imagery to confirm the target in a simulated, denied communications environment.
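The detect-route-confirm sequence described above can be sketched as a simple onboard state machine. This is an illustrative sketch only, not Lockheed Martin's implementation; the stage names and inputs are hypothetical, chosen to show how each step can advance using purely local data when no datalink is available.

```python
from enum import Enum, auto

# Hypothetical stages of a denied-communications ISR pass: each transition
# relies only on onboard sensing, since no off-board link is assumed.
class Stage(Enum):
    SEARCH = auto()
    DETECTED = auto()
    ROUTING = auto()
    CONFIRMING = auto()
    DONE = auto()

def isr_step(stage, sensor_hit, at_target, image_ok):
    """Advance the pipeline one step from purely local inputs."""
    if stage is Stage.SEARCH and sensor_hit:
        return Stage.DETECTED        # target detected and geolocated
    if stage is Stage.DETECTED:
        return Stage.ROUTING         # route the aircraft to the target
    if stage is Stage.ROUTING and at_target:
        return Stage.CONFIRMING      # collect confirmation imagery
    if stage is Stage.CONFIRMING and image_ok:
        return Stage.DONE            # imagery retained for later exfiltration
    return stage                     # otherwise hold the current stage

# Walk the pipeline end to end with simulated inputs.
stage = Stage.SEARCH
for hit, at_tgt, img in [(True, False, False), (False, False, False),
                         (False, True, False), (False, False, True)]:
    stage = isr_step(stage, hit, at_tgt, img)
```

The point of the structure is that no stage waits on an external command, which is what "denied communications" forces on the design.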

“As the battlespace becomes increasingly contested, human-machine teams will enable operators to collect critical intelligence in denied communications environments, ensuring our warfighters get information they need when they need it,” per Hellstern.

Skunk Works is also developing a missile-avoidance system that is to be able to pinpoint which aircraft in a formation is the target of an attack and which evasive actions are needed—features that now require human interpretation of several data displays.

“Our approach to implement autonomy and AI is to be a capability multiplier,” Hellstern wrote. “We want to assemble the best team for the mission, and we believe that involves combining the strengths of humans and machines. Some tasks machines perform better, such as sorting through large amounts of data for specific information in a limited amount of time, while the human may know the best way to use the data once it is compiled.”

In February, General Atomics Aeronautical Systems, Inc. (GA-ASI) flew a company-owned MQ-9A Block 5 Remotely Piloted Aircraft (RPA) equipped with a newly developed Centerline Avionics Bay (CAB), designed to save space for new AI/ML-enabled, open architecture avionics, such as a company-developed detect and avoid system.

GA-ASI said that it built CAB to provide additional volume, platform infrastructure, and cooling provisions for integrating high-performance computing systems on MQ-9 Block 1 and Block 5 RPA. The CAB is to permit the MQ-9 to host government Open Mission Systems (OMS)-compliant AI/ML and eventually AI algorithms and applications.

The company-owned MQ-9A Block 5 Remotely Piloted Aircraft equipped with the newly developed Centerline Avionics Bay (CAB), shown here. (Photo: General Atomics Aeronautical Systems, Inc.)

CAB capabilities are to become “the catalyst” for the DoD Joint All-Domain Command and Control (JADC2) initiative and to be a part of the Air National Guard’s Ghost Reaper concept in which the MQ-9A is to help correlate multi-source data in contested environments, per GA-ASI.

Under JADC2, all U.S. military service sensors will connect over one network.

GA-ASI is taking “a two-pronged approach to incorporate AI/ML into our platforms, both current and future,” according to Darren Moe, GA-ASI’s senior director of automation, autonomy, and artificial intelligence.

“First, we are implementing our on-board open architecture that enables us to integrate best-of-breed AI/ML mission-critical capabilities,” he wrote in an email. “This includes capabilities such as mission planning, payload control, and target classification. Secondly, we are implementing the necessary flight-critical protections to ensure that our aircraft always flies safely regardless of what the mission-critical AI/ML directs. By definition, AI/ML is non-deterministic, and it is paramount that we maintain the high level of safety that our platforms embody today.”
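The flight-critical protection Moe describes can be thought of as a deterministic gate between the mission-level AI and flight control. The sketch below is hypothetical and is not GA-ASI's design; the envelope limits are invented for illustration. The idea is that whatever a non-deterministic planner proposes, the commands reaching flight control are always bounded.

```python
# Illustrative deterministic safety gate around AI-proposed commands.
# Envelope limits here are invented for the example.
SAFE_BANK_DEG = 60.0
MIN_ALT_FT, MAX_ALT_FT = 1_000.0, 45_000.0

def gate_command(cmd: dict) -> dict:
    """Clamp an AI-proposed command into the certified flight envelope."""
    def clamp(x, lo, hi):
        return max(lo, min(hi, x))
    return {
        "bank_deg": clamp(cmd.get("bank_deg", 0.0), -SAFE_BANK_DEG, SAFE_BANK_DEG),
        "alt_ft": clamp(cmd.get("alt_ft", MIN_ALT_FT), MIN_ALT_FT, MAX_ALT_FT),
    }

# The planner may emit anything; the gate's output is always in-envelope.
safe = gate_command({"bank_deg": 95.0, "alt_ft": 120.0})
```

Because the gate itself is simple and deterministic, it can be verified to aviation safety standards even when the planner behind it cannot.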

Moe said that GA-ASI predicts AI/ML will expand into multi-aircraft collaborative operations, precision targeting, and fully autonomous operations in communications-denied environments, as the U.S. and allies gain confidence in AI/ML single aircraft ISR missions.

Last November, GA-ASI received a $93.3 million AI/ML smart sensor contract from DoD’s Joint Artificial Intelligence Center, created in 2018. GA-ASI said it is to use smart sensor technology with a variety of company systems on the MQ-9 Reaper, including the Reaper Defense Electronic Support System (RDESS), the Lynx® Synthetic Aperture Radar (SAR), and Metis, for directing MQ-9 ISR missions and sharing actionable intelligence updates.

Boeing Phantom Works in St. Louis has been working on using AI/ML to enable autonomous collaboration under the company’s Airpower Teaming System (ATS) effort.

Last December, Boeing finished flight tests of five 11-foot autonomous drones at the Queensland Flight Test Range in Cloncurry, Australia, and Boeing said that the testing featured data link capabilities that allowed the five aircraft to communicate with one another while flying at speeds of up to 167 miles per hour. The technology is to become part of Boeing ATS and other company autonomous aircraft initiatives.

In February, the Royal Australian Air Force flew a prototype of Boeing’s Loyal Wingman drone, which follows the ATS design and is a candidate for the U.S. Air Force’s Skyborg low-cost attritable demonstrator, one of three Air Force Vanguard programs intended to speed the fielding of advanced technologies.

“We see AI-powered teaming as a critical enabler in a future where allied customers around the world are all facing a similar challenge – how to generate affordable mass to counter growing adversaries and threats,” Boeing said. “But the challenge is not just matching the pace of technological development, rather leaping ahead to provide operational capability quickly. Allies need systems with the right level of AI so that uncrewed teamed aircraft are capable of adjusting to the realities of the operational area to complete select tasks and missions with minimal crewed input. That is when teaming aircraft can be a true force multiplier for pilots in crewed aircraft, who are serving as the team quarterback and focusing on the big-picture mission.”

ATS “uses artificial intelligence to be that force multiplier, offering a capable and affordable aircraft that can be missionized by each customer and tailored to their unique and specific defense needs, from communications to sensor suites and more,” per Boeing.

The U.S. Air Force may embark on an effort to field cognitive AI for the F-15 fighter’s electronic warfare (EW) suite. In March, the Air Force Lifecycle Management Center (AFLCMC) at Wright-Patterson Air Force Base, Ohio, issued a request for information for the AFLCMC Cognitive EW project.

For now, the effort involves BAE Systems’ Eagle Passive Active Warning Survivability System (EPAWSS).

“The F-15 EPAWSS program is working to evaluate and potentially incorporate small elements of cognitive ability to build flexibility into the system and enable appropriate characterization-of and responses-to various emitters,” AFLCMC wrote in an email. “We recognize that future radars will have the ability to rapidly change and employ waveforms and patterns not previously used, so we are asking for ideas to address that issue. AI will augment our existing systems and help automate processes to analyze unfamiliar signals, determine potential sources, and employ adequate countermeasures when needed.”

While there is no funding in the EPAWSS program for cognitive EW, the RFI may lead to a short study to research cognitive EW for EPAWSS, and AFLCMC may then decide to add cognitive EW to the EPAWSS program in the next several years.

Dan Harrison, BAE Systems’ electronic combat solutions cognitive EW lead, wrote in an email to Avionics International that the company “has been a pioneer in cognitive EW for over 10 years.”

“Through government contracts and internal investment, BAE Systems has developed a cognitive EW capability that allows tactical platforms to sense, understand, and react to dynamic changes in the RF environment with unparalleled speed and efficiency,” he wrote. “These advanced systems understand, characterize, prioritize, and react to changes in the red-force integrated air defense systems in real-time. Additionally, cognitive EW systems bring back relevant observations of the IADS [integrated air defense system] to learn from every mission and continually improve.”

The U.S. Air Force said that a flight of the Lockheed Martin U-2 Dragon Lady reconnaissance aircraft last December 15 marked the first time that AI has commanded a military system, as the AI algorithm helped to navigate the plane and steer the U-2’s Raytheon-built Advanced Synthetic Aperture Radar System-2A (ASARS-2A) to search for simulated enemy missile launchers.

This flight of the U-2 with an AI co-pilot marks a major leap forward for national defense as artificial intelligence took flight aboard a military aircraft for the first time in the history of the Department of Defense. (U.S. Air Force)

Booz Allen Hamilton and Air Combat Command’s (ACC) U-2 Federal Laboratory researchers at Beale AFB, Calif., helped develop the U-2’s ARTUµ AI algorithm.

Justin Neroda, a vice president at Booz Allen and a leader in the company’s Strategic Innovation Group that focuses on AI/ML and advanced analytics, said that Booz Allen is trying to advance “not just the algorithms, but how those algorithms are developed, updated, and deployed to aircraft.”

“There are many roles that AI can fill as the ‘AI Copilot,’” per Neroda. “Some of those that we continue to develop include algorithms to support predictive maintenance…We have also developed a range of algorithms to support change detection for humanitarian disaster relief. These algorithms compare historical imagery of an area to evaluate the extent of damage from the baseline to support resource allocation to respond to these natural disasters. In addition, using the improved processes along with AI algorithms, more proactive sensor management is executed to make decisions and dynamically replan sensor tasking based on the results of on-board algorithms supporting object detection. Traditionally, this would require sensor collection sending the data down to a processing node, then getting an answer back to do dynamic re-planning of the mission.”
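The change-detection idea Neroda describes, comparing baseline imagery with new imagery to score the extent of damage, reduces to a per-cell comparison at its simplest. The sketch below is illustrative only, not Booz Allen's algorithm; the grids and threshold are invented.

```python
# Toy change detection: fraction of image grid cells whose intensity
# changed by more than a threshold between baseline and current imagery.
def change_fraction(baseline, current, threshold=30):
    """Return the fraction of cells differing by more than `threshold`."""
    changed = total = 0
    for row_b, row_c in zip(baseline, current):
        for b, c in zip(row_b, row_c):
            total += 1
            if abs(b - c) > threshold:
                changed += 1
    return changed / total

before = [[100, 100], [100, 100]]   # baseline intensities
after_ = [[100, 180], [100, 100]]   # one cell changed, e.g. new debris
damage = change_fraction(before, after_)  # 0.25 of the area changed
```

Real systems work on registered satellite or aerial imagery with learned features rather than raw intensities, but the output is the same kind of per-area change score used to prioritize relief resources.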

While the military now applies AI only to narrow use cases to aid human operators, AI maturity “continues to increase with increased development to broaden the applications that AI can support in the future,” Neroda said. “These abilities and limitations will be continually evaluated to determine expanded decision-making processes they can support.”
