The trend toward digital, programmable radio frequency (RF) equipment — epitomized by software-defined radio — means that radars can quickly change waveforms, creating unique signatures on the fly. In the increasingly congested and contested RF environment, hostile emitters become harder to locate, identify, jam and confuse. Hence today’s focus on machine learning applied to electronic warfare (EW) — or cognitive EW.
An important step along that path is improved spectrum awareness, one of the aims of the United States Defense Advanced Research Projects Agency’s (DARPA) RF Machine Learning Systems (RFMLS) program. The program will lay the groundwork for “a new generation of RF systems that are goal-driven and can learn from data,” according to DARPA. It is one of multiple programs that address the RF/machine learning nexus. Contracts for the program were recently awarded to BAE Systems, Expedition Technologies, Northeastern University, Teledyne Technologies and SRI.
When you can create “myriads of signals at any frequency in the RF spectrum,” it’s important to ask “what radio signals are actually occupying the set of frequencies in my immediate vicinity,” said the program’s manager, Paul Tilghman. The program is a “foundational” effort, Tilghman said. It’s building a technology base that would answer lots of questions, among which are how to improve EW and radar systems.
How to better understand the RF signal environment is the program’s “broad, high-level question,” he said. To get there, “to make sense of the spectrum data,” DARPA plans to develop fundamental algorithms and techniques that apply machine learning to the RF spectrum.
At a high level, DARPA is pursuing RF signal awareness as a means of expanding the capacity of the finite spectrum resource through improved spectrum sharing. “Systems trying to access the same block of spectrum at the same time, for example, might be able to negotiate over the time sequence,” said Chris Rappa, product line director for RF, EW and advanced electronics with BAE Systems’ FAST Labs research and development organization. Systems use spectrum to communicate, navigate, position, surveil and sense. “EW is only a subset of that spectrum negotiation piece,” said Rappa.
Spectrum awareness is also important as more radios, communications systems, radars, jammers and many other applications, including internet-of-things devices, operate in the spectrum and as hostile emitters become more clever at camouflaging their signatures to look like “white force” or neutral emitters. EW systems need to be able to infer the intent, friendly or not, of others sharing the spectrum.
Machine Learning Test Run
DARPA has done some initial studies on “sample problem sets that are somewhat more simple in nature,” Tilghman said. In one effort, researchers built a convolutional neural network to understand what modulation a signal was using — AM, FM or phase-shift keying, for instance.
Those studies showed that the machine-learning system outperformed traditional approaches at every signal-to-noise ratio, Tilghman said. So even though that problem was relatively small in scope, it provided “enough evidence to go, ‘Oh, Wow!’” Tilghman added it proved that machine-learning systems “can abstract additional features and information out of the RF spectrum to help us better understand [the signal environment].”
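The pipeline behind such a study can be sketched in miniature. The snippet below, a toy illustration rather than DARPA’s actual model, generates synthetic AM, FM and phase-shift-keyed bursts and passes the raw I/Q samples through a single untrained 1-D convolutional layer with ReLU and global max-pooling, which is the kind of front end a modulation-classifying convolutional network uses. All sample rates, tone frequencies and filter sizes are invented for illustration.

```python
import numpy as np

def synth_signal(kind, n=1024, fs=1e6, seed=0):
    """Generate a toy complex-baseband burst of the given modulation type."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) / fs
    msg = np.cos(2 * np.pi * 5e3 * t)                 # low-rate message tone
    if kind == "AM":
        sig = (1 + 0.5 * msg) * np.exp(2j * np.pi * 50e3 * t)
    elif kind == "FM":
        sig = np.exp(2j * np.pi * (50e3 * t + 10e3 * np.cumsum(msg) / fs))
    else:                                             # BPSK at ~10 ksym/s
        bits = rng.integers(0, 2, n // 100 + 1)
        sig = (2.0 * bits[np.arange(n) // 100] - 1) * np.exp(2j * np.pi * 50e3 * t)
    noise = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    return sig + 0.05 * noise

def conv1d_features(iq, kernels):
    """One 1-D conv layer over the 2xN (I, Q) tensor: conv -> ReLU -> max-pool."""
    x = np.stack([iq.real, iq.imag])                  # (2, N), a 2-channel input
    feats = []
    for k in kernels:                                 # each k has shape (2, taps)
        taps = k.shape[1]
        windows = np.lib.stride_tricks.sliding_window_view(x, taps, axis=1)
        act = np.maximum(0.0, np.einsum("cwt,ct->w", windows, k))
        feats.append(act.max())                       # global max-pool per filter
    return np.array(feats)

rng = np.random.default_rng(42)
kernels = [rng.standard_normal((2, 16)) * 0.1 for _ in range(8)]  # untrained
for kind in ("AM", "FM", "PSK"):
    f = conv1d_features(synth_signal(kind), kernels)
    print(kind, f.shape)      # 8 features per burst, ready for a classifier head
```

In a real system the kernels would be learned from labeled captures, and the feature vector would feed dense layers that output per-modulation probabilities; the point here is only that the network ingests raw I/Q samples rather than hand-engineered statistics.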
As Google AI proved with the game of Go, “AI can tackle making decisions in really large combinatorial spaces,” he said. He hopes to use machine learning not only to process spectrum data once it’s collected, but also to “help us decide what spectrum data we will acquire with our RF sensors in the first place” by answering questions such as what spectrum to look at and capture, and when and where to look for it.
A cognitive system is capable of real-time learning. “It is thinking,” Rappa said. It will ask itself, “Where should I be looking? How should I respond?” A cognitive system could change what it looks for or what it transmits based on what it has experienced. “You are looking at actually changing … the features that you’re looking at to classify a signal,” based on what the system is learning.
This decision-making capability would be a major advance over traditional RF systems, in which frequencies and spatial directions are often scanned in a sequential order irrespective of the statistics about the operating environment, according to DARPA. These systems have little understanding of what’s happening in the spectrum, say, which signals are unusual in an area or frequency band. The program hopes to detect unexpected signals.
RF systems today use rules-based reasoning akin to first-generation AI expert systems. The vast majority of fielded electronic support measures (ESM) systems, for example, use lookup tables, said John Thompson, naval aviation campaign director with Northrop Grumman’s Mission Systems Sector. Data comes into the airframe and is sorted by spreadsheet-like software that associates the incoming signal with the proper response. But the increasing digitization of radar capabilities is driving the need for adaptive and cognitive EW, he said.
“Military forces can no longer rely solely on predefined threat databases to detect, identify, locate and react in a timely manner because today’s technology enables threats to change their waveforms through software without requiring any hardware retrofit,” he said.
Today’s systems essentially institute a set of rules, based on the intelligence community’s analysis of a signal, said Dan Kilfoyle, technical director for electronic warfare systems with Raytheon Space and Airborne Systems. The system’s logic is, “If you see this, then do that.” Given the speed with which new software-driven signatures can be devised, however, there is much less time for traditional, hands-on analysis.
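The lookup-table, “if you see this, then do that” logic described above can be sketched in a few lines. Every emitter name, parameter range and response below is invented for illustration; real ESM libraries are classified and far richer.

```python
# A pre-briefed threat table: measured pulse parameters are matched row by row,
# and the first match dictates the response (all values are hypothetical).
THREAT_TABLE = [
    # (name, freq range in MHz, pulse-repetition interval range in us, response)
    ("EMITTER_A", (2900, 3100), (900, 1100), "cue_jammer"),
    ("EMITTER_B", (9300, 9500), (240, 260),  "alert_crew"),
]

def classify(freq_mhz, pri_us):
    """Return (emitter, response) for the first table row the pulse fits."""
    for name, (f_lo, f_hi), (p_lo, p_hi), response in THREAT_TABLE:
        if f_lo <= freq_mhz <= f_hi and p_lo <= pri_us <= p_hi:
            return name, response
    return "UNKNOWN", "log_and_ignore"    # novel waveforms fall through the table

print(classify(3000, 1000))   # ('EMITTER_A', 'cue_jammer')
print(classify(3000, 250))    # same band, shifted PRI -> ('UNKNOWN', ...)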
Cognitive systems will be key to operational success, Thompson predicted. These systems will be able to change their RF transmissions beyond their baseline programming in response to unknown received signals. By contrast, “agile” systems, which perform functions such as switching frequencies in predetermined sequences, and “adaptive” systems, which modify their responses based on changes sensed in the environment, operate within the constraints of their software programs.
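The agile-versus-adaptive distinction Thompson draws can be made concrete with a toy contrast. The channel numbers and occupancy model below are invented; the point is only that the agile hopper is fixed at design time, while the adaptive one reacts to what it senses.

```python
import itertools

# "Agile": a predetermined hop sequence, baked into the software.
HOP_SEQUENCE = [3, 7, 1, 5]
agile_hops = itertools.cycle(HOP_SEQUENCE)

def adaptive_hop(sensed_occupancy):
    """'Adaptive': pick whichever channel the receiver currently senses as quietest."""
    return min(range(len(sensed_occupancy)), key=sensed_occupancy.__getitem__)

print([next(agile_hops) for _ in range(6)])          # [3, 7, 1, 5, 3, 7]
occupancy = [0.9, 0.2, 0.8, 0.1, 0.7, 0.4, 0.6, 0.3]
print(adaptive_hop(occupancy))                       # channel 3 is quietest
```

A cognitive system, in Thompson’s framing, would go a step further: it could revise the selection rule itself in response to unknown signals, which neither fixed function above can do.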
A cognitive system will go “beyond writing its own software,” Thompson said, alluding to its ability to think outside the programming box. The more data the algorithms sample, the higher the probability that their inferences are correct.
Existing Systems Adapt
Northrop Grumman has some irons in the fire. Under the U.S. Navy’s Reactive Electronic Attack Measures (REAM) program, Northrop is developing machine-learning algorithms for the EA-18G Growler airborne electronic attack (EA) suite. Targeted for fleet transition around 2025, the program would bolster EW capabilities against agile and adaptive unknown or hostile radars.
An extension of Northrop’s Growler work is an unmanned air vehicle (UAV) swarm concept code-named Remedy. These expendable drones, deployed in canisters from the aircraft, would function as close-in sensors, Thompson said, and would provide more data for the “cognitive stew.” The lessons from the REAM program “will be applied to the Remedy UAVs.” The company is pursuing RF and infrared sensors for these vehicles to provide multispectral situational awareness.
Raytheon plans to “create the infrastructure in our products that will be able to quickly adopt, implement, and use the best-of-breed algorithms,” Kilfoyle said. The company’s all-digital radar warning receiver, the AN/ALR-69A(V), which has demonstrated geolocation capability, is an example of this approach. “We’ve given it the right RF components … and we’ve given it all the infrastructure that is required to implement things like a complex neural net and be able to exercise it,” he said. The AN/ALR-69A(V) is installed on the C-130H and KC-46A and is being tested on the F-16.
“It’s hard to look at the whole electromagnetic spectrum and do a good job of it,” Kilfoyle said. When threats are moving in frequency, the cat-and-mouse game gets more difficult, particularly if threats’ frequency shifts are on a bigger scale than the frequency windows RF systems are optimized to see. Frequency agility drives the need for wideband, agile and intelligent EW systems.
Future RF systems also will need to sample faster and quickly understand different dialects or waveforms, Thompson said. But all the agile, adaptive and cognitive technologies (points along an evolutionary curve) will be additive to today’s EW systems, which will continue to be effective against many threats, he said. “We don’t want to get rid of all the easy answers,” he added.
A machine-learning-based system could learn and react in ways that experts would never have thought, Rappa said. “That’s where we get the real value.” Machine learning might be able to discover hitherto unimagined distinctions between emitters based on “unintentional modulations” of waveforms caused by factors such as manufacturing flaws. This data might be thrown away by an expert-system-based device.
Increased spectrum awareness thus could help identify “RF transmitters that may be lying about their identification,” Tilghman said. RF fingerprinting, one of the technical areas of the RFMLS program, may “allow us to identify an RF transmitter regardless of what it purports … to be.”
Fingerprinting tries to identify a transmitter by its unique signal features. The analog electronics that condition the signal and eventually amplify it impart very subtle but unique features to the signal, Tilghman said, just as the way a person speaks adds “metadata” above and beyond the information the person is consciously conveying.
DARPA is looking at solid state technology, which has “much more subtle RF fingerprints” than the older tube-based RF systems, he said. Perhaps emitters could be distinguished based on temperature variability, Rappa said. Perhaps nuances could be detectable based on whether the emitter is in bright sunlight or in shade.
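A toy version of the fingerprinting idea can be simulated. Below, two hypothetical transmitters send an identical waveform, but each hardware chain imparts a slightly different carrier-frequency offset and I/Q gain imbalance; the receiver estimates those “unintentional modulations” and matches them against enrolled fingerprints. All impairment values are invented, and real fingerprinting uses far subtler features.

```python
import numpy as np

def transmit(cfo_hz, iq_gain, n=4096, fs=1e6):
    """Identical intended signal, distorted by a per-device analog chain."""
    t = np.arange(n) / fs
    base = np.exp(2j * np.pi * 1e4 * t)               # the "intended" waveform
    skewed = iq_gain * base.real + 1j * base.imag     # analog I/Q gain mismatch
    return skewed * np.exp(2j * np.pi * cfo_hz * t)   # oscillator frequency offset

def fingerprint(iq, fs=1e6):
    """Estimate (apparent carrier offset in kHz, I/Q gain ratio)."""
    cfo = np.angle(np.vdot(iq[:-1], iq[1:])) * fs / (2 * np.pi)
    gain = np.std(iq.real) / np.std(iq.imag)
    return np.array([cfo / 1e3, gain])

enrolled = {"radio_A": fingerprint(transmit(200.0, 1.00)),
            "radio_B": fingerprint(transmit(950.0, 1.07))}

def identify(iq):
    """Nearest-neighbor match of a burst's fingerprint against enrolled devices."""
    fp = fingerprint(iq)
    return min(enrolled, key=lambda k: np.linalg.norm(enrolled[k] - fp))

print(identify(transmit(955.0, 1.07)))   # matched by impairments, not by payload
```

Because the features come from the hardware rather than the data being sent, a transmitter “lying about its identification” at the protocol level would still match its true enrolled fingerprint.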
Sometimes a red force emitter will try to disguise itself as a white force or neutral emitter on the battlefield, so that the signals are nearly identical, Thompson said. To tell the white emitter from the red look-alike, EW systems need not only intelligence but higher fidelity.
Cognitive RF is still more of a hardware challenge, Rappa said. More investment is going into areas such as semiconductor materials and processing devices. And more investment will be required.
One recent advance he cited is Xilinx’s RF system on a chip, which integrates wideband analog-to-digital conversion and digital-to-analog conversion, multiprocessor CPUs and field programmable gate array capability on a monolithic chip. As front-end hardware consolidates, techniques like ML will be applied farther forward in the processing chain.
The advent of wideband, direct RF conversion supports frequency agility. It means you can go from, say, 1 GHz to 2 GHz “within a few samples,” said Pete Thompson, VP of product management at Abaco Systems. Although analog circuitry is still required to amplify, filter and convert the signal to digital, modern systems don’t need as much of it, Thompson said. Modern systems allow almost instantaneous, discontinuous frequency shifts.
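The reason a digital exciter can retune “within a few samples” is that its carrier comes from a numerically controlled oscillator (NCO): changing frequency is just writing a new phase increment into a register, with no analog settling time. The sketch below simulates this with an illustrative sample rate and frequency plan (not any particular product’s numbers) and verifies that the jump is phase-continuous.

```python
import numpy as np

fs = 100e6                           # sample rate (illustrative)
phase = 0.0
samples = []
freq_plan = [(1e6, 50), (2e6, 50)]   # 50 samples at 1 MHz, then jump to 2 MHz

for f, n in freq_plan:
    step = 2 * np.pi * f / fs        # the "register" a retune rewrites
    for _ in range(n):
        samples.append(np.exp(1j * phase))
        phase += step

iq = np.array(samples)
# Measure the per-sample phase advance: it switches cleanly between the two
# frequencies, with no glitch sample and no settling transient.
inc = np.angle(iq[1:] * np.conj(iq[:-1])) * fs / (2 * np.pi)
print(round(inc[10] / 1e6, 3), round(inc[60] / 1e6, 3))   # 1.0 then 2.0 MHz
```

An analog synthesizer making the same jump would need its phase-locked loop to re-settle; here the discontinuous frequency shift costs nothing but a register write.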
Abaco Systems’ new VP430 3U VPX board was the first product to incorporate the Xilinx RF SoC, Thompson said. Abaco is providing customers with early engineering silicon in air-cooled, lab-grade hardware, with plans to migrate to rugged, tactical, conduction-cooled units.
Once you go digital, EW/RF applications become more of a data management problem — a software or digital signal processing issue, Rappa said. “Once you get everything digital — and you can process the data at the rates you need to keep up with it — RF systems can be a lot more flexible. If you could change the entire receive path or the entire transmit path by changing memory registers or a firmware load, it would be a game-changer.”
Believe It or Not?
Kilfoyle expects that in the early days pilots will “need to learn to trust that the system is going to be able to do a better job.” Many machine-learning applications today are like a black box, he said. You give them the inputs and they tell you what to do. They don’t say why, which makes it a little hard to trust them. Perhaps systems will have to come with a knob to “dial up or down the level of cognition.”
BAE’s Rappa said smart systems’ thought processes could be deduced even though their reasoning is opaque. BAE engineers will still be able to look into the back-end and see the parameters that the system used. “We can interrogate the system and, to an extent, ask it questions to see why it’s making certain decisions,” he said.
“If the system is learning in real time, it may take post-processing to understand its thought process,” he added. You can see what state a system was in and what data was in its memory when it took an action. Data logging and maintaining mission records becomes increasingly important, he said.
But clarity in hindsight doesn’t help the pilot in the cockpit, whose role traditionally has been codified in the “Observe, Orient, Decide and Act,” or “OODA Loop.” In the future, Rappa contended, “we need … to get humans out of the OODA loop,” so that systems can observe, decide and react faster than their adversaries do.
But humans will always be in the loop in electronic attack scenarios, Northrop Grumman’s Thompson predicted. “They will have to sign off on a machine’s request either in real time or by pre-arrangement if a certain probability of positive ID is achieved.” That’s the way automatic target recognition works today, and that’s the way cognitive EW will work tomorrow. “I’m a big fan of people,” he said.
Kilfoyle said there will always be cases where pilots can deal with threats. Cognitive EW will always be a bit of an “exquisite tool” that you can’t use on everything. But the role of humans will diminish as the timeline between a threat’s seeing you and shooting at you becomes more and more compressed. Machine-learning approaches range from deterministic (when a system follows if-then rules) to probabilistic (when it must evaluate whether an uncertain pattern matches a rule), he explained.
Training, Testing Challenges
“One of the primary challenges of cognitive EW is being able to provide systems with the right amount and right quality of data” on which to train, Kilfoyle said. A smart EW system doesn’t need to figure out how to drive through New York City on its first trip, but it does need to answer a few basic questions like: Is this a threat? Is this like something I’ve seen before? Or, as Northrop Grumman’s Thompson observed, is it both a threat and a non-threat? For example, is it an air traffic control radar serving both hostile military and commercial aircraft?
Although there is a lot of data based on prior observations, Kilfoyle expects that simulations will play an important role in developing cognitive systems. “Perhaps we can engineer scenarios where we can learn things and not be at risk,” he said.
In some ways, however, developing cognitive EW is far more difficult than developing driverless cars. The latter can harvest limitless data, but “there is not a giant repository of data on unknown radars,” Rappa said.
Pertinent signal data is typically low in signal quality, not labeled and not timely. The data is “very sparse and spartan,” which drives the need for custom applications. Because of the paucity of data, more modeling and simulation are required, which are challenging and expensive processes. Commercially derived approaches to machine learning don’t work well in EW, he said. AVS