Gray Eagle drones armed with HELLFIRE missiles and GBU-69 glide bombs flew overhead, 155mm artillery fired rounds 60km (37.3 miles) to destroy SA-22 enemy air defenses, and armored ground combat vehicles scored direct hits on multiple T-72 tanks during the Army’s Project Convergence 2020 at Yuma Proving Grounds, Ariz.
The real story, however, according to senior Army leaders attending the service’s transformational combat experiment, was about data sharing, networked targeting and a cutting-edge AI system called FIRESTORM.
“The bullet flying through the air and exploding is interesting, but that is not what is compelling about Project Convergence. It is everything that happens before the trigger is pulled. We did not come out here for a precision-fires exercise, what we came out here to do is increase the speed of information between sensing the target and passing that information to the effector,” Brig. Gen. Ross Coffman, Director, Next Generation Combat Vehicles Cross Functional Team, Army Futures Command, told reporters Sept. 23 at Yuma Proving Grounds.
FIRESTORM uses advanced computer algorithms to gather radio data-link feeds, video streams, navigational and terrain specifics, weather conditions, target coordinates and precisely identified enemy locations. It then applies AI-enabled processing to perform near-real-time analytics, comparing all of these variables against a vast database. The various information streams are pooled and analyzed in relation to one another to identify the optimal weapon, or “effector,” for a particular target.
“FIRESTORM is a computer brain that recommends the best shooter, updates the common operating picture and enemy and friendly situations. It ‘missions’ the effectors that we want to eradicate the enemy on the battlefield. As enemy targets were identified on the battlefield, FIRESTORM quickly paired those targets with the best shooter in position to put effects on this,” Coffman said.
FIRESTORM can arrive at analytical conclusions almost instantly by weighing new information against previously compiled data. Machine learning occurs as the AI-enabled system assimilates new information that differs from what is already in its database. The pace at which this new information is discerned, analyzed and integrated is the fundamental value AI adds.
Perhaps artillery has proven effective at a certain range against a certain target composition, in particular weather conditions, at particular altitudes, against particular defenses and terrain. The computer analyzes all of these variables, both individually and in relation to one another, against its database and pairs the right weapon with the particular target engagement. This entire process can now take place in seconds, an exponential leap beyond previous benchmarks of roughly 20 minutes.
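The kind of rules-based pairing described above can be sketched in a few lines of code. This is a hypothetical illustration only: every name, field and scoring rule here is invented, and the actual FIRESTORM logic is not public.

```python
# Hypothetical sketch of automated weapon-target pairing, loosely
# inspired by the article's description. All fields and the scoring
# rule (prefer the shooter with the most range margin) are invented.
from dataclasses import dataclass

@dataclass
class Effector:
    name: str
    max_range_km: float
    effective_vs: set        # target classes this weapon suits
    all_weather: bool

@dataclass
class Target:
    target_class: str
    distance_km: float
    weather_clear: bool

def pair_effector(target, effectors):
    """Return the best-suited effector name, or None if no match."""
    candidates = []
    for e in effectors:
        if target.distance_km > e.max_range_km:
            continue                       # out of range
        if target.target_class not in e.effective_vs:
            continue                       # wrong weapon for this target
        if not target.weather_clear and not e.all_weather:
            continue                       # weather rules it out
        # Prefer the shooter with the most range margin to spare
        candidates.append((e.max_range_km - target.distance_km, e.name))
    return max(candidates)[1] if candidates else None

effectors = [
    Effector("155mm artillery", 60.0, {"air_defense", "armor"}, True),
    Effector("HELLFIRE (Gray Eagle)", 8.0, {"armor", "vehicle"}, False),
]
tank = Target("armor", 45.0, weather_clear=True)
print(pair_effector(tank, effectors))  # -> 155mm artillery
```

A real system would weigh far more variables and learn from outcome data, but even this toy version shows why a machine can finish the comparison in milliseconds while a human staff process takes minutes.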
“This is happening faster than any human could execute,” Coffman said.
However, the system must be adaptable to new enemy threats. Once enemies encounter a system, they immediately move to counter it, requiring developers to field improvements quickly.
“We need code writers who will need to change algorithms to adjust to new threats. We can’t wait 24 hours; we will have to change instantaneously to targets. We need to make decisions at speed and get ahead of the enemies’ decision cycle,” Lt. Gen. Charles Flynn, Director, Army G-3/5/7, told reporters.
Flynn further explained this “need for speed” in the context of the well-known Processing, Exploitation and Dissemination (PED) process, which gathers information, then distills and organizes it before sending carefully determined data to decision-makers. The entire process, long used for things like drone video feeds, has now been condensed into a matter of seconds, in part due to AI platforms like FIRESTORM. Advanced algorithms can, for instance, autonomously sort through hours of live video feeds, identify moments of potential significance and transmit the often time-sensitive information to human controllers.
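The video-triage step Flynn describes can be illustrated with a simple filter: scan a long feed and flag only the moments worth an analyst's attention. The per-frame "detection score" below is a stand-in; real PED pipelines use trained computer-vision models, and the threshold and gap values here are invented for illustration.

```python
# Hypothetical sketch of automated PED-style triage: forward only
# frames whose detection confidence crosses a threshold, suppressing
# near-duplicate flags from the same event.

def flag_significant(frame_scores, threshold=0.8, min_gap=5):
    """Return frame indices to forward to a human controller.

    frame_scores: per-frame detection confidences in [0, 1]
    min_gap: skip flags within this many frames of the previous one
    """
    flagged, last = [], -10**9
    for i, score in enumerate(frame_scores):
        if score >= threshold and i - last >= min_gap:
            flagged.append(i)
            last = i
    return flagged

# 1,000 mostly empty frames with two events worth forwarding
scores = [0.1] * 1000
scores[120] = 0.95   # e.g. a vehicle appears
scores[121] = 0.93   # same event, suppressed by min_gap
scores[640] = 0.88
print(flag_significant(scores))  # -> [120, 640]
```

The point of the sketch is the compression ratio: a thousand frames collapse to two items for human review, which is what moves PED from a rear-echelon task to something that can run at the tactical edge.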
“In the early days we were doing PED away from the front lines, now it’s happening at the tactical edge. Now we need writers to change the algorithms,” Flynn explained.
“Three years ago it was books and think tanks talking about AI. We did it today,” Army Secretary Ryan McCarthy said.