The CBS show “60 Minutes” recently featured the growing number of experimental Pentagon technologies powered by artificial intelligence, from drone swarms to ground robots to naval ships.
The segment, “The Coming Swarm,” showcased a ground robot-aerial drone duo designed to track terrorists, a naval trimaran capable of spotting submarines, and more than 100 drones dropped from a trio of F/A-18 Hornet jets flying near the speed of sound in what was billed as the largest micro-drone swarm.
The latter, which took place in the fall at China Lake, California, was arguably the most noteworthy, not only because the systems demonstrated collective decision-making and adaptive formation flying, but also because of their high-pitched, alien-sounding scream.
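The broadcast does not describe the swarm's actual control laws, but the kind of collective decision-making it showed is often explained with distributed consensus: each drone repeatedly nudges its own heading toward the average of its neighbors' headings, so the swarm settles on a common course with no central commander. The sketch below is purely illustrative; the update rule and weight are assumptions, not details from the segment.

```python
# Illustrative consensus sketch -- NOT the Perdix control law, which is
# not described in the segment. Each round, every drone blends its
# heading toward the swarm-wide average.

def consensus_step(headings, weight=0.2):
    """One communication round: move each heading a fraction of the way
    toward the mean heading of the swarm."""
    mean = sum(headings) / len(headings)
    return [h + weight * (mean - h) for h in headings]

# Four drones launched on wildly different courses (degrees).
headings = [0.0, 90.0, 180.0, 270.0]
for _ in range(50):
    headings = consensus_step(headings)
# After 50 rounds all four headings sit within a fraction of a degree
# of the shared average, 135.0.
```

A real implementation would average only over nearby neighbors and handle the wrap-around of angles (0° vs. 360°), but the convergence behavior is the same idea at larger scale.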
“To me the eeriest part about this moment was actually the sound,” correspondent David Martin later said of the noise. “It turned into something almost from another planet when you heard all 100 of them slowly descending in that sort of death spiral.”
When Martin asked William Roper, director of the Pentagon’s Strategic Capabilities Office, whether autonomy is the biggest thing in military technology since nuclear weapons, Roper replied, “If what we mean is the biggest thing that is going to change everything, I think autonomy is going to change everything.”
He’s not alone. Last week in Washington, D.C., officials from the Defense Advanced Research Projects Agency (DARPA), the Pentagon’s research arm, met with representatives of the defense industry and other organizations to talk about the opportunities and challenges for artificial intelligence in the military.
The “Beyond A.I. Forum” was organized by Tandem NSI, a national security consultancy based in Arlington, Virginia, and Booz Allen Hamilton, the defense consulting giant based in McLean, Virginia.
While autonomy, artificial intelligence and machine learning are exciting and rapidly evolving fields, defense officials have to “take a step back and say, ‘Our competitive advantage in this rising-tide-is-lifting-all-boats [environment] is what?’” said Chuck Howell, chief engineer of portfolio programs and integration at Mitre Corp., an engineering nonprofit that supports the federal government.
“What is the [concept of operations]?” he asked. “What are the confidence levels? What are the ways to exploit this global capability that we can come up with that’s novel?”
Howell added, “There are huge opportunities for companies that can take the general framework of A.I. machine learning and tailor it to those weird examples that the DoD and the [intelligence community] worry about. Finding cats on the Internet? Not a problem. Finding tells in a grainy overhead image? Harder.”
Justin Manzo, senior lead engineer at Booz Allen Hamilton, agreed. For the Pentagon, big data is part of the problem. Developing systems that can help identify the megabytes of critical intelligence from the petabytes of information is part of the solution, he said.
“Those kind of systems are what we can operationalize … [and] put downrange, where there’s limited data links,” he said.
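Manzo doesn't detail how such a system would work, but the core idea, scoring data at the edge and transmitting only what a constrained link can carry, can be sketched simply. Everything below (field names, the confidence-based scoring rule, the byte budget) is an invented illustration, not a description of any fielded system.

```python
# Hypothetical edge-triage sketch: rank sensor records by estimated
# intelligence value and keep only what fits the available downlink.
# All record fields and parameters here are invented for illustration.

def triage(records, score, link_budget_bytes):
    """Greedily pack the highest-scoring records into the link budget."""
    ranked = sorted(records, key=score, reverse=True)
    kept, used = [], 0
    for rec in ranked:
        size = len(rec["payload"])
        if used + size <= link_budget_bytes:
            kept.append(rec)
            used += size
    return kept

# Three detections of varying size and analyst confidence.
sensor_data = [
    {"id": 1, "payload": b"x" * 400, "confidence": 0.10},
    {"id": 2, "payload": b"x" * 300, "confidence": 0.95},
    {"id": 3, "payload": b"x" * 500, "confidence": 0.60},
]

# With a 900-byte budget, only the two most promising detections are sent.
sent = triage(sensor_data, score=lambda r: r["confidence"],
              link_budget_bytes=900)
```

The scoring function is where the machine learning Manzo alludes to would live; the packing logic itself is deliberately simple so it can run on constrained hardware downrange.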
Jonathan Aberman, managing director of Tandem NSI and moderator of the panel, said the business opportunities for developing products and services in this space are significant. The Defense Department is estimated to spend upwards of $3 billion a year on autonomous systems alone.
“If you’re an entrepreneur … the next two years [represent] unbelievable opportunities for raising venture capital around these products,” he said.
Fred Kennedy, deputy director of the Tactical Technology Office at DARPA, made clear the Pentagon’s goal for the technology. “These are all systems we’re looking at right now,” he said. “Autonomy is going to be our asymmetric approach to how we fight.”