Advanced technologies like robotics and artificial intelligence will soon rapidly change the character of war, and because every country will have access to these technologies, the U.S. must be prepared, according to Army Gen. Mark Milley, the former chairman of the Joint Chiefs of Staff.
Milley shared these predictions in a recent interview for 60 Minutes while sitting aboard the USS Constitution, the oldest naval warship still afloat.
"Our military is going to have to change if we are going to continue to be superior to every other military on Earth," Milley told 60 Minutes correspondent Norah O'Donnell.
Milley said artificial intelligence will speed up and automate the so-called OODA loop — observe, orient, decide, and act — the decision cycle meant to outpace an adversary. More than two centuries ago, this looked like Napoleon getting up in the middle of the night to issue orders before the British woke up for tea, Milley explained. Soon, it will be computers automatically analyzing information to help decide where to move troops and when.
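The cycle Milley describes can be sketched as four chained steps. This is purely an illustrative toy — the function names, sensor labels, and threshold are all hypothetical, and nothing here reflects any real command-and-control software.

```python
from dataclasses import dataclass

# Toy sketch of the OODA cycle (observe, orient, decide, act).
# All names, data, and logic are hypothetical illustrations.

@dataclass
class Observation:
    sensor: str
    reading: float

def observe() -> list[Observation]:
    # Gather raw inputs (stubbed with fixed sample data).
    return [Observation("radar", 0.7), Observation("satellite", 0.4)]

def orient(obs: list[Observation]) -> float:
    # Fuse observations into a single situation estimate.
    return sum(o.reading for o in obs) / len(obs)

def decide(estimate: float) -> str:
    # Choose a course of action; under current DoD policy a human
    # would remain responsible for this step.
    return "reposition" if estimate > 0.5 else "hold"

def act(action: str) -> str:
    # Carry out (here, just report) the action; the loop then repeats.
    return f"executing: {action}"

if __name__ == "__main__":
    print(act(decide(orient(observe()))))
```

The point of the sketch is only that each stage feeds the next, so automating any one stage (as Milley predicts AI will) compresses the whole cycle.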
"Artificial intelligence is extremely powerful," Milley said. "It's coming at us. I suspect it will be probably optimized for command and control of military operations within maybe ten to 15 years, max."
For now, the Department of Defense standard is to keep a human in the decision loop, and department guidelines say fully autonomous weapons systems must "allow commanders and operators to exercise appropriate levels of human judgment over the use of force."
According to Deputy Secretary of Defense Kathleen Hicks, that standard will apply to a new DoD program called "Replicator," a Pentagon initiative aimed at countering the size of China's military. The program aims to produce thousands of autonomous weapons systems powered by artificial intelligence.
"Our policy for autonomy in weapon systems is clear and well-established: There is always a human responsible for the use of force. Full stop," Hicks said last month. "Anything we do through this initiative, or any other, must and will adhere to that policy."
The International Committee of the Red Cross says autonomous weapons — including those that use AI — could lead to unintended consequences, like civilian casualties or an escalation of conflict.
But will AI make war more likely?
"It could. It actually could," Milley said. "Artificial intelligence has a huge amount of legal, ethical, and moral implications that we're just beginning to start to come to grips with."
The video above was produced by Brit McCandless Farmer and Will Croxton. It was edited by Will Croxton.