Hypersonic missiles may be unstoppable. Is society ready?

Hypersonic missiles raise ethical questions about how the military could use machine learning, as the U.S., Russia, and China develop new weapons.

By Kristina Lindborg, Contributor

Humans love things that go fast: race cars, speedboats, and cheetahs. Then there’s hypersonic, which leaves plain old fast blinking in the dust.

On March 19, the United States launched its first successful hypersonic test missile from a naval base in Kauai, Hawaii. The unarmed missile tore through the idyllic Pacific skies at Mach 5, five times faster than the speed of sound.

The Pentagon aims to test a full hypersonic weapon system by 2023, seeking to draw level with Russia and China, which have touted their own development of hypersonic weapons technology and say they have the hardware to prove it.

Hypersonic missiles are not just very fast; they are also maneuverable and stealthy. This combination of speed and furtiveness means they can surprise an adversary in ways that conventional missiles cannot, while also evading radar detection. And they have injected an additional level of risk and ambiguity into what was already an accelerating arms race between nuclear-armed rivals.

To understand why, consider the falcon and the albatross.

The peregrine falcon is the fastest animal on Earth. From a cruise altitude of more than 3,000 feet, it drills down through the air at 200 mph to snag its prey. Fast.

The seafaring albatross can soar effortlessly for thousands of miles without a flap of its massive wings, hugging the water’s surface until it abruptly leans into the wind to gain altitude so it can alter its course and swoop down once more. And it does this again and again and again. Maneuverable.

If you could put the two birds together, you would have one formidable bird of prey: fast and maneuverable. And that is what developers of hypersonic weapon systems have done. There will soon be the capability to launch weapons, conventional or nuclear, that travel faster than the speed of sound and can change course unpredictably and very quickly, making them much harder to track and intercept.

“They (our adversaries) have systems that try to deny our domain dominance,” Mike White, assistant director for hypersonics in the Office of the Under Secretary of Defense for Research and Engineering, told a recent Pentagon press conference. “It really is those threats and those targets that’s driving our investment in hypersonic strike capabilities.” The Defense Department has requested $3.2 billion for hypersonic-related research in the 2021 fiscal budget.

Crucial role for AI

As the hypersonics race heats up, a long stream of legal, ethical, and diplomatic questions trails in its swift wake, particularly about the critical role that artificial intelligence (AI) plays. This is because of how these systems work together to deliver hypersonic missiles precisely, in theory, to any point on the globe.

Unlike conventional missiles, these rockets don’t follow a predictable trajectory. Only complex AI-based sensor systems are capable of detecting and intercepting them. And as the demands on the weaponry grow, so do the concerns about how much humans will have to rely on the AI’s set of ethics, trained by the developer, that informs the system’s “moral” choices.

One of these concerns is the ambiguity factor. In other words, what kind of warhead might be on that incoming hypersonic weapon: nuclear or conventional?

Added to that uncertainty is the short response time, which means that “AI or certainly at the very least automation comes into play” in defending against such missiles, says Douglas Barrie, a senior fellow for military aerospace at the International Institute for Strategic Studies in London.

Even if it’s not all dictated by AI and machine-based learning, he says, “there is going to be an awful lot of automation and that kind of decision chain to deal with these kinds of systems.”

Since AI plays a pivotal role in sensor detection systems, it is central to any ethical debate. Right now, the likelihood of a completely autonomous response to an incoming hypersonic missile seems remote. But that could change, say analysts. For an incoming conventional missile, military commanders may have 30 minutes to detect and respond; a hypersonic missile could arrive at that same destination in 10 minutes or less, forcing a decision faster than seems possible without AI.

“Those time frames have sped up, which would put more pressure on a person,” says Dr. Gordon Cooke, director of research and strategy at West Point. As a society, he adds, we should be thinking about how to view these systems in the future. “What kind of society do we want to have, especially in regards to warfare?”

‘Technology will always fail’

And of course, the question of warfare itself raises its own massive ethical quandary. Some, like ethicist Patrick Lin, view it as a social problem, not a technological one.

“Technology will always fail,” says Dr. Lin, a professor of philosophy at California Polytechnic State University in San Luis Obispo. “That is the nature of technology.” Ethical guidelines, he adds, should be built within the design of the architecture itself so that they’re integral to the system, not an item added on later.

Sandia National Laboratories is home to Autonomy New Mexico, a network of academics working on autonomy research for national security missions, including hypersonic weapons. What they try to do, says a senior manager, is make sure that AI systems help human operators identify the best courses of action in fast-paced scenarios. As to the ethics used, “it’s up to the Department of Defense to come up with how to employ that,” the manager says.

The Pentagon recently released its own ethical guidelines regarding AI research. Its five headings are: Responsible, Equitable, Traceable, Reliable, and Governable. The last makes clear that humans must be able to override an intended AI decision. AI should keep humans “in the loop” by “possessing the ability ... to disengage or deactivate deployed systems that demonstrate unintended behavior,” the guidelines read.

Not just weapons

Dan DeLaurentis, an aeronautics professor who directs the Institute for Global Security and Defense Innovation at Purdue University in Indiana, says AI plays an integral part in the hypersonic system by helping militaries to piece together lots of information about incoming missiles. But any concerns about robots running amok are unfounded, he adds. The Pentagon isn’t out to create “the unsupervised automated killing machines” of Hollywood dystopias.

Then there are thorny questions about the potential weaponization of space, compounded by the absence of binding treaties to ensure some measure of stability. The arms treaties that currently exist between the U.S. and Russia don’t cover hypersonic missiles; legal experts say the U.S. may propose adding them in the future.

“If we were in a situation where the arms control community and a number of treaties were still in place, this would be less troublesome than it is,” says Mr. Barrie.

Dr. Lin argues that the benefits of hypersonic weapons, weighed against the risks they create, are “widely unclear,” as are the benefits of the AI systems that inform them.

“I think it’s important to remember that diplomacy works and policy solutions work. ... I think another tool in our toolbox isn’t just to invest in more weapons, but it’s also to invest in diplomacy to develop community.”