The AI of war: Computers and autonomous killing – Times of India



SAN FRANCISCO: Like gunpowder and the atomic bomb, artificial intelligence (AI) has the capacity to revolutionise warfare, analysts say, making human disputes unimaginably different — and a lot more deadly.
Ahead of a summit between China’s Xi Jinping and US President Joe Biden, there had been suggestions that the two men would agree to ban lethal autonomous weapons.
The meeting appeared to produce no such accord, but experts say it’s a vital topic that is already altering armed conflict — and switching up the competition for global supremacy.
Observers say Beijing is investing massively in AI, to the point where it could soon shift the balance of power in the Asia-Pacific, and perhaps beyond.
And that has profound implications for a world order that has long been dominated by the United States.
“This is not about the anxiety of no longer being the dominant power in the world; it is about the risks of living in a world in which the Chinese Communist Party becomes the dominant power,” said a report by a panel of experts led by former Google chief executive Eric Schmidt.
Here are some of the possible applications of AI in the art of warfare.
Robots, drones, torpedoes… all kinds of weapons can be transformed into autonomous systems, thanks to sophisticated sensors governed by AI algorithms that allow a computer to “see”.
Autonomy does not mean that a weapon might “wake up in the morning and decide to go and start a war,” said Stuart Russell, professor of computer science at the University of California at Berkeley.
“It’s that they have the capability of locating, selecting and attacking human targets, or targets containing human beings, without human intervention.”
The killer robots of any number of sci-fi dystopias are an obvious example, but perhaps not a very practical one.
“People have been exploring that too, (but) to my mind that one is the least useful,” Russell added.
Most such weapons are still at the idea or prototype stage, but Russia’s war in Ukraine has offered a glimpse of their potential.
Remotely piloted drones are not new, but they are becoming increasingly independent and are being used by both sides, sending humans underground to seek refuge.
This could be one of the biggest immediate changes, according to Russell.
“A likely consequence of having autonomous weapons is that basically, being visible anywhere on the battlefield will be a death sentence.”
Autonomous weapons have several potential advantages for an attacking army: they can be more efficient, they can probably be produced more cheaply, and they remove tricky human emotions such as fear or anger from battlefield decisions.
But these advantages raise ethical questions.
For example, if they are so cheap and easy to make, there is virtually no limit to the firepower an aggressor can employ, Russell said.
“I can simply launch a million of them at once if I want to wipe out an entire city or an entire ethnic group,” he added.
Submarines, boats and planes that are capable of operating autonomously could be a huge boost to reconnaissance, surveillance or logistical support in remote or dangerous environments.
Such vehicles are at the heart of the “Replicator” program launched by the Pentagon to counter China’s enormous manpower advantage.
The objective is to be able to deploy several thousand cheap and easily replaceable systems in short order, across domains ranging from the maritime to outer space, said US Deputy Secretary of Defense Kathleen Hicks.
The idea is that with so many “flung into space scores at a time… it becomes impossible to eliminate or degrade them all,” Hicks said.
Many companies are developing and testing autonomous vehicles, like California-based Anduril, which touts an Autonomous Underwater Vehicle “optimized for a variety of defense and commercial mission types” including long-range oceanographic sensing, mine countermeasures, and anti-submarine warfare.
Powered by AI and capable of synthesizing mountains of data collected by satellites, radars, sensors, and intelligence services, tactical software can offer human planners a real edge.
“Everyone within the (Department of Defense) needs to understand that data is actually the ammunition in an AI war,” Alexandr Wang, the chief executive of Scale AI, told a US congressional hearing this year.
“We have the largest fleet of military hardware in the world. This fleet generates 22 terabytes of data every day. And so if we can properly set up and instrument this data that’s being generated into pools of AI-ready datasets, then we can create a pretty insurmountable data advantage when it comes to military use of artificial intelligence.”
Scale AI has a contract to deploy a language model on a classified network of a major US Army unit.
Its chatbot — disarmingly named “Donovan” — should allow commanders to plan and act within minutes, rather than weeks, according to the company.




