When Tech Tools Turn into Weapons of War
Technology continues to seep deeper into areas where its use is not adequately regulated. With AI and robots in the picture, humanity stands at the threshold of a new, technologically powered era of warfare. One can't help but wonder how the brains building these technologies feel about their work becoming the go-to tools for world leaders engaged in endless geopolitical battles.
In a conversation with AIM, roboticist Lerrel Pinto explained the two ways of thinking about the subject. “There’s one which is a philosophical one, and there’s one which is more practical,” he said.
Technology developed to be neutral in nature is being forced to take up arms. Several states across the globe have consciously exploited defence technology to promote broader economic prosperity. Countries including the United States, China, the United Kingdom, India, Iran, Israel, South Korea, Russia and Turkey have invested heavily in developing such lethal weapons in recent years.
A chorus of voices has been raised, especially given the speed at which these weapons are being developed. Experts have long called for regulations to prevent governments from triggering a chain reaction of escalatory events.
Practically Speaking
Pinto pointed out that in the ongoing Ukraine-Russia war, drones are being used regularly to kill people. Much of the technological advancement in these countries can be traced back to Chinese breakthroughs. As the socialist economy continues to weaponise tech that can wreak havoc, the US has at times felt threatened, leading it to impose sanctions on China and to invest more heavily in US tech companies.
Since China’s hypersonic missile breakthrough, Russia has been testing its own versions of the technology while simultaneously investing billions in the field. Putin’s state has been sourcing chips used in phones and computers to fuel its defence arsenal. But they are not the only ones. A recent case is Israel announcing a $3.2 billion grant to chipmaker Intel, amid the ongoing military action in Gaza.
“These drones are controlled by a human operator somewhere, but it’s still a robot. It has a basic level of AI and controllers on the drone to keep it stable. If I ask it to move left, it does. These types of algorithms are already out there at the same time,” said the Assistant Professor of Computer Science at NYU Courant.
He further mentioned that regulations on using AI or robots in the military are lacking in many countries. But every country wants to have as advanced an army as possible to safeguard its interests. “If other countries are using robots to gain an advantage, you also have to create a robot military to nullify that advantage,” he added.
Pinto then pulled out the classic nuclear weapon analogy. “One uses nuclear weapons for war, and that forces all the developed countries to also have a similar weapon. It’s not that they would use it, because then you’d have mutually assured destruction. It’s like if your opponent has an advantage, you also want to have that same advantage,” he explained.
He believes that eventually, many governments will have autonomous robots, since they fund the research heavily and find it very interesting. “That’s a practical thing,” he said. “It’s going to be used by one country, and others will be forced to use it even though they may have a different ethical or moral standard.”
Philosophically Thinking
Many researchers, Pinto included, do not like their technology being used for violent, unethical or immoral purposes. “In that sense, we make a conscious choice not to work with certain organisations that want to commercialise our technology in warfare,” he said.
“At the same time, all of our technology is open source, so someone else can figure out how to use it. But at least we are not actively speeding up that process,” the researcher noted. Alongside the group of researchers working on Computational Intelligence, Vision, and Robotics (CILVR), Pinto introduced Dobb-e, an open-source, general framework for robots to learn household manipulation.
From an AI standpoint, he doesn’t think robots are yet at the point where they can be fully autonomous. They still need human operators at the backend, he highlighted. “If it enables a soldier to perform, it is better than having a robot with a mind of its own indiscriminately going into neighbouring villages and creating violence,” Pinto concluded.