“Technologically, autonomous weapons (‘killer robots’) are easier than self-driving cars,” UC Berkeley computer science professor Stuart Russell told Vox. “People who work on related technologies think it would be fairly easy to build a very effective weapon within two years.”
Autonomous killer robots
These are some of the uncomfortable arguments raised in a new Vox article about autonomous weapons: systems powered by artificial intelligence that could decide on their own to use lethal force without any human intervention.
The public may not realize how close we are to killer robots because they misjudge what the technology actually looks like, University of New South Wales artificial intelligence expert Toby Walsh told Vox.
“When people hear ‘killer robots,’ they think of the Terminator. They assume it’s science fiction, something that couldn’t be built for many years,” Walsh said. “Instead, these are much simpler technologies that are much closer at hand and are already being tested as prototypes.”
The Vox piece suggests that these “simpler technologies” could take the form of drones equipped with facial recognition to strike only identified targets, a frightening prospect given the many documented examples of that technology’s inaccuracy.
What could go wrong?
Fully autonomous weapons would make it cheaper and easier to kill large numbers of people, and a serious problem would arise if they fell into the wrong hands.
But opponents of lethal autonomous weapons warn that the end result could be far worse than that.
Artificial intelligence carries both benefits and risks; the question is whether we can build ethical robots and put them to ethical use.