A recent meeting of experts in Switzerland discussed the growing concern about killer robots.
Experts at the World Economic Forum in Davos, Switzerland, have voiced serious concern that we are creating “killer robots,” as we reported recently.
As technology advances, experts are growing increasingly concerned about autonomous machines that could one day threaten humans. Some even envision a Terminator-like future. But how likely is that, really? Not everyone is convinced.
For one thing, the future of technology is notoriously hard to predict. Back to the Future II’s vision of 2015 obviously didn’t quite match reality, and there are plenty of other cases where humanity’s picture of life decades ahead turned out to be way off.
A report from Re/code makes the case that the idea of a Terminator future rests on a few misconceptions.
One misconception, Re/code argues, is that intelligent machines will one day be capable of self-replicating. After all, viruses and bacteria can do it, so why couldn’t super-advanced robots? The problem is that intelligence on its own wouldn’t produce a self-replicating entity; that capability would have to be programmed in.
That leads to a second misconception: that intelligent machines will share human motivations, seeking to self-replicate in order to preserve themselves. But the drive to self-replicate is something that developed specifically in life forms. Robots wouldn’t have the instinct for survival that is built into the human brain.
A third misconception is that machines are, or soon will be, smarter than humans, growing vastly more intelligent with time.
But to grow in intelligence, a machine would need to go through the same process of learning and discovery as the human brain. It can’t get there simply by acquiring knowledge; it must go through the process itself, the report argues.