Dave, We Need To Bomb More, Dave
from the conundrums dept
With all the talk lately about creating fighting robots and other autonomous military equipment, the NY Times is looking at the questions raised by autonomous fighting machines, and whether we eventually reach the classic science fiction premise where the machines take over. While these fighting machines are still supposed to be under human control, the article points out that the whole purpose of automation is that it shouldn't need human control; and if the humans who program it make mistakes, the fighting machines are likely to make mistakes as well. Another, more interesting, question is whether, even when humans do have the final word, they become complacent in the face of computer-generated information. If the machine is right most of the time, people tend to trust it over their own instincts, and are likely to miss the small percentage of cases where a human would make a better decision than the machine.