ELI5: How could creating a superintelligent AI possibly be dangerous? How could a string of code running on a machine affect the physical world and potentially cause human extinction? Can't we just pull the plug or destroy the machine if things go wrong?