@Holiday20310401,
It's definitely something interesting to think about. Should the technological singularity arise (see Technological singularity - Wikipedia), computers will reach a level at which they can repair and improve themselves, beyond human intelligence. If that happens, there's no telling what could actually occur.
I found this passage from the article I just posted interesting:
"Berglas (2008) argues that unlike man, a computer based intelligence is not tied to any particular body, which would give it a radically different world view. In particular, a software intelligence would essentially be immortal and so have no need to produce independent children that live on after it dies. It would thus have no evolutionary need for love. But it would have an evolutionary need for power because the first AI that wants to and can dominate the earth will dominate the earth."
I'm just hoping I have my own personal Arnold from the future to protect me, because there's a good chance this could happen in my lifetime.
I completely agree that we must start taking all of this into consideration and try to put some kind of preemptive measure in place. However, technology is worldwide and moral views vary so widely that I highly doubt we could all reach a universal common ground - unless we had to (which would mean we were already too late, being destroyed by machines).