The whole theme of artificial intelligence and the dangers of it becoming self-aware has been done quite a bit. Ringworld. Frank Herbert has several that deal with this. Some of it revolves around the definition of life -- self-awareness, self-preservation, reproduction... What happens when what is created becomes greater (more powerful) than the creator?
In one of Frank Herbert's books, the setup is a spaceship whose mission is to create artificial intelligence. In the end, the crew succeeds, and a one-word command -- "you" -- sets in motion the "godhood" of the machine, whose response is "you will now determine how you will worship me."
Then there's the whole timeline-continuity thing. The Terminator going back in time to defend John Connor -- does that make the future with the machines a certainty? And if the humans then destroy the possibility of the machines ever existing, how does the machine come back to create that reality in the first place? Juxtapose that with the "there's no fate but what we make for ourselves" idea, and you have a real dilemma.
Just some food for thought.