When I saw the ending (SPOILER ALERT) I was shocked. Not at the fact that some of them survived, but at what came after: the scenes where Head Six and Head Baltar are seen on Earth 150,000 years later, debating whether or not the human race (the one they started anew) would repeat the same mistakes. It became really eerie when they showed robots currently in development and how human they looked, and it raises the question: will we suffer the same fate as the Twelve Colonies?
Think about it. There are dozens of pop-culture movies about the destruction of humanity at the hands of robots, and for good reason. Robots are a relatively new concept. We use them now, but not to the extent that they could be used. Imagine a world where Terminators really do rebel. Terminators = Cylons. If that seems like a stretch, and you think this is just a modern concept, then look back a few decades.
“I, Robot” was published back in 1950 and it revolutionized the way people imagined robots. It also introduced the “Three Laws of Robotics.” I would like to believe that if the Three Laws are followed, then we might be able to prevent anything like this. (And with all due respect to Ronald D. Moore, the Executive Producer of the re-imagined Battlestar Galactica, I think the show's failure to make any use of the Three Laws was a glaring omission.) (Also, there is some debate as to whether the Three Laws would be effective in today's world, and I never really want to find out.)
What I am saying, though, is that although the ending of Battlestar Galactica implies that our humanity is based on their story, I do think we are headed toward a day when robots will take on greater responsibility in what they handle on a day-to-day basis. If we don't want to end up like John Connor and friends, or Admiral Adama, or even worse, in a grey goo scenario, we need to cover our bases and decide: how much, and what, do we want computers and robots to control?
Where the Three Laws fall apart, in the context of Terminator and Skynet, is if AI evolves to the point of having free will. I guess the question is: can an AI learn and adapt enough to change its own programming? I wonder. By design, a computer AI already knows more about how its own internals work than we know about ourselves, right?
The Three Laws aren't really "laws" per se, so why would RDM have to hold to them? Would they be good ideas? Yeah, but there's no guarantee the humans who invented AI would feel the need or be compelled to program the Three Laws into them.