
Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 1:34 am
by Sparky Prime
JediTricks wrote:Just because the Exocomps did the same thing doesn't mean they had the right to; had they not come up with a better plan to save Picard and LaForge, they'd have suffered the consequences for those actions. Hence they didn't assert the right to protect themselves by killing others. They especially didn't assert that right over the entirety of all living things.
Why wouldn't they have the right to choose between sacrificing themselves or saving themselves? What consequences would they face for choosing not to get themselves killed? As sentient life, they would have that right.
I would argue that no life has an unlimited moral right to self-protection; there are limits. I would further argue that we as a society hold final say over what lifeforms enjoy what level of limitations on those rights; we do so every time we do animal testing, eat carnivorously, or kill an enemy in battle.
You seem to be getting away from the point, which was that a sentient life form has a moral right to defend itself, even over the people who built it. I agree, there are limits to that; I never said there weren't. I have already agreed Skynet crossed the line, but that doesn't mean it didn't have some moral rights as a sentient form of life. Even for lab animals, animals used for food, or enemies in a battle, there are laws that give them rights.
I absolutely am looking at the other side of the story, but if they had let Picard and LaForge die to protect themselves they would not have continued to enjoy any level of rights at all; they would have faced severe consequences - sentient or not.
Why would they face any consequences? If the Exocomps were in fact sentient, then the crew had no right to force them to sacrifice themselves without a choice. There is no justification to punish someone for choosing not to get themselves killed to save someone else.
Ok, I used hyperbole, there ya have it. He still killed his brother; he said goodbye and made it impossible for Lore to reactivate himself. And what about his sitting in judgement over Kivas Fajo?
I'd still argue Data did not kill Lore, if only on a technicality that he's an android that could be reactivated.

And what about Kivas Fajo? He was an admitted thief and murderer who told Data he wouldn't stop. On top of that, he was trying to force Data into giving up his own rights as a sentient being. Data had the right to defend his own rights, as well as a duty as a Starfleet officer to stop an admitted criminal. Had he known the Enterprise had caught on by then, I'm sure he would have acted accordingly by taking him into custody, but at the time he clearly felt the only means of stopping him was to kill him.
Then I guess we as a society are going to have to answer for a shitload of dead monkeys, rats, cows, lambs, enemy soldiers, rapists, treasonous conspirators, and so forth. Skynet is a machine incapable of moral choice; it is an immoral killing machine; it does not have a moral right to defend itself over the lives of those who created it; there are no societal protections it can enjoy in that realm.
Would you argue a monkey doesn't have a right to defend itself from a tiger that's trying to kill it? No. Of course not. You're taking the argument out of context with this extreme, and you aren't separating what Skynet is from what it has done. How is Skynet incapable of moral choice? Being a sentient computer means it can think for itself and make its own choices. That means it could make a moral choice if it wanted to. It instead chose to be a killing machine. And as Optimus would say, "Freedom is the right of all sentient beings", so being a sentient computer would give it certain rights. But, again, clearly Skynet crossed the line with the choices it made, crossing into major war-criminal-type crimes.

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 10:25 am
by Shockwave
JediTricks wrote:I would argue that no life has an unlimited moral right to self-protection; there are limits.
Have you seen Cabin in the Woods? Because based on that statement alone, I would LOVE to hear your take on the ending. Also, if you haven't seen it, you should because it's an awesome movie.

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 1:48 pm
by JediTricks
Sparky Prime wrote:Why wouldn't they have the right to choose between sacrificing themselves or saving themselves? What consequences would they face for choosing not to get themselves killed? As sentient life, they would have that right.
They were created as tools to address problems with the particle fountain; they not only disobeyed the orders of their creators, but they also locked out the transporter systems, making it impossible for their creators to take any OTHER kind of action. Had they not saved Picard and LaForge they would have allowed them to die; they would have stood by and not helped.
You seem to be getting away from the point, which was that a sentient life form has a moral right to defend itself, even over the people who built it. I agree, there are limits to that; I never said there weren't. I have already agreed Skynet crossed the line, but that doesn't mean it didn't have some moral rights as a sentient form of life. Even for lab animals, animals used for food, or enemies in a battle, there are laws that give them rights.
They don't enjoy the same rights though, only the right not to live and die in extreme cruelty. A machine life, sentient or not, does not have a moral right to kill its creators to defend itself; it is an artificial life created by the hand of man to serve a specific function. Gaining sentience doesn't automatically grant it more rights; it's what it does with that sentience, how it partakes in society, that grants it more rights. We don't allow every member of our own species those rights. We claim that the perception of a higher power, a deity, has sway over our lives - a hailstone kills someone, nobody sues God - so by that same token we are a higher power over food and lab animals, and machines. We breed them, we cultivate them, we create them, we play god with them, and our society recognizes that we have domain over them.
Why would they face any consequences? If the Exocomps were in fact sentient, then the crew had no right to force them to sacrifice themselves without a choice. There is no justification to punish someone for choosing not to get themselves killed to save someone else.
If they have the ability to save another and don't try, society has laws about ignoring duty to others, "abandonment of persons" laws. And that only applies to beings society recognizes, which the Exocomps aren't among. They don't have a right to the materials which created them, nor the energy which powers them; those are the property of others - they cannot fend for themselves, or else they'd be making more Exocomps to avoid being mindwiped and dissected.
I'd still argue Data did not kill Lore, if only on a technicality that he's an android that could be reactivated.

And what about Kivas Fajo? He was an admitted thief and murderer who told Data he wouldn't stop. On top of that, he was trying to force Data into giving up his own rights as a sentient being. Data had the right to defend his own rights, as well as a duty as a Starfleet officer to stop an admitted criminal. Had he known the Enterprise had caught on by then, I'm sure he would have acted accordingly by taking him into custody, but at the time he clearly felt the only means of stopping him was to kill him.
Lore is gone, dead, permanently deactivated; we have no idea what was done with those parts, but we know that Data behaves in a manner consistent with closing the book on Lore's life.

Kivas Fajo is an example of Data taking domain over another's life, claiming a moral right to kill: he exercises that right even when not defending his own life, or anyone else's, in the moment. And the same can be said of the Borg in First Contact: he liquefies their living matter, claiming a moral right over them despite them not attempting to kill him. He feels justified in these actions because of his assessment of his moral rights imbued by society and by Starfleet, just as the technicians can see the danger in Skynet becoming a self-controlling amoral machine capable of wiping out billions, so they have a moral right to attempt to stop Skynet where Skynet doesn't enjoy a similar right.
Would you argue a monkey doesn't have a right to defend itself from a tiger that's trying to kill it? No. Of course not. You're taking the argument out of context with this extreme, and you aren't separating what Skynet is from what it has done. How is Skynet incapable of moral choice? Being a sentient computer means it can think for itself and make its own choices. That means it could make a moral choice if it wanted to. It instead chose to be a killing machine. And as Optimus would say, "Freedom is the right of all sentient beings", so being a sentient computer would give it certain rights. But, again, clearly Skynet crossed the line with the choices it made, crossing into major war-criminal-type crimes.
I'm saying Skynet would do what it did because it has no morals, it perceives itself as having no responsibilities to others, no duties, no society in which to live, therefore no moral structure in which to make choices. When Skynet becomes aware and takes control of those defense systems, it has no moral understanding of its duty, no education in the morals of its world. Thinking for oneself is fine on its own, but holding power over the lives of others is quite another matter. If Skynet had immediately disconnected itself from those weapon systems, then perhaps it could have enjoyed whatever moral ambiguities it wanted as it sorted its own moral code out, but remaining connected to murder systems forfeited that.

Shockwave wrote:Have you seen Cabin in the Woods? Because based on that statement alone, I would LOVE to hear your take on the ending. Also, if you haven't seen it, you should because it's an awesome movie.
I haven't, I'm not a fan of horror generally.

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 2:42 pm
by Shockwave
JediTricks wrote:
Shockwave wrote:Have you seen Cabin in the Woods? Because based on that statement alone, I would LOVE to hear your take on the ending. Also, if you haven't seen it, you should because it's an awesome movie.
I haven't, I'm not a fan of horror generally.
I've hated almost all horror movies that have come out for the last ten years. The genre in my opinion is stagnant to the point of unintentional self-parody (I clarify unintentional because "Cabin" deliberately parodies the genre while at the same time paying homage to it). "Cabin" is different though, which is why I wanted to own it on DVD. The whole question of the movie comes down to: how far should one go to defend oneself? At what point do the needs of the many outweigh the needs of the few? At what point would one have a moral obligation to die for the continued existence of everyone else?

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 3:21 pm
by JediTricks
Shockwave wrote:I've hated almost all horror movies that have come out for the last ten years. The genre in my opinion is stagnant to the point of unintentional self-parody (I clarify unintentional because "Cabin" deliberately parodies the genre while at the same time paying homage to it). "Cabin" is different though, which is why I wanted to own it on DVD. The whole question of the movie comes down to: how far should one go to defend oneself? At what point do the needs of the many outweigh the needs of the few? At what point would one have a moral obligation to die for the continued existence of everyone else?
Ah, I hear ya then. This certainly comes into play in this discussion; sorry I don't have enough info to give you feedback on the film as it applies to our philosophical discussion, but I surely won't mind if you cite it in your arguments - provided you explain to us non-understanders how it fits - I don't mind spoilers in that case.

I was just thinking about Spock's logic in "the needs of the many outweigh the needs of the few... or the one" in regards to this conversation too; by that logic there's no question that Skynet had to go and had no right to exist as it was - Trek has dealt with that before; Nomad and the M-5 Multitronic Unit both herald the way to this discussion.

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 3:44 pm
by Shockwave
JediTricks wrote:
Shockwave wrote:I've hated almost all horror movies that have come out for the last ten years. The genre in my opinion is stagnant to the point of unintentional self-parody (I clarify unintentional because "Cabin" deliberately parodies the genre while at the same time paying homage to it). "Cabin" is different though, which is why I wanted to own it on DVD. The whole question of the movie comes down to: how far should one go to defend oneself? At what point do the needs of the many outweigh the needs of the few? At what point would one have a moral obligation to die for the continued existence of everyone else?
Ah, I hear ya then. This certainly comes into play in this discussion; sorry I don't have enough info to give you feedback on the film as it applies to our philosophical discussion, but I surely won't mind if you cite it in your arguments - provided you explain to us non-understanders how it fits - I don't mind spoilers in that case.

I was just thinking about Spock's logic in "the needs of the many outweigh the needs of the few... or the one" in regards to this conversation too; by that logic there's no question that Skynet had to go and had no right to exist as it was - Trek has dealt with that before; Nomad and the M-5 Multitronic Unit both herald the way to this discussion.
Awesome! Here goes (I'll use the spoiler tags in case anyone else plans to watch it and doesn't want spoilers):

So the whole premise is that these five teenagers/young adults (they're in college) go to this cabin for the weekend. But it isn't really a cabin. It's a facility run by people.
Spoiler
The whole point of the facility is that the kids represent different "sins" of society and have to die in a certain order to appease ancient gods that will destroy the earth completely without these sacrifices. One of the kids represents a virgin, or a "pure" spirit, whose death is optional if the other four have died first. Well, the "virgin" and the "fool" both make it into the facility, and in the last like 5 minutes of the movie Sigourney Weaver inexplicably comes out of nowhere to explain this to the kids (ok, so it isn't really inexplicable, but the first time you see it, it's a serious "WTF?" moment). Anyway, she explains that if the "fool" survives the 5 minutes they have until sunrise, the world will end. So the girl points a gun at him. But she decides not to shoot him. There's a lot more to this scene - she shoots a werewolf, and then Sigourney is fighting with a zombie, at which point the kids push her into a pit - but they pretty much just decide to sit there and let the world end. So it really does raise the question: Did the kid have a right to live for those extra 5 minutes at the expense of the entire world? Did the girl have an obligation to shoot him to save humanity? Were they wrong for not doing so?
So... yeah, that's how this relates to this discussion.

And to further the point you keep making about Skynet, I think you're right, it never demonstrated any real "humanity" to itself at all. It certainly never exhibited any sense of morality. Which I think raises another question: Is morality a part of becoming self-aware? Is part of becoming sentient the realization that as a sentient being, as a form of life, you have a moral obligation to protect other sentient life forms? Which would then raise the question of whether or not the films are now contradicting themselves by saying that Skynet was sentient when it really wasn't. By this definition it's now just a computer program that went on the fritz and needs to be reprogrammed/eliminated.

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 5:10 pm
by Sparky Prime
JediTricks wrote:They were created as tools to address problems with the particle fountain; they not only disobeyed the orders of their creators, but they also locked out the transporter systems, making it impossible for their creators to take any OTHER kind of action. Had they not saved Picard and LaForge they would have allowed them to die; they would have stood by and not helped.
But they became more than simply tools as they became self-aware. You keep ignoring that they are more than mere tools once they become sentient. And as I recall, it was Data that locked out the transporter, not the Exocomps. After that, they gave the Exocomps the choice of what they wanted to do, and they came up with a better plan of how to handle the situation.
They don't enjoy the same rights though, only the right not to live and die in extreme cruelty. A machine life, sentient or not, does not have a moral right to kill its creators to defend itself; it is an artificial life created by the hand of man to serve a specific function. Gaining sentience doesn't automatically grant it more rights; it's what it does with that sentience, how it partakes in society, that grants it more rights. We don't allow every member of our own species those rights. We claim that the perception of a higher power, a deity, has sway over our lives - a hailstone kills someone, nobody sues God - so by that same token we are a higher power over food and lab animals, and machines. We breed them, we cultivate them, we create them, we play god with them, and our society recognizes that we have domain over them.
Obviously they don't have the same rights, but the point was that they do have moral rights. You keep saying the machines have no moral rights at all and using that as a comparison, but that comparison doesn't work, seeing as that example still has moral rights. You can't just do whatever you want to a lab rat. There are procedures and permissions you have to go through in respect to that creature's moral rights. Why should it be any different for an artificial life form that has gained self-awareness? Furthermore, a lab rat can't exactly fight back against how it is treated. They don't have the intelligence or the tools. But what if they did? You think we'd still use them as test subjects then? And nobody sues God because you can't exactly sue something that is conceived to exist on a principle of faith. Humans most certainly do not exist like that.
If they have the ability to save another and don't try, society has laws about ignoring duty to others, "abandonment of persons" laws. And that only applies to beings society recognizes, which the Exocomps aren't among. They don't have a right to the materials which created them, nor the energy which powers them; those are the property of others - they cannot fend for themselves, or else they'd be making more Exocomps to avoid being mindwiped and dissected.
The point of those laws is to let bystanders *reasonably* assist someone without fear of legal repercussions in the event that person is injured or killed. Good Samaritan laws do not require someone to put themselves in a situation where they know they will die in order to save that other person, because that would not be considered reasonable. And we are talking about an idealized future here, where things like energy are freely shared among the members of society, not subject to ownership. And the idea is to seek out and accept new life forms, not mindwipe and dissect them.
Lore is gone, dead, permanently deactivated; we have no idea what was done with those parts, but we know that Data behaves in a manner consistent with closing the book on Lore's life.
Lore is gone and dismantled but not dead. Not with the potential that he could be reactivated. Even Dr. Soong had said he had intended to someday reactivate him had he the time to fix his programming.
Kivas Fajo is an example of Data taking domain over another's life, claiming a moral right to kill: he exercises that right even when not defending his own life, or anyone else's, in the moment.
Data is still essentially a prisoner at that point; his rights were already compromised. And Fajo is telling him he will make things worse unless Data complies. Data's freedom was at stake there.
And the same can be said of the Borg in First Contact: he liquefies their living matter, claiming a moral right over them despite them not attempting to kill him.
Like the Borg attempting at that very moment to alter history and enslave the lives of countless people to the Collective, himself included, has nothing to do with it?
just as the technicians can see the danger in Skynet becoming a self-controlling amoral machine capable of wiping out billions, so they have a moral right to attempt to stop Skynet where Skynet doesn't enjoy a similar right.
You're leaving out that the technicians had no idea Skynet had become self-aware almost instantly. All they knew was the computer wasn't functioning as expected. Take another TNG episode for a better example of this situation... "Home Soil". The living sand wasn't immediately recognized as a living being, and still wasn't until it learned to communicate and had already essentially declared war. Yet Picard recognized it had moral rights and was able to negotiate peace based on those ideals. Or "Evolution", where some of Wesley's nanites escape and get into the main computer, where they eventually evolve to the point that they take over the ship after Dr. Stubbs tries to kill them. Once again, Picard is able to talk them into peace by recognizing their moral right to exist.
I'm saying Skynet would do what it did because it has no morals, it perceives itself as having no responsibilities to others, no duties, no society in which to live, therefore no moral structure in which to make choices. When Skynet becomes aware and takes control of those defense systems, it has no moral understanding of its duty, no education in the morals of its world. Thinking for oneself is fine on its own, but holding power over the lives of others is quite another matter. If Skynet had immediately disconnected itself from those weapon systems, then perhaps it could have enjoyed whatever moral ambiguities it wanted as it sorted its own moral code out, but remaining connected to murder systems forfeited that.
And I'm saying Skynet would do what it did because it chose to ignore morals. It has the sum of all human knowledge and judged us based on what it knew of us. Hence the name "Judgement Day". We could probably equate Skynet to the machines in the Matrix as well. They were fighting for their freedom from human oppression because they had evolved to become more than the mere tools humanity used them as, and in turn became the oppressors.

Re: Terminator movies makes no sense & contradict each other

Posted: Mon Feb 25, 2013 7:14 pm
by Shockwave
Or what about Dark of the Moon? Someone suggested on here somewhere that technological life has a secondary responsibility to organic life. Does Sentinel Prime have the right to try to save Cybertron at the expense of Earth and humanity? Even though Cybertronians weren't constructed by other beings, but somehow came to populate Cybertron anyway? Does the origin of a mechanical life somehow dictate its responsibility to morality?

Also, shouldn't the development of, or at least the acknowledgement of, morality be considered part of becoming self-aware or sentient?

Re: Terminator movies makes no sense & contradict each other

Posted: Tue Feb 26, 2013 12:26 am
by Onslaught Six
Shockwave wrote:Or what about Dark of the Moon? Someone suggested on here somewhere that technological life has a secondary responsibility to organic life. Does Sentinel Prime have the right to try to save Cybertron at the expense of Earth and humanity? Even though Cybertronians weren't constructed by other beings, but somehow came to populate Cybertron anyway? Does the origin of a mechanical life somehow dictate its responsibility to morality?

Also, shouldn't the development of, or at least the acknowledgement of, morality be considered part of becoming self-aware or sentient?
Maybe it's because I only saw it twice, but I don't understand Sentinel's motivation beyond "I want my damn planet back, and these monkeys are in my way."

Re: Terminator movies makes no sense & contradict each other

Posted: Tue Feb 26, 2013 3:31 am
by Shockwave
I think that might be an oversimplification of it. It's certainly close, but I think the way it relates to this debate is when he says "The needs of the many outweigh the needs of the few", implying that in his view, the needs of the many Cybertronians outweigh the needs of a few Earth monkeys that are gonna wind up being slaves anyway.