Why wouldn't they have the right to choose between sacrificing themselves and saving themselves? What consequences would they face for choosing not to get themselves killed? As sentient life, they would have that right.

JediTricks wrote:
Just because the Exocomps did the same thing doesn't mean they had the right to; had they not come up with a better plan to save Picard and LaForge, they'd have suffered the consequences for those actions. Hence they didn't assert the right to protect themselves by killing others. They especially didn't assert that right over the entirety of all living things.
You seem to be getting away from the point, which was that a sentient life form has a moral right to defend itself, even over the people who built it. I agree, there are limits to that; I never said there weren't. I have already agreed Skynet crossed the line, but that doesn't mean it didn't have some moral rights as a sentient form of life. Even for lab animals, animals used for food, or enemies in a battle, there are laws that give them rights.

I would argue that no life has unlimited moral right to self-protection; there are limits. I would further argue that we as a society hold final say over what lifeforms enjoy what level of limitations on those rights, and we do so every time we do animal testing, eat carnivorously, or kill an enemy in battle.
Why would they face any consequences? If the Exocomps were in fact sentient, then the crew had no right to force them to sacrifice themselves without a choice. There is no justification for punishing someone who chooses not to get killed to save someone else.

I absolutely am looking at the other side of the story, but if they had let Picard and LaForge die to protect themselves, they would not have continued to enjoy any level of rights at all; they would have faced severe consequences - sentient or not.
I'd still argue Data did not kill Lore, if only on the technicality that he's an android who could be reactivated.

Ok, I used hyperbole, there ya have it. He still killed his brother; he said goodbye and made it impossible for Lore to reactivate himself. And what about his sitting in judgement over Kivas Fajo?
And what about Kivas Fajo? He was an admitted thief and murderer who told Data he wouldn't stop. On top of that, he was trying to force Data to give up his own rights as a sentient being. Data had the right to defend those rights, as well as a duty as a Starfleet officer to stop an admitted criminal. Had he known the Enterprise had caught on by then, I'm sure he would have acted accordingly and taken Fajo into custody, but at the time he clearly felt the only means of stopping him was to kill him.
Would you argue a monkey doesn't have a right to defend itself from a tiger that's trying to kill it? No, of course not. You're taking the argument out of context with this extreme, and you aren't separating what Skynet is from what it has done. How is Skynet incapable of moral choice? Being a sentient computer means it can think for itself and make its own choices. That means it could make a moral choice if it wanted to; it instead chose to be a killing machine. And as Optimus would say, "Freedom is the right of all sentient beings," so being a sentient computer would give it certain rights. But, again, Skynet clearly crossed the line with the choices it made, crossing into major war-criminal-type crimes.

Then I guess we as a society are going to have to answer for a shitload of dead monkeys, rats, cows, lambs, enemy soldiers, rapists, treasonous conspirators, and so forth. Skynet is a machine incapable of moral choice; it is an immoral killing machine. It does not have a moral right to defend itself over the lives of those who created it, and there are no societal protections it can enjoy in that realm.

