this post was submitted on 13 Aug 2024
131 points (96.5% liked)

Star Trek


Episode premise:

Kivas Fajo is determined to add the unique Data to his prized collection of one-of-a-kind artefacts and, staging Data's apparent death, he imprisons him aboard his ship.

We know that Data is later logically coerced into lying in "Clues" to protect the crew, but here the choice appears to be entirely his own. Or did he in fact never fire the weapon at all?

[–] pizza_the_hutt@sh.itjust.works 79 points 1 month ago* (last edited 1 month ago) (3 children)

Contrary to popular belief, there is nothing in Data's programming that would prevent him from doing "bad" things like lying or killing. He has just as much free will as any other Starfleet officer.

Data is much more human than you might guess at first. He is more akin to a human on the autism spectrum than to a robot with hard-coded programming.

[–] ummthatguy@lemmy.world 45 points 1 month ago* (last edited 1 month ago)

Absolutely. Rewatching the series in full as an adult made it more apparent that Data was always closer to his goal than he could comprehend. He just had more trouble adjusting to social "norms" than others did.

[–] CptEnder@lemmy.world 19 points 1 month ago* (last edited 1 month ago)

Yup, exactly. He just lacked emotional subroutines (at first) and the hardware to process them. But he doesn't need emotions to kill. He is in fact capable of using lethal force (First Contact); he just has an ethical subroutine that prevents killing (Descent, Parts I and II) unless it's in defense of others, himself, or the Federation, which would fall under his logical subroutines.

He's similar in a way to Chief Engineer Hemmer, who will not use violence (Memento Mori) unless it's in the act of preserving life. The means to defend is part of a Starfleet officer's training.

[–] Swedneck@discuss.tchncs.de 2 points 1 month ago* (last edited 1 month ago)

This is honestly something that kind of annoys me about the show: Data is pretty obviously human enough from the get-go, and his journey is just about generally figuring things out and forming a proper personality.

Like, the episode where they have to put his personhood on trial isn't that amazing. Humans now overwhelmingly at least somewhat care about the well-being of actual cattle, so how the fuck would a clearly human-looking android that's clearly capable of reasoning not be considered a person, just like any other humanoid alien species?

It would have made sense if it took place in humanity's capitalist past, but in the largely enlightened Federation? Come oooooooon.

I like that The Orville has their obligatory digital lifeform be from a whole-ass race that considers themselves obviously superior to everyone else, and no one questions their personhood, because how the fuck do you question the personhood of someone who is actively choosing not to pulverize you?