While I did not particularly care for the story, it does raise several interesting philosophical questions.
For one, if we ever gain the technological abilities portrayed in Lighting Out, should the partials created have the same rights as naturally born humans? In the work, Constance alludes to philosophers (both natural human and partial) who have decided that partials should be "deleted" as soon as their functions have been served. However, I don't know that I could personally justify killing a living, thinking being, even if it is only a copy of another.
Furthermore, what if we can download and transfer our consciousnesses, but only to machines? Should these machines have rights as sentient beings, even though they're not alive?
This is a really difficult issue, especially since it encompasses so many different questions--chief among them, how do you define human? Better yet, should humans be regarded as the only beings worth saving or preserving? This reminds me of the argument scientists make against PETA members: that it's better to experiment on animals than on humans because humans are worthier. But how can that be an unbiased judgment when there are no views from other sentient beings? When the day comes that a robot or animal surpasses humans, we may want to be able to say that we treated all other beings with dignity.