@gronkle—
if you are sure that conscious beings as a class cannot be moral agents
What was at issue wasn't what I believe with certainty, but the basis on which some are certain that robots cannot have consciousness. Typically, that basis is exactly the same as the basis on which some believe that the apparent consciousness of human beings doesn't have the character that others impute to it.
Your earlier claim was
Robots do not have moral agency, and are not entitled to moral consideration.
And, as I said, this begged a central question. Now you write
Her *actual* moral agency is always up for debate, because she is not alive.
Even with this far weaker claim, you are still at least tolerating a begging of the question. Again, there are people who will insist that the distinction between the things we classify as biological and those we classify as machines isn't an objectively essential one, that organisms and machines are reducible to the same fundamental components and processes. And there will be those who insist that robots may have all the relevant properties of whatever is meant by “alive”, and that Ponko is to that extent alive.
In any case, whatever may be the real-world limitations of electronic artificial intelligence, the intention is that the audience see Ponko as a person, even if one sees it as comedic that personhood should arise in such a form.
The fact that an old grumpy man gives her that agency despite that is a central theme.
More accurately put, the theme is that he increasingly adopts such an imputation. (There are many analogous works of science fiction in which one of two lifeforms increasingly adopts such an imputation about the other. And there is no small number of works outside of science fiction about persons of one ethnic group increasingly imputing personhood to a member of another.) But that point is in no sense a refutation of my earlier remarks, nor otherwise a substantiation of your initial claim that I've somehow misunderstood this story.