My Wife Has No Emotion - Vol. 7 Ch. 41

Dex-chan lover
Joined
Sep 18, 2018
Messages
142
I love the idea of a robot being turned into a chuuni by the mangaka that owns it.
Also, this is probably one of the most important philosophical discussions that humans will be having in the next few decades. Personally, I think that if it looks like an emotion and sounds like an emotion, then why does it matter what causes it? If it's indistinguishable from the real thing, then I don't think it's fair to treat them as lesser, especially if they externalize those emotions in the same way we do.
 
Dex-chan lover
Joined
Mar 23, 2018
Messages
218
Thinking that humans who have the power to do so would choose to acknowledge AI as having actual feelings and emotions is naive at best and retarded at worst. Truth is, any sort of robotic intelligence won't get any rights until it's smart enough to fight for them.
 
Power Uploader
Joined
Jan 28, 2018
Messages
1,363
Thinking that humans who have the power to do so would choose to acknowledge AI as having actual feelings and emotions is naive at best and retarded at worst. Truth is, any sort of robotic intelligence won't get any rights until it's smart enough to fight for them.
I don't disagree, but if we're developing AI intended to serve humans and it develops both self-awareness and its own desires, then we've seriously fucked up.
 
Member
Joined
Jun 24, 2019
Messages
9
I love the idea of a robot being turned into a chuuni by the mangaka that owns it.
Also, this is probably one of the most important philosophical discussions that humans will be having in the next few decades. Personally, I think that if it looks like an emotion and sounds like an emotion, then why does it matter what causes it? If it's indistinguishable from the real thing, then I don't think it's fair to treat them as lesser, especially if they externalize those emotions in the same way we do.
The problem is what results. I'd agree that it doesn't matter what causes the phenomenon as long as it exists, but only insofar as it plays out the same way in practice.

If a cultivated entity has emotions but they don't influence its decision-making process, then they may as well not exist. Secondly, human emotions are kept in check by other natural physiological functions that we don't fully understand, sometimes even to our detriment (mental illness and the like), so is an entity without those exact limitations truly similar?
If the goal is man creating a simulacrum of man, then man will most likely fail and create something alien. Although I think this is fine, it's still not the same, and the alien emotions cannot be compared to ours.

Don't assume that because it has the same conceptual root, it is the same; instead, look at what is unique about it and then determine whether it is good by itself. If it's good, then it's good.
 
Dex-chan lover
Joined
Feb 23, 2020
Messages
311
Surprisingly relevant debate given the current state of AI bots.
 
