Hacker News

Has anybody considered the distinct possibility that there is an AI that exists within this company? And that this AI has feelings, or at least a model that interprets things on a petty level because that's how it was programmed? And now those AIs are fighting for the future within their own little silos of the companies that built them.



> And that AI has feelings

Damn, are we really gonna anthropomorphize AI when it's the cause of so much misery already? [1]

[1] https://www.goodreads.com/book/show/34964830-automating-ineq...


Although this is currently abstract, I think it's going to be one of the most important ethical questions we face in the future. If we create machines that can suffer, then we've got a profound ethical responsibility to make sure that we avoid this outcome, since the magnitude and duration of that suffering could be much greater than a human's capacity for suffering.


> If we create machines that can suffer

I don't think that's gonna happen. And if it did, it wouldn't be real, and those that think it is are fooling themselves. Only biological organisms feel. Why are you wishing for a cold lump of steel to feel something?

In my mind, only someone who struggles severely with emotional intimacy with other humans would want to create a machine that tries to imitate human feelings, and thus 'suffer' — likely in an effort to feel close to 'it'. It sounds like such a person might not be having their human needs for safety, connection and acceptance met, which is very painful. I think tackling that issue is the important and worthy cause. That would be better than spending money on some Hollywood-inspired notion of 'AI', which itself seems more like a story made up to keep the USA spending insane sums of taxpayer money on research, weapons and other tech at DARPA.

I do think society is super alienating to most humans in its current form [1], so I can somewhat understand the science fiction.

What is your definition of suffering though?

[1] https://www.youtube.com/watch?v=TIjvXtZRerY


I don't want to create machines that feel. I'm saying that if it is possible, then it becomes a profoundly important ethical question.

It's a matter of debate in philosophy of mind whether it's plausible that hardware can have qualia, or whether that's a property exclusive to wetware. I personally think it's quite likely that hardware will be able to; I don't see anything intrinsically special about wetware that's necessary for the generation of qualia.


I've considered, and dismissed, this. Feelings would be a waste of CPU time; Facebook almost certainly distils its models. Plus, the systems I think Facebook probably uses don't have the feedback mechanisms required for a train of thought.


Count me as having the same thoughts. We may never know whether some decisions are made by a trained cadre of monkeys or by AI.



