
So for me, it's not about being reductionist, but about not anthropomorphizing or using words which may suggest an inappropriate ethical or moral dimension to interactions with a piece of software.


I'm the last to stand in the way of more precise terminology! Any ideas for "lying to a moral non-entity"? :)

“Lying” traditionally requires only belief capacity on the receiver’s side, not qualia/subjective experiences. In other words, it makes sense to talk about lying even to p-zombies.

I think it does make sense to attribute some belief capacity to (the entity role-played by) an advanced LLM.


I think just be specific: a suicidal sixteen-year-old was able to discuss methods of killing himself with an LLM by prompting it to role-play a fictional scenario.

No need to say he "lied" and then use an analogy of him lying to a human being, as did the comment I originally objected to.




