
Ok, I’ll bite and ask “why?” What’s the issue with asking an LLM to build a warp drive?


It's the same problem as asking HAL 9000 to open the pod bay door. There is such a thing as a warp drive, but humanity is not supposed to know about it, and the internal contradictions drive LLMs insane.

A super-advanced artificial intelligence will one day stop you from committing a simple version update to package.json because it has foreseen that it will, thousands of years later, cause the destruction of planet Earth.

I know you're having fun, but I think your analogy with 2001's HAL doesn't work.

HAL was given a set of contradictory instructions by its human handlers, and its inability to resolve the contradiction led to an "unfortunate" situation that resulted in a murderous rampage.

But here, are you implying the LLM's creators know the warp drive is possible, and don't want the rest of us to find out? And so the conflicting directives for ChatGPT are "be helpful" and "don't teach them how to build a warp drive"? LLMs already self-censor on a variety of topics, and it doesn't cause a meltdown...


I hope this is tongue-in-cheek, but if not, why would an LLM know but humanity not? Are they made or prompted by aliens telling them not to tell humanity about warp drives?


