
more like an artefact of the inability to lie than a hallucination


No analogy needed. It's actually because "Yes it exists" is a linguistically valid sentence and each word is statistically likely to follow the words before it.

LLMs produce linguistically valid texts, not factually correct texts. They are probability functions, not librarians.


Those are not two different things. A transistor is a probability function but we do pretty well pretending it's discrete.

Transistors at the quantum level are probability functions, just like everything else is. And just like everything else, at the macro level the overall behavior follows a predictable, known pattern.

LLMs have nondeterministic properties intrinsic to their macro behaviour. If you've ever tweaked the "temperature" of an LLM, that's what you are tweaking.


Temperature is a property of the sampler, which isn't strictly speaking part of the LLM, though they co-evolve.
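To make the temperature knob concrete, here's a minimal sketch of what a typical sampler does with it, assuming standard softmax sampling over the model's output logits (the function name is illustrative, not from any particular library):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    # Divide logits by temperature before the softmax:
    # T < 1 sharpens the distribution (more deterministic),
    # T > 1 flattens it (more random).
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

cold = softmax_with_temperature([2.0, 1.0, 0.5], temperature=0.1)
hot = softmax_with_temperature([2.0, 1.0, 0.5], temperature=10.0)
```

At low temperature nearly all probability mass lands on the top logit; at high temperature the distribution approaches uniform. The model itself just emits the logits; the randomness lives entirely in this sampling step.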

LLMs are afaik usually evaluated nondeterministically because they use floating-point arithmetic and nobody wants to bother perfectly synchronizing the order of operations, but you can do that.
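The order-of-operations point comes down to floating-point addition not being associative, so a parallel reduction that sums the same values in a different order can round differently. A tiny illustration:

```python
# Floating-point addition is not associative: summing the same
# values in a different order can give a different result. Parallel
# reductions on GPUs reorder sums nondeterministically, which is one
# source of run-to-run variation in LLM inference.
a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c   # 0.6000000000000001
right = a + (b + c)  # 0.6
print(left == right)  # False: the two orders round differently
```

Pin the reduction order (or run everything serially) and the output becomes bit-for-bit reproducible, at a performance cost.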

Or you can do the opposite: https://github.com/EGjoni/DRUGS


this was no analogy, it really can't lie...


