Keep in mind that many (most? all?) LLMs are not trained strictly on factual data, and are probably not trained to differentiate between factual and non-factual information. If you ask an absurd question, you will likely get a (delightfully) absurd answer. If you ask a question somewhere on the border between reality and fiction... results may vary.
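
If you want to see this for yourself, here's a minimal sketch that probes a model with a factual, an absurd, and a borderline question. It assumes the OpenAI Python client and an API key in the environment; the model name and prompts are my own illustrations, not anything from the parent comment:

    # Minimal sketch, assuming the OpenAI Python client is installed
    # and OPENAI_API_KEY is set; model name and prompts are illustrative.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    prompts = [
        "What is the boiling point of water at sea level?",  # factual
        "What did Napoleon tweet after Waterloo?",           # absurd
        "Did Tesla and Edison ever meet in person?",         # borderline
    ]

    for p in prompts:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": p}],
        )
        print(p, "->", resp.choices[0].message.content)

In my experience the absurd prompt is where behavior diverges most: some models play along with the premise, others flag it as fictional, and the same model may do either depending on phrasing.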


Consider applying for YC's Winter 2026 batch! Applications are open till Nov 10

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact