Hacker News

I'm honestly confused as to why it is doing this and why it thinks I'm right when I tell it that it is incorrect.

I've tried asking it for factual information, and while it sometimes asserts that something is incorrect, it will definitely hallucinate on questions like the above.

You'd think the reasoning would nail that; most of the chain-of-thought systems I've worked on would have fixed this by asking the model whether the resulting answer was correct.
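For what it's worth, the self-check step I'm describing is roughly this: after generating an answer, ask the model a second time whether that answer is correct, and regenerate if it says no. A minimal sketch, where `ask_model` is a hypothetical stand-in for whatever LLM call you use (stubbed here so the loop actually runs):

```python
def ask_model(prompt: str) -> str:
    # Hypothetical LLM call; stubbed for illustration only.
    if prompt.startswith("Is the following answer correct"):
        return "yes"
    return "Paris"


def answer_with_self_check(question: str, max_retries: int = 2) -> str:
    """Generate an answer, then ask the model to verify its own answer."""
    answer = ask_model(question)
    for _ in range(max_retries):
        verdict = ask_model(
            "Is the following answer correct? Answer yes or no.\n"
            f"Q: {question}\nA: {answer}"
        )
        if verdict.strip().lower().startswith("yes"):
            return answer
        # Model flagged its own answer; regenerate and check again.
        answer = ask_model(question)
    return answer


print(answer_with_self_check("What is the capital of France?"))
```

Of course, this only helps when the model's verification pass is more reliable than its generation pass, which is exactly what seems to be failing here.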




