
I'm directly responding to a comment discussing the popular perception that we, as a society, are "steps away" from AGI. It sounds like you agree that we aren't anywhere close to AGI. If you want to discuss the potential for LLMs to disrupt the economy, there's definitely space for that discussion, but that isn't the comment I was making.


Whether we should call what LLMs do “knowing” isn’t really relevant to how far away we are from AGI. What matters is what they can actually do, and they can clearly do at least some things that we would call knowledge if a human did them. So I think this is just humans wanting to feel we’re special.


>they can clearly do at least some things that we would call knowledge if a human did them

Hard disagree. LLMs merely present the illusion of knowledge to the casual observer. A trivial cross-examination is usually sufficient to pull back the curtain.



