
Why wouldn't it? I have yet to hear one convincing argument for how our brain isn't working as a function of the most probable next best action. When you look at how amoebas work, then at animals that sit somewhere between them and us in intelligence, and then at us, you see a progression very similar to the one current LLMs show: from almost no model of the world to a fairly solid one.

