
There's a gut feeling that comes from having gotten your hands dirty enough that tells you if the LLM is being smart or spitting out bullshit.


The main issue I have with LLM-generated solutions is that LLMs never seem to know about Occam's Razor.

Their solutions usually benefit from some simplification.



