Reduce the parameter count. Increase the acceptable error.

The more parameters in a curve fit, the better the fit will be, but the compute cost rises too.
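
A rough sketch of that trade-off (assumes numpy; toy data, not from the thread): fitting the same noisy curve with progressively fewer polynomial parameters raises the error but shrinks the work per fit.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 100)
    y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(100)

    for degree in (9, 3, 1):                   # fewer parameters each pass
        coeffs = np.polyfit(x, y, degree)      # degree + 1 parameters
        resid = y - np.polyval(coeffs, x)
        print(degree + 1, "params, RMSE:", np.sqrt(np.mean(resid ** 2)))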



Is that possible? I thought parameter counts were fixed in the model.


There are always ways to trade precision for speed in statistical models running on computers.
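
One common example is quantization. Here's a minimal sketch (assuming numpy; a toy weight matrix, not any vendor's actual setup) of rounding float32 weights to int8, which cuts memory 4x and speeds up the arithmetic at some cost in accuracy:

    import numpy as np

    w = np.random.default_rng(1).standard_normal((4, 4)).astype(np.float32)

    scale = np.abs(w).max() / 127.0                 # symmetric per-tensor scale
    w_int8 = np.round(w / scale).astype(np.int8)    # 4x smaller representation
    w_deq = w_int8.astype(np.float32) * scale       # approximate reconstruction

    print("max abs error:", np.abs(w - w_deq).max())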


Sure, generally speaking. Is that true for static, fixed-parameter-count LLMs like GPT-4?

I think you're hand-waving a lot just to claim that OpenAI is (somehow) reducing the accuracy of its models during high load. And I'm not sure why.



