Hacker News

Interesting, what about the old proof that neural networks can't model arbitrary-length sine waves?




I don't know that computers can model arbitrary-length sine waves either, at least not in the sense of being able to input any `x` and get `sin(x)` back out. All computers have finite memory, so they can only represent finitely many numbers, and the representable ones get sparser as they grow: past some magnitude, the gap between adjacent representable values of `x` is wider than a whole period of sine, so `sin(x)` can't be evaluated meaningfully there.

Neural networks are more limited, of course, because there's no way to expand their equivalent of memory, whereas it's easy to expand a computer's memory.
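The representability limit above is easy to check directly with IEEE 754 doubles, the usual `float` type (a small demo, not anything specific to neural nets):

```python
import math

# Float64 has 53 bits of mantissa: above 2**53 the gap between
# adjacent representable numbers is at least 2, and it keeps
# doubling as the magnitude grows.
x = 2.0 ** 53
print(x + 1 == x)  # True: x + 1 rounds back to x

# By 2**60 the spacing between neighbouring floats is 256, far more
# than one full period of sine, so you literally cannot ask for
# sin(x) at a resolution finer than many cycles at once.
# (math.ulp requires Python 3.9+.)
print(math.ulp(2.0 ** 60) > 2 * math.pi)  # True
```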


Here's the paper, for your interest:

https://arxiv.org/abs/2006.08195


That proof only applies to fixed-architecture feed-forward multilayer perceptrons with no recurrence, iirc. Transformers are not that.
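For the ReLU case of that class, the limitation is easy to see concretely: a feed-forward ReLU network computes a piecewise-linear function with finitely many pieces, so beyond its last breakpoint it extrapolates along one straight line and cannot keep oscillating like sine. A minimal sketch in plain Python (the weights are arbitrary made-up values, not a trained model):

```python
def relu(z):
    return max(0.0, z)

def mlp(x, w1, b1, w2, b2):
    # one hidden layer of ReLUs, then a linear readout
    return sum(w2[i] * relu(w1[i] * x + b1[i]) for i in range(len(w1))) + b2

# Arbitrary illustrative weights: three hidden units with
# breakpoints at x = 0, x = 1, and x = 1.5.
w1 = [1.0, -1.0, 2.0]
b1 = [0.0, 1.0, -3.0]
w2 = [0.5, -0.25, 1.0]
b2 = 0.1

# Past every breakpoint, each ReLU is either fully on or fully off,
# so the network is affine: the slope is the same everywhere out there.
slope_a = mlp(101.0, w1, b1, w2, b2) - mlp(100.0, w1, b1, w2, b2)
slope_b = mlp(1001.0, w1, b1, w2, b2) - mlp(1000.0, w1, b1, w2, b2)
print(abs(slope_a - slope_b) < 1e-9)  # True: purely linear extrapolation
```

Adding more layers or units only increases the number of linear pieces; it stays finite, so some final piece always extends to infinity.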


