>> Convolutional networks don't actually seem to be a totally bad approximation
of feedforward visual processes in the brain at a computational level. And I
say this as a neuroscientist.
I'm saying that a "(not) totally bad approximation" can still be bad enough
that it's completely useless. And even a good one can.
The problem is that you can model pretty much anything with an analogy that's
broad enough. Say, you can fit a straight line to data from any distribution...
except you will often not learn anything you didn't know before. Or maybe
you'll learn a lot about a whole class of distributions, but not about a
specific distribution, or how it differs from all the others in its class.
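To make the straight-line point concrete, here's a throwaway sketch (my own toy example, not from the original): least squares will happily fit a line to data generated by a sine wave, and report a slope, but that fit tells you nothing about the structure that actually produced the data.

```python
import numpy as np

# Data generated by a sine wave plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(200)

# The line "fits" in the least-squares sense, no matter what the data is.
slope, intercept = np.polyfit(x, y, 1)
residual = y - (slope * x + intercept)

# The slope is near zero and the residuals carry essentially all the
# structure: the model was broad enough to fit, and taught us nothing.
print(f"slope={slope:.3f}, residual std={residual.std():.3f}")
```

The fit succeeds in the trivial sense that a best line always exists; everything interesting about the sine wave ends up in the residuals.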
If you've read Foucault's Pendulum, you'll know it basically makes this point
about the Tree of Life [1]. The characters are always "fitting" it to
everything from pinball machines, to cars, to people's sex organs. And it
always fits so well! Maybe that's because it's truly divine?
Come to that, why are ANNs based "on the brain" and not the Tree of Life? They
sure look a lot like it, superficially. Maybe ANNs are really based on the
Mystic Qabbalah, Geoff Hinton is a Rosicrucian, and it's all a scheme of the
Illuminati. CNNs are probably a model of The Eye on the Pyramid. It all fits,
innit! That way lies madness, or at least a whole lot of confusion and wasted
time.
You mentioned computation: look at Turing machines for an example of a good
model. It's an abstraction broad enough to represent a whole class of
computational devices, yet at the same time it represents those devices and
nothing else. You can't mistake a Turing machine for a potato, or a cricket,
fnord, what have you. Why can't we have that sort of thing, instead of
"Neural Networks"?
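Part of why the Turing machine is such a sharp model is that its whole definition fits in a few lines: a tape, a head, a state, and a transition table. Here's a minimal sketch (the bit-flipping machine is my own toy example); nothing about it invites confusion with anything that isn't a computational device.

```python
def run_tm(tape, transitions, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    transitions maps (state, read_symbol) -> (next_state, write_symbol, move),
    where move is "L" or "R". The machine stops in the "halt" state.
    """
    cells = dict(enumerate(tape))  # sparse tape, blank everywhere else
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A machine that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm("0110", flip))  # prints 1001
```

The entire "model" is the transition table: it represents exactly the class of things that can be described that way, and nothing else.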
____________
[1] https://en.wikipedia.org/wiki/Tree_of_life#Kabbalah