I'm trying to figure out why the person making the post failed. There are existing Haskell libraries for automatic differentiation [1].
Unlike ordinary symbolic differentiation or numerical differentiation, automatic differentiation takes the code of a function together with a value and computes the derivative of the function at that value, at run time.
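For a scalar function that's essentially a one-liner with the `ad` package from [1]; a minimal sketch (the example function `f` is mine, not the author's):

import Numeric.AD (diff)

-- an ordinary Haskell function, written with no knowledge of AD
f :: Num a => a -> a
f x = x^3 + 2*x

-- diff f evaluates the derivative of f at a point: 3*5^2 + 2 = 77
main :: IO ()
main = print (diff f 5)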
Of course, the author is working with a vector library, so his version has to take N-dimensional vectors/tensors and functions of these and return higher-dimensional vectors/tensors. That part is kind of head-spinning, and I assume that's where things fell apart.
Well, if you look at the categorical semantics of differentiation, it takes in a function A -> B and spits out a function A x A -> B. It's a fairly well developed area of logic/type theory (even at the level of term rewriting systems); it seems the author of the blog post missed it. I probably would have emailed him some references if I'd seen the initial announcement...
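Concretely, the extra A is the direction the derivative is taken in: D[f] (x, dx) says how f changes at x along dx. A hand-rolled sketch of that reading for one concrete function (the names D, square, dSquare are mine, just to fix ideas):

-- D[f] : A x A -> B from the categorical reading above
type D a b = (a, a) -> b

square :: Num a => a -> a
square x = x * x

-- D[square] (x, dx) = 2*x*dx, the directional derivative of square at x along dx
dSquare :: Num a => D a a
dSquare (x, dx) = 2 * x * dx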
Because this was intended to support `accelerate` and uses the variable-mode AD approach presented in 'The Simple Essence of Automatic Differentiation', which AFAIK has not been implemented in conjunction with `accelerate`.
(Your comment reads as quite uncharitable/mean, btw; hard to tell if that was intentional.)
My HP 48G(X) from the 1990s did automatic symbolic differentiation and integration. And, IIRC, there's a textbook in either Fortran or C on symbolic mathematical evaluation (partial evaluation, simplification, and Monte Carlo fallback) and transformations. We had to deal with this a lot in the code for a nuclear reactor simulator (in Fortran) that effectively did glorified finite element analysis, with some actual direct calculations here and there.
pow x 0 = 1
pow x n = x * pow x (n-1)
ergo
pow' x 0 = 0
pow' x n = pow x (n-1) + x * pow' x (n-1)
ergo
pow' x 1 = pow x 0 + x * pow' x 0
= 1 + x * 0
= 1
pow' x 2 = pow x 1 + x * pow' x 1
= x + x
= 2 * x
pow' x 3 = pow x 2 + x * pow' x 2
= x * x + 2 * x * x
= 3 * x * x
[1] http://hackage.haskell.org/package/ad