From: Marc LeBrun <mlb(a)well.com>
> Don't recall this getting discussed here: https://arxiv.org/pdf/1912.01412.pdf
Amazing results. My only gripe is that it's not clear how much processing
time the neural net used, whereas they gave, e.g., Mathematica 30 seconds.
> Interestingly treats symbolic integration as a kind of linguistic translation.
Well, they treat it the same way people treat linguistic translation:
with a sequence-in, sequence-out interface. Here they encode both the
problems and the solutions as forward Polish (prefix) strings.
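
Roughly what that encoding looks like, using a toy tokenizer of my own
(not the paper's exact vocabulary; a real encoder would also fix operator
arities so the string is unambiguous):

# Toy prefix ("forward Polish") encoding of an expression tree,
# in the spirit of the paper's seq2seq input/output format.
import sympy as sp

def to_prefix(expr):
    """Walk a SymPy expression tree and emit tokens in prefix order."""
    if expr.is_Symbol or expr.is_Integer:
        return [str(expr)]
    head = type(expr).__name__          # e.g. 'Add', 'Mul', 'Pow', 'sin'
    tokens = [head]
    for arg in expr.args:
        tokens += to_prefix(arg)
    return tokens

x = sp.symbols('x')
print(to_prefix(sp.sin(x**2) + 1))
# e.g. ['Add', '1', 'sin', 'Pow', 'x', '2'] (argument order is up to SymPy)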
If you think of a neural net as something like a matrix multiplication,
you can do a state transition that folds in an input and/or an output in
a single step.
It's actually quite a bit like the different transforms of an
IFS fractal. Around Y2K a friend of mine was pushing in and reading out
balanced-parenthesis strings (known to be not-absolutely-trivial) using
a hand-designed 2D IFS, and at the time I thought it was pointless.
https://simondlevy.academic.wlu.edu/files/publications/bics2004.pdf
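
To make that concrete, here's a toy push-in/read-out sketch along those
lines (my own hand-designed maps, not the construction in Levy's paper):
each symbol gets its own affine contraction of the unit square, pushing a
symbol applies its map (the input folded into the state transition), and
reading back out means checking which map's image the current point lies
in and inverting that map (the output).

# Toy 2D IFS stack: each symbol has an affine contraction of the unit square.
# Pushing a symbol applies its map; popping inverts the map whose image
# contains the point. Maps and layout here are mine, not Levy's.
import numpy as np

# Symbol -> (scale, offset): p' = scale * p + offset.
# '(' maps into the left half (in x), ')' into the right half of [0,1]^2.
MAPS = {
    '(': (np.array([0.5, 0.5]), np.array([0.0, 0.25])),
    ')': (np.array([0.5, 0.5]), np.array([0.5, 0.25])),
}

def push(point, sym):
    scale, offset = MAPS[sym]
    return scale * point + offset

def pop(point):
    # Decide which map produced this point (by which half it's in), invert it.
    sym = '(' if point[0] < 0.5 else ')'
    scale, offset = MAPS[sym]
    return (point - offset) / scale, sym

# Push a string in, then read it back out in reverse (LIFO, like a stack).
start = np.array([0.5, 0.5])
p = start
for c in '(()())':
    p = push(p, c)
out = []
while not np.allclose(p, start):
    p, sym = pop(p)
    out.append(sym)
print(''.join(reversed(out)))   # '(()())'
# Double precision limits this toy to ~50 pushes; the point is just that a
# fixed-size continuous state can behave like a stack at all.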
> Leverages the cost asymmetry "trap door" versus differentiation when training.
Do you mean the way they generate problem-solution pairs by starting with
the solution and differentiating to get the problem? (I don't remember
which sage said that intelligence amounts to inverting functions.)
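
If so, a toy version of that backward generation in SymPy (my own crude
sampler, not the paper's generator) looks like this; differentiating is
the cheap direction, so you get as many (hard problem, known answer)
pairs as you want:

# Backward data generation: sample a random expression F, differentiate it,
# and use (F', F) as an (integration problem, solution) pair.
import random
import sympy as sp

x = sp.symbols('x')
LEAVES = [x, sp.Integer(2), sp.Integer(3)]
UNARY = [sp.sin, sp.cos, sp.exp, sp.log]
BINARY = [sp.Add, sp.Mul]

def random_expr(depth):
    # Crude recursive sampler; a real generator would filter out constants
    # and other degenerate cases.
    if depth == 0 or random.random() < 0.3:
        return random.choice(LEAVES)
    if random.random() < 0.5:
        return random.choice(UNARY)(random_expr(depth - 1))
    op = random.choice(BINARY)
    return op(random_expr(depth - 1), random_expr(depth - 1))

solution = random_expr(3)                # the "answer", F
problem = sp.diff(solution, x)           # differentiating is the cheap direction
print('integrate:', problem)
print('answer   :', solution)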
--Steve