[math-fun] "Transformer model" machine learning tool outperforms Mathematica on integrals and diffeqs
This is the same kind of machine learning system that's been in the news lately for doing so much better at writing comprehensible English.

https://openreview.net/pdf?id=S1eZYeHFDS

--
Mike Stay - metaweta@gmail.com
http://math.ucr.edu/~mike
https://reperiendi.wordpress.com
Section 3. What is the motivation for this particular function space? Where should we expect to find these types of functions in the wild? Unless there are good answers to these questions, wouldn't an experiment on holonomic / D-finite functions generate more interest? Or what about hypergeometric functions?

Section 4. Despite the development in Section 3 and Appendix B, Sections 4.1-4.2 do not give a straightforward estimate, as a percentage, of the ratio of training sample size to total sample size (unless it is somehow lost in jargon). Somewhere there should also be stats about hardware, install time, and library size. Are the calculations running on a personal computer or a supercomputer? These basic facts are more important to give the reader than the repeated information of Appendix B (not necessary or helpful in my opinion).

Table 3, Example 3. This is obviously a D-finite function. The corresponding ODE is easy to calculate and solve using Mathematica. Another solution, y = 1/x, falls out of the calculation. Why not include 1/x?

Overall: Competition stats look promising, but could use a more straightforward explanation. For a non-specialist, jargon is not helpful in explaining how much pre-computing is necessary relative to algorithm performance. Without motivation for the function space, the work risks getting labelled as "meaningless symbol crunching". However, you have to start somewhere, and if this technique works well on other function spaces, then subsequent follow-ups could probably become more useful to science and math.

--Brad

On Fri, Sep 27, 2019 at 12:15 PM Mike Stay <metaweta@gmail.com> wrote:
> This is the same kind of machine learning system that's been in the news
> lately for doing so much better at writing comprehensible English.
> https://openreview.net/pdf?id=S1eZYeHFDS
> --
> Mike Stay - metaweta@gmail.com
> http://math.ucr.edu/~mike
> https://reperiendi.wordpress.com
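Brad's Table 3 point above, that the example is D-finite and that y = 1/x drops out as a second solution of the corresponding ODE, can be sketched in SymPy rather than Mathematica. The equation used below is a hypothetical stand-in (the paper's actual Table 3 entry is not reproduced here): an Euler-type ODE, x^2 y'' + 2x y' = 0, chosen only because its solution space happens to contain 1/x.

```python
# Sketch of the D-finite check Brad describes, using a stand-in
# Euler-type equation x^2*y'' + 2*x*y' = 0 (NOT the paper's actual
# Table 3 ODE), whose general solution is C1 + C2/x.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# A linear ODE with polynomial coefficients -- the D-finite setting.
ode = sp.Eq(x**2 * y(x).diff(x, 2) + 2 * x * y(x).diff(x), 0)

# Solve symbolically; the general solution contains both constants,
# so 1/x appears alongside the constant solution.
general = sp.dsolve(ode, y(x))
print(general)

# Verify directly that y = 1/x satisfies the equation.
residual = ode.lhs.subs(y(x), 1/x).doit()
print(sp.simplify(residual))  # prints 0
```

The same dsolve-plus-substitution check applies to any linear ODE with polynomial coefficients, which is exactly the holonomic / D-finite class mentioned in the review: once the ODE is in hand, every solution of it (1/x included) is easy to enumerate and test.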
participants (2)
- Brad Klee
- Mike Stay