On Dec 20, 2015, at 8:21 PM, James Propp <jamespropp@gmail.com> wrote:
And even more, I'd like to see a compendium of such adverse outcomes, so that any time I want to warn the students away from a particular kind of mistake, I can say something like "If you make this mistake on my exam, you might lose points. And if you make this mistake after you graduate, you might kill hundreds of people."
Those students will increasingly be using automated tools to avoid those fatal minus signs, so diligence of that sort may not be what your students need most. A more troubling issue, to my mind, is illustrated by a correspondence I had with a young research intern at the Federal Reserve (the US institution that sets interest rates). The question concerned a line of code in a software package developed in-house. The line computes the logarithm of a number, but first checks that the number is positive; when the check fails, the logarithm is replaced by a fixed constant. The intern wanted to know the rationale for choosing this constant. I think I did the right thing by pointing out to the intern that if that check fails with any regularity, the code surely suffers from more serious problems!
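The pattern in question can be sketched roughly as follows (a minimal illustration, assuming Python; the function name `safe_log`, and the fallback constant `LOG_FALLBACK` and its value, are my placeholders, not the Fed's actual in-house code):

```python
import math

# Hypothetical fallback value; the rationale for the real constant
# was precisely what the intern was asking about.
LOG_FALLBACK = -1e10

def safe_log(x):
    """Guarded logarithm: returns log(x) for positive x,
    and a fixed constant otherwise."""
    if x > 0:
        return math.log(x)
    return LOG_FALLBACK
```

Whatever the constant, silently substituting it hides the upstream error that produced a non-positive input in the first place, which is the more serious problem alluded to above.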