Here is an idea about teaching calculus that I have been pondering for a while, and I wonder what others might think of it (or whether I have reinvented the square wheel?). The concept of "lim F(x) as x approaches a" is (pedagogically) problematic, since we need artificial-looking examples to get this limit to be anything other than just F(a). I'm thinking that the first time students encounter limits, it should be the limit of a sequence {a_n} as n approaches infinity. Then we can define the derivative F'(a) as the limit of the sequence (F(a + d_n) - F(a))/d_n, with F being differentiable at a iff this limit exists and is the same for every sequence of nonzero numbers d_n that approaches zero.

Limits at infinity seem like the right place to start for a couple of reasons. First, it is arguably clearer that we really do need a new concept to talk about what happens "at infinity". Also, the two variables over which we are quantifying play more visibly distinct roles, since one is a large integer (the index) and the other is a small real number (the tolerance). And the proposed definition of the derivative is closer to the way mathematicians actually think about such things -- the limit of an ever-improving sequence of approximations.
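
To make the "ever-improving sequence of approximations" picture concrete, here is a minimal numerical sketch (just my own illustration, with the function F, the point a, and the particular sequence d_n = 1/2^n chosen for the example): it prints the difference quotients (F(a + d_n) - F(a))/d_n along one nonzero sequence tending to zero, and they visibly settle toward F'(a).

    import math

    def difference_quotients(F, a, num_terms=10):
        """Difference quotients (F(a + d_n) - F(a)) / d_n along d_n = 1/2^n.

        This is just one particular sequence tending to zero; the proposed
        definition requires the same limit for every nonzero sequence d_n -> 0.
        """
        quotients = []
        for n in range(1, num_terms + 1):
            d_n = 1.0 / 2 ** n  # nonzero, and d_n -> 0 as n -> infinity
            quotients.append((F(a + d_n) - F(a)) / d_n)
        return quotients

    if __name__ == "__main__":
        # Example: F(x) = sin(x) at a = 0, where F'(0) = cos(0) = 1.
        for n, q in enumerate(difference_quotients(math.sin, 0.0), start=1):
            print(f"n = {n:2d}   quotient = {q:.10f}")

Running this with F = sin and a = 0, the quotients approach 1, which matches cos(0); of course the definition above demands the same limit along every nonzero sequence tending to zero, not just this convenient one.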