Actually, Church's lambda calculus (more or less the basis of Lisp) tells you much more about the concept of a 'variable' than algebra does. The lambda calculus is essentially the theory of variable substitution. The main axioms:

Alpha. The name of a bound variable is irrelevant, so long as it doesn't clash with other variables. So you can "rename" a variable to your heart's content.

Beta. You can substitute the value of a variable in a way that doesn't interfere with the binding of existing variables. This axiom is tricky, because it may require judicious use of axiom alpha in order to avoid name "capture".

Eta (extensionality). A function that merely applies f to its argument is the same as f itself: λx.(f x) = f, provided x doesn't occur free in f.

The lambda calculus is a far more beautiful theory of computation than Turing machines. It's an historical accident that the theory of computation is centered around TMs rather than lambdas.

BTW, who was the fellow at MIT in the early 1970's who used Lisp to teach computation theory? He had a very elegant proof of undecidability using a Lisp interpreter.

At 07:30 AM 8/2/2012, Adam P. Goucher wrote:
Also, the concept of a 'variable' is introduced in algebra, so pretty much all of modern computer science (post-Turing, that is*) relies on algebra.
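P.S. The interplay of the alpha and beta rules can be sketched in a few lines of code. This is just an illustrative sketch, assuming terms are nested tuples of the form ('var', name), ('lam', name, body), ('app', fun, arg); the function names (free_vars, subst, beta) and the fresh-name scheme are my own invention, not any standard library.

```python
import itertools

_fresh = itertools.count()  # supply of fresh names for alpha-renaming

def free_vars(t):
    """Set of variable names occurring free in term t."""
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'lam':
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])

def subst(t, x, s):
    """Capture-avoiding substitution t[x := s] -- the heart of the beta rule."""
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'app':
        return ('app', subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:                   # binder shadows x; nothing to substitute
        return t
    if y in free_vars(s):        # substituting would capture y:
        z = 'v%d' % next(_fresh) # judicious use of axiom alpha --
        body = subst(body, y, ('var', z))  # rename the binder first
        y = z
    return ('lam', y, subst(body, x, s))

def beta(t):
    """One beta step at the root: (lambda x. body) arg -> body[x := arg]."""
    if t[0] == 'app' and t[1][0] == 'lam':
        _, x, body = t[1]
        return subst(body, x, t[2])
    return t

# (lambda x. lambda y. x) applied to y: naive substitution would capture y,
# so subst alpha-renames the inner binder before substituting.
term = ('app', ('lam', 'x', ('lam', 'y', ('var', 'x'))), ('var', 'y'))
print(beta(term))  # ('lam', 'v0', ('var', 'y'))
```

Without the alpha-renaming branch, the result would be ('lam', 'y', ('var', 'y')) -- the free y gets captured and the term changes meaning entirely. That one branch is the whole reason the beta axiom is "tricky".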