This is a large project, because the amount of infrastructure it needs is enormous. Just as the development of a large SW system like MS Windows involves a huge number of sub-parts, the infrastructure of a mathematics system would involve tens of thousands, if not hundreds of thousands, of lemmas and theorems. Bob Boyer tells me that even the current successes of proof checkers involve thousands of lemmas.

One of the tasks of such a project would be to name/index all of the lemmas in mathematics (in a manner similar to the naming/indexing of all Integer Sequences). As some have already speculated, this task involves the formalization of all known math -- a huge undertaking.

Once such a framework exists, the amount of additional machinery for a new theorem would be (more or less) proportional to the size of the English/French/whatever paper covering the same material. (An analogy is the implementation of a standard -- e.g., MPEG-4. The size of the SW to implement the standard is typically proportional to the size of the published standard in English. Of course, this depends upon the existence of large subroutine/function libraries to implement common algorithms not spelled out in the standard.)
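To make the naming/indexing idea concrete, here is a minimal sketch in Lean 4 of what such a framework looks like at the smallest scale. The names (FormalLibrary, add_zero_right, add_succ_right, zero_add_left) are invented for illustration, not taken from any existing library; the point is only that each lemma lives under a stable name and that later results cite earlier ones by name, with every step machine-checked.

  -- Hypothetical names, for illustration only.
  namespace FormalLibrary

  -- Each lemma is stored under a permanent name (the analogue of an
  -- index entry), and the checker accepts it only with a complete proof.
  theorem add_zero_right (n : Nat) : n + 0 = n := rfl

  theorem add_succ_right (n m : Nat) : n + Nat.succ m = Nat.succ (n + m) := rfl

  -- A later result cites the earlier lemmas by name, much as a paper
  -- cites numbered lemmas; the checker verifies each step.
  theorem zero_add_left (n : Nat) : 0 + n = n := by
    induction n with
    | zero => rfl
    | succ k ih => rw [add_succ_right, ih]

  end FormalLibrary

Multiply this by the tens of thousands of lemmas mentioned above and the scale of the naming/indexing task becomes apparent.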
At 11:12 AM 3/3/2006, franktaw@netscape.net wrote:

Do you have any idea how far published mathematical proofs are from this level? I would guess that a typical published proof would require millions or even billions of statements.

So any such project would have to operate at a higher level than that. And now you come to another problem: groundbreaking results, and a large fraction of those that aren't groundbreaking, involve the introduction of one or more new ideas. And those ideas would have to be introduced to the proof machinery before it could verify the result -- at which point the utility of the exercise is very much in question.