I observe that the adoption of numerical symbols distinct from the writing symbols is a major cultural and mathematical advance. The main benefit is the ability to freely intermix words and numbers with little chance of misinterpretation. Clearly, a separate numerical system is superior to one in which numbers must be represented by words, or in which letters double as numbers.

I have always thought that the use of letters for digits in higher-base numbers (i.e., A = 10, B = 11, C = 12, etc.) is rather a step backwards. The letter values are unintuitive, words and numbers become confusable, a fairly universal digit set is extended with culturally proprietary symbols, numbers acquire case issues, and for all that we run out of symbols again at base 37.

An extensible set of digits would be useful. For example, the digit 10 might be represented by a glyph incorporating the numeral 10 (the digits of 10 circled, boxed, barred, or otherwise joined). Such a glyph system would extend to any reasonable base, and digit values would be unmistakable wherever Arabic numerals are recognized. (A rough code sketch of the idea follows at the end of this post.)

I was wondering: prior to the computer age, were higher bases widely used? Was there an accepted representation for higher-base numerals? Did A = 10, B = 11, etc., arise before the computer age, or did it arise out of hexadecimal computer applications? I always assumed it was just a pragmatic adoption of the most suitable ASCII characters as additional digits.
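To make the contrast concrete, here is a minimal Python sketch (the function names are mine and purely illustrative, not anything standard): letter digits stop at base 36, while a bracketed rendering of each digit value, standing in for the circled or boxed glyphs proposed above, extends to any base.

    # Sketch: letter digits (0-9, A-Z) vs. an "extensible" bracketed-digit style
    # in which digit values of 10 or more are written as [value].
    import string

    LETTER_DIGITS = string.digits + string.ascii_uppercase  # 36 symbols total

    def to_base_letters(n: int, base: int) -> str:
        """Render n in the given base using 0-9 then A-Z; only works up to base 36."""
        if not 2 <= base <= len(LETTER_DIGITS):
            raise ValueError("letter digits only cover bases 2 through 36")
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            n, r = divmod(n, base)
            digits.append(LETTER_DIGITS[r])
        return "".join(reversed(digits))

    def to_base_bracketed(n: int, base: int) -> str:
        """Render n in any base >= 2, writing digit values of 10 or more as [value]."""
        if base < 2:
            raise ValueError("base must be at least 2")
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            n, r = divmod(n, base)
            digits.append(str(r) if r < 10 else f"[{r}]")
        return "".join(reversed(digits))

    print(to_base_letters(255, 16))     # FF
    print(to_base_bracketed(255, 16))   # [15][15]
    print(to_base_bracketed(1000, 60))  # [16][40]  -- sexagesimal, beyond base 36

The bracketed form is wordier than single glyphs would be, but it shows the point: digit values stay readable in any base, with no recourse to letters.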