I observe that the adoption of numerical symbols distinct from the writing symbols is a major cultural and mathematical advance. The main benefit is the ability to freely intermix words and numbers with little chance of misinterpretation. Clearly, a separate numerical system is superior to a system in which numbers must be represented by words, or in which letters double as numbers.

I have always thought that the use of letters for digits in higher-base numbers (i.e., A = 10, B = 11, C = 12, etc.) is rather a step backwards. The letter values are unintuitive, words and numbers become confusable, a fairly universal digit set is extended with culturally proprietary symbols, numbers now have case issues, and for all that we run out of symbols again at base 37.

An extensible set of digits would be useful. For example, the digit 10 might be represented by a glyph incorporating the number 10 (the digits of 10 circled, boxed, barred, or otherwise joined). Such a glyph system would be extensible to any reasonable base, and digit values would be unmistakable wherever Arabic numerals are recognized.

I was wondering: prior to the computer age, were higher bases used widely? Was there an accepted representation for higher-base numerals? Did A = 10, B = 11, etc., arise before the computer age, or did it arise out of hexadecimal computer applications? I always assumed it was just a pragmatic adoption of the most suitable ASCII characters as additional digits.
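As a concrete illustration of the extensible-digit idea, here is a small Python sketch. The parenthesized rendering of digits above 9 is just one plain-text stand-in for the circled or boxed glyphs suggested above, not an established notation:

```python
def to_base(n, base):
    """Render a nonnegative integer n in the given base, writing each
    digit >= 10 as its decimal value in parentheses -- a plain-text
    stand-in for a 'joined digits' glyph."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, d = divmod(n, base)
        digits.append(str(d) if d < 10 else f"({d})")
    return "".join(reversed(digits))

print(to_base(255, 16))   # (15)(15)
print(to_base(1000, 20))  # 2(10)0
```

Since each digit is built from ordinary Arabic numerals, the scheme works for any base without inventing new symbols, at the cost of needing an explicit digit separator.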
A few non-answers but perhaps relevant:

1. When I entered computers, octal and binary were the only other bases, as I recall. No one in the early days seemed to need anything else.
2. Characters had 6-bit representations. When IBM invented bytes, it seemed wasteful. Who needed more than 64 character types?
3. There were systems that used base 10 for their internal representations.
4. An early suggestion for hex representation, which I think never got adopted, was something like T = ten, E = eleven, etc. (Don't ask me what they did for 12, etc.)
5. That would have been more mnemonic, but messy in that coding the conversions would take a few extra steps.
6. Mathematica deals with arbitrary bases, and uses the alphabet in its normal order for bases > 10.
7. All suggestions that use symbols not on the standard keyboard are greatly handicapped.
8. The keyboard should have included a fourth type of parentheses, brackets, or braces, in preference to more symbols for number bases; computer algebra systems would benefit from this.
9. Useful higher bases are likely to be powers of 2 (or at least squares), so they can be represented by several shorter bit groups, as bytes are now shown by two hex characters.

Steve Gray

----- Original Message -----
From: "David Wilson" <davidwwilson@comcast.net>
To: "math-fun" <math-fun@mailman.xmission.com>
Sent: Tuesday, February 22, 2005 10:53 AM
Subject: [math-fun] Beyond base 10
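Point 9 above can be made concrete: when the base is a power of 2, the digits fall directly out of the bit pattern, with no division at all. A minimal Python sketch:

```python
def digits_by_bit_groups(n, k):
    """Read off the base-2**k digits of a nonnegative integer n by
    slicing its bits into groups of k -- only shifts and masks,
    the same way a byte splits into two hex digits (k = 4)."""
    mask = (1 << k) - 1
    digits = []
    while True:
        digits.append(n & mask)
        n >>= k
        if n == 0:
            break
    return list(reversed(digits))

print(digits_by_bit_groups(0xAB, 4))   # [10, 11]  -- the two hex digits of one byte
print(digits_by_bit_groups(0o755, 3))  # [7, 5, 5] -- the octal digits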
The Mayan number system, base 20, uses dots and dashes for digits. See http://www.michielb.nl/maya/math.html, for example. The scheme is clearly somewhat extensible to larger bases.

When I was programming the LGP-30, the Flexowriter I/O device provided hexadecimal, with the hexits being 0123456789fgjkqw. The letters used were pretty much what was left over after the instruction set. (The sixteen instructions were z b y r i d n m p e u t h c a s.) l and o weren't used for hexits as they could be confused with 1 and 0. (My memory is a bit hazy--it might even be that there was no numeral 1 on the keyboard.) I don't know why "v" was left out rather than "w".

--ms

Steve Gray wrote:
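The LGP-30 hexit alphabet described above lends itself to a short sketch. The alphabet string is taken from the recollection above; the conversion routine itself is a standard repeated-division loop, not anything specific to the LGP-30:

```python
# Hexit alphabet as recalled above: digits 10-15 are f g j k q w.
LGP30_HEXITS = "0123456789fgjkqw"

def to_lgp30_hex(n):
    """Write a nonnegative integer n in hexadecimal using the LGP-30
    Flexowriter hexit alphabet instead of the modern 0-9 a-f."""
    if n == 0:
        return "0"
    out = []
    while n:
        n, d = divmod(n, 16)
        out.append(LGP30_HEXITS[d])
    return "".join(reversed(out))

print(to_lgp30_hex(255))  # ww  (modern ff)
print(to_lgp30_hex(250))  # wf  (modern fa)
```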
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com http://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
participants (3)
- David Wilson
- Mike Speciner
- Steve Gray