See my next post about how the hidden-bit part works. You could do similar things w/o the hidden bit, but you might have to separate the even & odd exponent cases, or else do a tad more ANDing & ORing & shifting to move a couple of bits around. To some extent the whole thing is becoming academic, since modern computers now include small HW ROMs that provide at least 8 bits of sqrt and 8 bits of 1/sqrt. So you can get the 8-bit initial approximations for "free".

At 09:17 AM 4/19/2011, Marc LeBrun wrote:
="Henry Baker" <hbaker1@pipeline.com> When making an initial guess of sqrt(x) for a Newton iteration, where x is a positive floating point number, a good initial guess is the bits of the entire number (both exponent & mantissa) shifted right by 1.
="Henry Baker" <hbaker1@pipeline.com> Therefore, the IBM & early DEC series machines need not apply.
The direction this thread is taking here mystifies me. What is the benefit in imposing the latter restriction?
It would seem the more interesting scope is the most inclusive: The SQRT Hack may appear in ANY architecture that implements flonums with structured bit strings, so why not consider all candidates?
The defining character of The SQRT Hack, the thing that makes it worthy of canonization in programming lore, is nicely captured by the first sentence quoted above. Actually, I'd go further and simplify it by removing the references to Newton and iteration:
"For a positive flonum X, a good guess for a flonum approximating sqrt(X) is to shift ALL the bits of X (BOTH exponent and mantissa!) one place right."
However put, it rises to the level of a Canonical Hack by transgressing a banal view of X as implicitly merely a "strongly typed" pair X=(M,E), and instead operating on X's underlying implementation as a concatenated bit string, X=<E:M>.
Though it seems to horrify pedants, exposure to this level of hackery ought to be included in the early indoctrination of computer science novitiates, because the good design of data representations in computing plays a role analogous to that of artfully chosen notation in mathematics. It is part of "The Art" at the heart of "The Art of Computer Programming".
Deprecating the concrete perspective weakens computer science, biasing the subject towards the study of idealizations of how we might wish computing devices to be, at the expense of engaging with them as they actually are.
There is material for a valuable and fascinating survey article here (wow, Konrad Zuse?!) that I hope someone gets motivated to research and write up!