At 08:42 AM 8/3/2012, Adam P. Goucher wrote:
TECO was essentially a 2-stack implementation, where the "cursor" was simultaneously the top of the left stack and the top of the right stack.
Hmm, we can thus think of the two stacks as being:
C, L1, L2, L3, L4, L5, L6, ..., Ln
and
C, R1, R2, R3, R4, R5, R6, ..., Rm.
We can then proceed to identify the 'cursor' in each stack to obtain the following:
Ln, ..., L3, L2, L1, C, R1, R2, R3, ..., Rm
This seems to be equivalent to an ETM (extensible Turing machine), where the machine can insert and remove cells in the tape as well as doing the ordinary operations.
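To make the correspondence concrete, here is a minimal Python sketch (mine, not from the original posts) of the two-stack "zipper" tape: the cursor sits at the junction of the two stacks, and moving, inserting, or deleting a cell is just a push or a pop, which is exactly the repertoire of an extensible Turing machine head.

    class TwoStackTape:
        """Two-stack ("zipper") tape: the cursor sits between the stacks.

        left  holds Ln, ..., L2, L1 (top of stack = cell just left of the cursor)
        right holds Rm, ..., R2, R1 (top of stack = cell just right of the cursor)
        The top of each stack is the end of its Python list.
        """

        def __init__(self, left=None, right=None):
            self.left = list(left or [])
            self.right = list(right or [])

        def insert(self, x):
            """Insert a new cell just left of the cursor (the tape grows)."""
            self.left.append(x)

        def delete(self):
            """Remove the cell just left of the cursor (the tape shrinks)."""
            return self.left.pop()

        def move_left(self):
            """Shift the cursor one cell left: pop left stack, push right stack."""
            self.right.append(self.left.pop())

        def move_right(self):
            """Shift the cursor one cell right: pop right stack, push left stack."""
            self.left.append(self.right.pop())

        def contents(self):
            """Whole tape, left to right, with '^' marking the cursor."""
            return self.left + ['^'] + list(reversed(self.right))

A TECO buffer, a gap buffer, and the ETM tape are all instances of this structure; they differ only in which operations are allowed at the cursor.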
Yes, indeed, although my version would also allow cyclic permutations on the top portions of each stack. I conceive of a machine where the "cache" has the ability to shift items up/down in one step. This can be accomplished either for real, or "virtually", where the cache is simply addressed differently.

Think of a city (Manhattan?) with 2 one-way streets (Madison & Fifth?) & cross-streets. Any traffic on its way back to memory (going downtown) can easily be shunted via the cross streets into the uptown traffic. The ALU reads directly from this cache & stores directly back to this cache; there is no direct path to memory. (The ALU sits off to the side of (or above?) these streets & can read from & write to them.) The ALU is at the center of the "collision" between these two stacks.

(I seem to recall a paper from about 1984 which sketched out a stack machine similar to this idea, but I haven't been able to remember the author or where the paper appeared.)

Even sophisticated operations can be implemented on the "top" of this stack -- e.g., a 32- or 64-element FFT that reads & writes the top 32 or 64 elements of the "stack". (I.e., a 32- or 64-element FFT is a single combinator.) The cache management algorithm is a "stack" algorithm, which is pretty darn good.
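As a toy illustration of the "virtual" shift (the cache size, the names, and the zero-fill convention below are my own assumptions, not part of the proposal), here is a Python sketch in which the top few stack cells live in a circular cache and a cyclic permutation of the cached cells is done by moving the base index rather than the data:

    class RotatingStackCache:
        """Toy top-of-stack cache (sizes and names are illustrative only).

        The top CACHE_SIZE stack cells live in a circular buffer; rot()
        performs a cyclic permutation of the cached cells "virtually",
        by moving the base index instead of copying data.  Cells squeezed
        out of the cache spill to backing memory -- the downtown traffic
        being shunted back into the uptown stream.
        """

        CACHE_SIZE = 8

        def __init__(self):
            self.cache = [0] * self.CACHE_SIZE   # circular buffer, kept full
            self.base = 0                        # physical slot of the logical top
            self.memory = []                     # deeper stack cells, top at the end

        def _slot(self, i):
            """Physical slot holding logical cell i (0 = top of stack)."""
            return (self.base + i) % self.CACHE_SIZE

        def push(self, x):
            # The cell falling off the bottom of the cache spills to memory;
            # its physical slot is then reused for the new top.
            self.memory.append(self.cache[self._slot(self.CACHE_SIZE - 1)])
            self.base = (self.base - 1) % self.CACHE_SIZE
            self.cache[self.base] = x

        def pop(self):
            x = self.cache[self.base]
            self.base = (self.base + 1) % self.CACHE_SIZE
            # Refill the bottom of the cache from memory (0 if the stack is short).
            self.cache[self._slot(self.CACHE_SIZE - 1)] = (
                self.memory.pop() if self.memory else 0)
            return x

        def rot(self, k):
            """Cyclically permute the cached cells by k positions, "virtually":
            no data moves, only the base index changes."""
            self.base = (self.base + k) % self.CACHE_SIZE

The rot() step is the cross-street of the analogy: it re-routes which physical slot counts as the top without any traffic actually moving. A 32- or 64-point FFT combinator would, in the same spirit, read and rewrite the top 32 or 64 cached cells in place.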