The more general case [of computation speedups]
One can think of a single step in the evolution of any system as taking a rule r and state s, and producing a new state h[r, s]. Usually the representations that are used for r and s will be quite different, and the function h will have no special properties. But for both multiplication rules and additive cellular automata it turns out that rules and states can be represented in the same way, and the evolution functions h have the property of being associative, so that h[a, h[b, c]] == h[h[a, b], c]. This means that in effect one can always choose to evolve the rule rather than a state. A consequence is that for example 4 steps of evolution can be computed not only as h[r, h[r, h[r, h[r, s]]]] but also as h[h[h[r, r], h[r, r]], s] or u = h[r, r]; h[h[u, u], s]—which requires only 3 applications of h. And in general if h is associative the result Nest[h[r, #]&, s, t] of t steps of evolution can be rewritten for example using the repeated squaring method as
h[Fold[If[#2 == 0, h[#1, #1], h[r, h[#1, #1]]] &, r, Rest[IntegerDigits[t, 2]]], s]
which requires only about Log[t] rather than t applications of h.
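As a quick check of this formula (a sketch only; the particular choices h = Times, r = 3, s = 2, t = 25 are arbitrary illustrations, not from the text), one can verify that for an associative function the repeated squaring expression agrees with direct iteration:

With[{h = Times, r = 3, s = 2, t = 25}, Nest[h[r, #] &, s, t] == h[Fold[If[#2 == 0, h[#1, #1], h[r, h[#1, #1]]] &, r, Rest[IntegerDigits[t, 2]]], s]]

This returns True, and the Fold involves only one or two applications of h per binary digit of t.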
As a very simple example, consider a system which starts with the integer 1, then at each step just adds 1. One can compute the result of 9 steps of evolution as 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1, but a better scheme is to use partial results and compute successively 1 + 1; 2 + 2; 1 + 4; 5 + 5—which is what the repeated squaring method above does when h = Plus, r = s = 1. This same basic scheme can be used with any associative function h—Max, GCD, And, Dot, Join or whatever—so long as suitable forms for r and s are used.
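As a small sketch of this case (using the same values as in the example above), inserting h = Plus, r = s = 1 and t = 9 into the repeated squaring formula

With[{h = Plus, r = 1, s = 1, t = 9}, h[Fold[If[#2 == 0, h[#1, #1], h[r, h[#1, #1]]] &, r, Rest[IntegerDigits[t, 2]]], s]]

indeed gives 10, the result of 9 steps of adding 1, while using only a handful of additions.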
For the multiplication rules discussed in the main text both states and rules can immediately be represented by integers, with h = Times, and r = m giving the multiplier. For additive cellular automata, states and rules can be represented as polynomials (see page 951), with h[a_, b_] := PolynomialMod[a b, k] and for example r = (1 + x) for elementary rule 60. The correspondence between multiplication rules and additive cellular automata can be seen even more directly if one represents all states by integers and computes h in terms of base k digits. In both cases it then turns out that h can be obtained from (see note above)
h[a_, b_] := FromDigits[g[ListConvolve[IntegerDigits[a, k], IntegerDigits[b, k], {1, -1}, 0]], k]
where for multiplication rules g = Identity and for additive cellular automata g = Mod[#, k] &. For multiplication rules, there are normally carries (handled by FromDigits), but for power cellular automata, these have only limited range, so that g = Mod[#, k^α] & can be used.
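To see these definitions in action (a sketch; the specific numbers below are arbitrary illustrations), note first that with g = Identity the digit convolution just reproduces ordinary multiplication, since FromDigits resolves the carries:

With[{a = 123, b = 456, k = 10}, FromDigits[ListConvolve[IntegerDigits[a, k], IntegerDigits[b, k], {1, -1}, 0], k] == a b]

And combining the polynomial representation of elementary rule 60 with the repeated squaring formula above, for instance

Module[{h, k = 2, r = 1 + x, s = 1, t = 7}, h[a_, b_] := PolynomialMod[a b, k]; CoefficientList[h[Fold[If[#2 == 0, h[#1, #1], h[r, h[#1, #1]]] &, r, Rest[IntegerDigits[t, 2]]], s], x]]

yields {1, 1, 1, 1, 1, 1, 1, 1}, that is Mod[Binomial[7, i], 2], the row reached by rule 60 after 7 steps from a single black cell, using only a handful of polynomial multiplications.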
For any associative function h the repeated squaring method allows the result of t steps of evolution to be computed with only about Log[t] applications of h. But to be able to do this some of the arguments given to h inevitably need to be larger. So whether a speedup is in the end achieved will depend on how fast h can be evaluated for arguments of various sizes. Typically the issue is whether h[a, b] for large a and b can be found with much less effort than it would take to evaluate h[r, b] about a times. If h = Times, then as discussed in the note above, the most obvious procedure for evaluating h[a, b] would involve about m n operations, where m and n are the numbers of digits in a and b. But when m ≃ n FFT-related methods allow this to be reduced to about n Log[n] operations. And in fact whenever h is commutative (Orderless) it turns out that such methods can be used, and substantial speedups obtained. But whether anything like this will work in other cases is not clear.
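One way to get a feel for this tradeoff (a rough sketch; actual timings depend on the machine and on the built-in large-integer arithmetic, which is assumed here to use fast FFT-related methods for big operands) is to compare direct iteration with repeated squaring for h = Times:

With[{r = 3, s = 1, t = 100000}, {First[AbsoluteTiming[Nest[r # &, s, t];]], First[AbsoluteTiming[Times[Fold[If[#2 == 0, #1 #1, r (#1 #1)] &, r, Rest[IntegerDigits[t, 2]]], s];]]}]

Both compute 3^100000; the second uses only a few dozen multiplications of large integers rather than 100000 multiplications by 3, and would normally be expected to run substantially faster.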
(See also page 886.)