Growth rates [of functions]
One can characterize most functions by their ultimate rates of growth. In basic mathematics these might be n, 2n, 3n, ... or n^2, n^3, ... or 2^n, 3^n, ... or 2^n, 2^2^n, 2^2^2^n, ... To go further one begins by defining an analog to the Ackermann function of page 906:
f[1][n_] = 2n; f[s_][n_] := Nest[f[s - 1], 1, n]
f[2][n] is then 2^n, f[3] is iterated power, and so on.
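For example, a few small values can be checked directly from these definitions (an illustrative evaluation; the commented output is what the recursion yields):
{f[1][5], f[2][5], f[3][3], f[3][4]}
(* {10, 32, 16, 65536} *)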
Given this one can now form the "diagonal" function
f[ω][n_] := f[n][n]
and this has a higher growth rate than any of the f[s][n] with finite s. This higher growth rate is indicated by the transfinite index ω. And in direct analogy to the transfinite numbers discussed above one can then in principle form a hierarchy of functions using operations like
f[ω + s_][n_] := Nest[f[ω + s - 1], 1, n]
together with diagonalization at limit ordinals (a concrete sketch of this is given below). In practice, however, it gets more and more difficult to determine that the functions defined in this way actually in a sense halt and yield definite values. Indeed for f[ε₀] this can no longer be proved using the ordinary axioms of arithmetic (see below). Yet it is still possible to define functions with even more rapid rates of growth. An example is the so-called busy beaver function (see page 1144) that gives the maximum number of steps that it takes for any Turing machine of size n to halt when started from a blank tape. In general this function must grow faster than any computable function, and is not itself computable.
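To make the diagonalization step at limit ordinals concrete, the scheme above can be carried one limit ordinal further. The following is a minimal sketch, assuming ω is left as an inert symbol, so that Mathematica's ordering of more specific definitions ahead of the general f[s_][n_] rule applies, and taking ω·2 (written 2 ω) as the next limit ordinal:
f[2 ω][n_] := f[ω + n][n] (* diagonalization at the limit ordinal ω·2 *)
f[2 ω + s_][n_] := Nest[f[2 ω + s - 1], 1, n] (* successor steps beyond ω·2 *)
Small cases remain checkable: f[ω][2] reduces to f[2][2] = 4 and f[ω][3] to f[3][3] = 16, but beyond such tiny arguments the values become far too large to evaluate explicitly.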
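For very small machines the busy beaver function can nevertheless be evaluated by explicit search. The following is a minimal sketch under assumed conventions that need not match those of page 1144: machines have states 1 through states plus an explicit halt state 0, colors 0 through colors - 1, start in state 1 on a blank tape of 0s, and each is run only up to a fixed step cutoff, so that in general the search gives just a lower bound:
BBSearch[states_, colors_, cutoff_] := Module[{run, rules},
  (* simulate one rule table; give the number of steps to reach halt state 0, or Infinity at the cutoff *)
  run[rule_] := Module[{tape = <||>, pos = 0, st = 1, t = 0, sym, ns, w, d},
    While[st != 0 && t < cutoff,
      sym = Lookup[tape, pos, 0]; (* unvisited cells read as 0 *)
      {ns, w, d} = rule[[(st - 1)*colors + sym + 1]];
      tape[pos] = w; pos += d; st = ns; t++];
    If[st == 0, t, Infinity]];
  (* all rule tables: a {new state, write, move} triple for each (state, color) pair *)
  rules = Tuples[Tuples[{Range[0, states], Range[0, colors - 1], {-1, 1}}], states*colors];
  Max[Select[run /@ rules, IntegerQ]]]
With these conventions BBSearch[2, 2, 100] enumerates all 20736 rule tables and recovers 6, the known maximum number of steps for a halting 2-state, 2-color machine; even slightly larger sizes make such exhaustive search rapidly impractical.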