Search NKS | Online
181 - 190 of 234 for Take
In the 1960s such ideas were increasingly formalized, particularly for execution times on Turing machines, and in 1965 the suggestion was made that one should consider computations feasible if they take times that grow like polynomials in their input size.
An example is the so-called busy beaver function (see page 1144) that gives the maximum number of steps that it takes for any Turing machine of size n to halt when started from a blank tape.
And in most cases such rules will not suffice even if one takes averages.
And what this would do is to take just a tiny region and make it large enough to correspond to everything we can now see in the universe. … But the crucial point is that this will not take long to happen throughout any network if it is appropriately connected.
Then—somewhat in analogy to retrieving closest memories—one can take a sequence of length n that one receives and find the codeword that differs from it in the fewest elements.
The fact that there are a million nerve fibers going from the eye to the brain, but only about 30,000 going from the ear to the brain means that while it takes several million bits per second to transmit video of acceptable quality, a few tens of thousands of bits per second are adequate for audio (NTSC television is 5 MHz; audio CDs 22 kHz; telephone 8 kHz).
For example, to a distant observer, an object falling into a black hole will seem to take an infinite time to cross the event horizon—even though to the object itself only a finite time will seem to have passed.
If a cellular automaton rule takes the new color of a cell with neighborhood configuration IntegerDigits[i, k, Length[os]] to be u〚i+1〛, then one can define its rule number to be FromDigits[Reverse[u], k].
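A Python rendering of this rule-number convention might look like the following (Python substituted for the text's Mathematica; the rule 30 outcome list is an assumed example):

```python
# FromDigits[digits, k]: interpret a digit list in base k, most significant first.
def from_digits(digits, k):
    n = 0
    for d in digits:
        n = n * k + d
    return n

# u[i] gives the new color for the neighborhood whose base-k digits encode i;
# the rule number is then FromDigits[Reverse[u], k].
def rule_number(u, k):
    return from_digits(list(reversed(u)), k)

# Assumed example: the elementary (k = 2, 3-neighbor) rule 30.
# New colors for neighborhoods i = 0 (000) through i = 7 (111):
u = [0, 1, 1, 1, 1, 0, 0, 0]
print(rule_number(u, 2))  # -> 30
```

Reversing u puts the outcome for the largest neighborhood value first, matching the usual elementary-rule numbering.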
In general, if the number varies like (1/a)^d, one can take d to be the dimension of the pattern.
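Solving N ≈ (1/a)^d for the exponent gives d = log N / log(1/a); a minimal sketch, with the filled-square and Sierpinski-style box counts as assumed examples:

```python
import math

# If covering a pattern with boxes of side a takes about (1/a)**d boxes,
# the exponent d = log(N) / log(1/a) serves as the pattern's dimension.
def dimension(n_boxes, a):
    return math.log(n_boxes) / math.log(1 / a)

# Assumed examples:
# a filled square needs 16 boxes of side 1/4 -> d = 2
print(dimension(16, 1/4))  # -> 2.0
# a Sierpinski-style pattern needs 3 boxes of side 1/2 -> d = log 3 / log 2
print(dimension(3, 1/2))
```

In practice one fits d over several scales a rather than reading it off a single count.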
(There was confusion in the late 1980s when theoretical studies of self-organized criticality failed to correctly take squares in computing power spectra.)
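The squares in question are squared magnitudes of the Fourier amplitudes; a minimal direct-DFT sketch (the four-sample signal is an assumed example):

```python
import cmath

# Power spectrum as the squared magnitude of the discrete Fourier amplitudes.
# Omitting the square changes the apparent exponent of any power law in the
# spectrum, which is the kind of error alluded to above.
def power_spectrum(x):
    n = len(x)
    amps = [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n) for t in range(n)) / n
            for f in range(n)]
    return [abs(a) ** 2 for a in amps]  # the squares, not the amplitudes

# Assumed example: an alternating four-sample signal.
print([round(p, 6) for p in power_spectrum([1, 0, 1, 0])])  # -> [0.25, 0.0, 0.25, 0.0]
```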