Search NKS | Online
1 - 9 of 9 for FindMinimum
Finding layouts [for networks]
One way to lay out a network g so that network distances in it come as close as possible to ordinary distances in d-dimensional space is just to search for values of the x[i, k] which minimize a quantity such as
With[{n = Length[g]}, Apply[Plus, Flatten[(Table[Distance[g, {i, j}], {i, n}, {j, n}]^2 - Table[Sum[(x[i, k] - x[j, k])^2, {k, d}], {i, n}, {j, n}])^2]]]
using for example FindMinimum starting say with x[1, _] -> 0 and all the other x[_, _] -> Random[]. Rarely is there a unique minimum that can be found, but the approach nevertheless seems to work fairly well whenever a good layout exists in a particular number of dimensions.
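As a concrete illustration, here is a minimal runnable version of this search (an editorial sketch, not code from the book): the pairwise graph distances are supplied directly as a matrix dist, standing in for Distance[g, {i, j}] above, with a 4-cycle as the example network.

dist = {{0, 1, 2, 1}, {1, 0, 1, 2}, {2, 1, 0, 1}, {1, 2, 1, 0}};
d = 2; n = Length[dist];
(* squared mismatch between graph and Euclidean distances *)
energy = Apply[Plus, Flatten[(dist^2 -
      Table[Sum[(x[i, k] - x[j, k])^2, {k, d}], {i, n}, {j, n}])^2]];
(* pin node 1 at the origin; start all other coordinates at random values *)
vars = Flatten[Table[{x[i, k], Random[]}, {i, 2, n}, {k, d}], 1];
{min, rules} = FindMinimum[energy /. x[1, _] -> 0, vars];
coords = Table[x[i, k], {i, n}, {k, d}] /. x[1, _] -> 0 /. rules

Different random starts typically give different (rotated or reflected) layouts, consistent with the non-uniqueness noted above.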
If one knew the fundamental rules for the universe, then one way in principle to define the amount of computation associated with a given process would be to find the minimum number of applications of the rules for the universe that are needed to reproduce the process at some level of description.
And as it turns out, this kind of behavior is not uncommon among iterative procedures; indeed it is even seen in such simple cases as trying to find the lowest point on a curve. … But in the other two cases, this procedure will usually end up getting stuck at a local minimum. This is the basic phenomenon which makes it difficult to find patterns that satisfy constraints exactly using a procedure that is based on progressive improvement.
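The curve case is easy to reproduce directly (an illustrative example, not from the book): a quartic with two valleys, where FindMinimum simply descends into whichever valley contains the starting point.

f[x_] := x^4 - 3 x^2 + x
FindMinimum[f[x], {x, 1}]    (* stuck at the shallow local minimum near x = 1.13 *)
FindMinimum[f[x], {x, -1}]   (* reaches the global minimum near x = -1.30 *)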
Since the late 1800s there have been efforts to find schemes that require the absolute minimum number of steps.
The procedure that is used to lay out the networks on the previous two pages [492, 493] is a direct analog of the procedure used for space networks on page 479: the row in which a particular node will be placed is determined by the minimum number of connections that have to be followed in order to reach that node starting from the node at the top.
… Like in so many other systems that we have studied in this book, the randomness that we find in causal networks will inevitably tend to wash out details of how the networks are constructed.
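In current Mathematica the row assignment described above can be sketched with built-in graph functions (an editorial sketch; the book uses its own construction): each node's row is its graph distance from the chosen top node.

g = Graph[{1 <-> 2, 1 <-> 3, 2 <-> 4, 3 <-> 4, 4 <-> 5}];
(* row of each node = minimum number of connections from node 1 *)
GraphDistance[g, 1]                                 (* {0, 1, 1, 2, 3} *)
(* nodes grouped row by row *)
GatherBy[VertexList[g], GraphDistance[g, 1, #] &]   (* {{1}, {2, 3}, {4}, {5}} *)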
Maximal block compression
If one has data that consists of a long sequence of blocks, each of length b, and each independently chosen with probability p[i] to be of type i, then as argued by Claude Shannon in the late 1940s, it turns out that the minimum number of base 2 bits needed on average to represent each block in such a sequence is h = -Sum[p[i] Log[2, p[i]], {i, 2^b}]. … With this assumption one then finds that maximal compression occurs if a block of probability p[i] is represented by a codeword of length -Log[2, p[i]].
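A quick check of both formulas for b = 2, with illustrative block probabilities {1/2, 1/4, 1/8, 1/8} (the example values are editorial, not from the book):

b = 2; p = {1/2, 1/4, 1/8, 1/8};
(* minimum average bits per block *)
h = -Sum[p[[i]] Log[2, p[[i]]], {i, 2^b}]   (* 7/4 *)
(* optimal codeword lengths for each block type *)
-Log[2, p]                                  (* {1, 2, 3, 3} *)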
And from this discussion there emerges the following interpretation of the Mandelbrot set that appears not to be well known but which I find most illuminating. … The pictures below show a generalization of this idea, in which gray level indicates the minimum distance Abs[z - z0] of any point z in the Julia set from a fixed point z0.
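One way to approximate such a gray level (an editorial sketch, assuming that inverse iteration z -> ±Sqrt[z - c] gives a fair sample of the Julia set; c and z0 here are arbitrary example values):

c = -0.8 + 0.156 I; z0 = 0.5 + 0.5 I;
(* sample the Julia set by inverse iteration with random branch choices *)
pts = Drop[NestList[RandomChoice[{1, -1}] Sqrt[# - c] &, 1. + 0. I, 2000], 100];
(* minimum distance of the sampled points from z0 *)
Min[Abs[pts - z0]]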
For a single molecule, the minimum energy configuration can presumably always be found by a limited amount of computational work—though potentially increasing rapidly with the number of atoms. … Yet ever since the 1960s there have been computer systems like LHASA that try to find synthesis pathways automatically.
What emerged as most popular is topological dimension, in which one fills space with overlapping balls and asks for the minimum number that ever have to overlap at any point. … For example, to find a definite volume growth rate one does still need to take some kind of limit—and one needs to avoid sampling too many or too few nodes in the network.