Search NKS | Online
51 - 60 of 95 for Log
But it is straightforward to define versions of entropy that take account of probabilities—and indeed the closest analog to the usual entropy in physics or information theory is obtained by taking the probabilities p[i] for the k^n blocks of length n (assuming k colors), then constructing
-Limit[Sum[p[i] Log[k, p[i]], {i, k^n}]/n, n → ∞]
I have tended to call this quantity measure entropy, though in other contexts, it is often just called entropy or information, and is sometimes called information dimension. … An example of a generalization is the quantity given for blocks of size n by
h[q_, n_] := Log[k, Sum[p[i]^q, {i, k^n}]]/(n (1 - q))
where q = 0 yields set entropy, the limit q → 1 measure entropy, and q = 2 so-called correlation entropy.
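These quantities can be estimated from empirical block frequencies. Below is a rough Python analogue (function names mine, not from the text) that computes the measure entropy and the generalized q-entropy at a fixed block length n, rather than taking the n → ∞ limit; it uses the 1/(1 − q) normalization under which q → 1 recovers the measure entropy.

```python
from collections import Counter
from math import log

def block_entropy(seq, n, k=2):
    """Empirical measure entropy (base k, per cell) of length-n blocks of seq,
    a finite-n stand-in for -Limit[Sum[p Log[k, p]]/n, n -> Infinity]."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    total = len(blocks)
    probs = [c / total for c in Counter(blocks).values()]
    return -sum(p * log(p, k) for p in probs) / n

def renyi_entropy(probs, q, n, k=2):
    """Generalized block entropy h[q, n] from given block probabilities,
    normalized by 1/(n (1 - q)) so q -> 1 reduces to the measure entropy."""
    return log(sum(p**q for p in probs), k) / (n * (1 - q))

# A constant sequence has measure entropy 0; a sequence with equally
# frequent 0 and 1 blocks of length 1 has measure entropy 1.
```

For example, `block_entropy([0, 1] * 50, 1)` gives 1.0, while any constant sequence gives 0.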
One can define a topological spacetime entropy h_tx as
Limit[Limit[Log[k, s[t, x]]/t, t → ∞], x → ∞]
and a measure spacetime entropy h_μtx by replacing s with p Log[p].
Intrinsically defined curves
With curvature given by a function f[s] of the arc length s, explicit coordinates {x[s], y[s]} of points are obtained from (compare page 1048)
NDSolve[{x'[s] == Cos[θ[s]], y'[s] == Sin[θ[s]], θ'[s] == f[s], x[0] == y[0] == θ[0] == 0}, {x, y, θ}, {s, 0, smax}]
For various choices of f[s] , formulas for {x[s], y[s]} can be found using DSolve :
f[s] = 1: {Sin[θ], Cos[θ]}
f[s] = s: {FresnelS[θ], FresnelC[θ]}
f[s] = 1/√s: √θ {Sin[√θ], Cos[√θ]}
f[s] = 1/s: θ {Cos[Log[θ]], Sin[Log[θ]]}
f[s] = 1/s^2: θ {Sin[1/θ], Cos[1/θ]}
f[s] = s^n: result involves Gamma[1/n, ±θ^n/n]
f[s] = Sin[s]: result involves Integrate[Sin[Sin[θ]], θ], expressible in terms of generalized Kampé de Fériet hypergeometric functions of two variables.
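The same system x' = cos θ, y' = sin θ, θ' = f(s) can be integrated numerically in place of NDSolve. A self-contained Python sketch (not from the text) using a plain fourth-order Runge–Kutta step; for f(s) = 1 the curvature is constant, so the curve is a unit circle and the result can be checked against the closed forms above:

```python
from math import cos, sin, pi

def curve(f, s_max, steps=10000):
    """Integrate x'=cos(theta), y'=sin(theta), theta'=f(s) by RK4,
    with x(0) = y(0) = theta(0) = 0, returning (x(s_max), y(s_max))."""
    def deriv(s, state):
        x, y, theta = state
        return (cos(theta), sin(theta), f(s))
    h = s_max / steps
    state, s = (0.0, 0.0, 0.0), 0.0
    for _ in range(steps):
        k1 = deriv(s, state)
        k2 = deriv(s + h/2, tuple(v + h/2 * d for v, d in zip(state, k1)))
        k3 = deriv(s + h/2, tuple(v + h/2 * d for v, d in zip(state, k2)))
        k4 = deriv(s + h, tuple(v + h * d for v, d in zip(state, k3)))
        state = tuple(v + h * (a + 2*b + 2*c + d) / 6
                      for v, (a, b, c, d) in zip(state, zip(k1, k2, k3, k4)))
        s += h
    return state[0], state[1]

# f(s) = 1 gives theta(s) = s, so x(s) = sin(s) and y(s) = 1 - cos(s):
# a unit circle traversed at unit speed.
```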
(A potential way around this is to use the theory of unbiased estimators for polynomials just above and below p Log[p] .)
It is related to (a) by Gray code reordering of the rows, and to (b) by reordering according to (see page 905)
BitReverseOrder[a_] := With[{n = Length[a]}, a〚Map[FromDigits[Reverse[#], 2] &, IntegerDigits[Range[0, n - 1], 2, Log[2, n]]] + 1〛]
It is also given by
Array[Apply[Times, (-1)^(IntegerDigits[#1, 2, s] Reverse[IntegerDigits[#2, 2, s]])] &, 2^{s,s}, 0]
where (b) is obtained simply by dropping the Reverse .
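The bit-reversal reordering above has a direct Python analogue (a rough sketch; the function name is mine): each index i of a power-of-two-length list is sent to the integer whose binary digits are those of i reversed, mirroring FromDigits[Reverse[IntegerDigits[...]]].

```python
def bit_reverse_order(a):
    """Permute a list of length 2^m so position i holds the element whose
    index is the m-bit binary reversal of i."""
    n = len(a)
    m = n.bit_length() - 1          # Log[2, n] for a power-of-two length
    def rev(i):
        # reverse the m-bit binary digit sequence of i
        return int(format(i, f'0{m}b')[::-1], 2)
    return [a[rev(i)] for i in range(n)]
```

For length 8 the permutation of indices is 0, 4, 2, 6, 1, 5, 3, 7.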
… However, the nested structure of m in natural order allows evaluation in only about n Log[n] steps using
Nest[Flatten[Transpose[Partition[#, 2] . {{1, 1}, {1, -1}}]] &, data, Log[2, Length[data]]]
This procedure is similar to the fast Fourier transform discussed below.
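The Nest step above can be mirrored in Python (a rough sketch, not from the text): each pass combines adjacent pairs with the {{1, 1}, {1, -1}} butterfly, and the Transpose/Flatten puts all sums before all differences, giving the fast Walsh transform in about n Log[n] operations.

```python
def fwht(data):
    """Fast Walsh transform of a power-of-two-length list, mirroring
    Nest[Flatten[Transpose[Partition[#, 2] . {{1, 1}, {1, -1}}]] &, ...]."""
    v = list(data)
    steps = len(v).bit_length() - 1     # Log[2, Length[data]] passes
    for _ in range(steps):
        pairs = [(v[i], v[i + 1]) for i in range(0, len(v), 2)]
        sums = [a + b for a, b in pairs]
        diffs = [a - b for a, b in pairs]
        v = sums + diffs                # Flatten[Transpose[...]]
    return v

# The transform of a constant list concentrates everything in the first
# component; the transform of a unit impulse spreads it uniformly.
```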
The result turns out to be given by 2 IntegerExponent[x + 1, 2] + 3, which has a maximum of 2n + 3, where n is the length of the digit sequence of x, or Floor[Log[2, x]].
n!^2: BesselI[0, 2] - 1; n 2^n: Log[2]; n^2: π^2/6; (3n - 1)(3n - 2): π√3/9; 3 - 16n + 16n^2: π/8; n n! …
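Several of these closed forms are easy to sanity-check numerically; a quick Python sketch (mine, not from the text) summing three of them to high accuracy:

```python
from math import log, pi, sqrt

# Sum[1/(n 2^n)] = Log[2]: terms shrink geometrically, so 60 terms suffice
s1 = sum(1.0 / (n * 2**n) for n in range(1, 60))

# Sum[1/n^2] = Pi^2/6: truncation at 10^6 leaves a tail of about 10^-6
s2 = sum(1.0 / n**2 for n in range(1, 10**6))

# Sum[1/((3n - 1)(3n - 2))] = Pi Sqrt[3]/9: tail of order 10^-7 at 10^6
s3 = sum(1.0 / ((3*n - 1) * (3*n - 2)) for n in range(1, 10**6))
```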
(The peaks are known to grow roughly like n^(2/27 Log[2, n]^2)—intermediate between polynomial and exponential.)
(The solution of the so-called dimer problem in 1961 also showed that for complete coverings of a square grid by 2-cell dominoes h = Catalan/(π Log[2]) ≃ 0.421.)
The sets of numbers that can be obtained by applying elementary functions like Exp, Log and Sin seem in various ways to be disjoint from algebraic numbers. … For rational functions f[x], Integrate[f[x], {x, 0, 1}] must always be a linear function of Log and ArcTan applied to algebraic numbers (f[x] = 1/(1 + x^2) for example yields π/4).
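The π/4 example can be verified numerically without symbolic integration; a short Python sketch (mine, not from the text) using composite Simpson's rule:

```python
from math import pi

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    total += 4 * sum(f(a + (2*i - 1) * h) for i in range(1, n // 2 + 1))
    total += 2 * sum(f(a + 2*i * h) for i in range(1, n // 2))
    return total * h / 3

# Integrate[1/(1 + x^2), {x, 0, 1}] = ArcTan[1] = Pi/4
approx = simpson(lambda x: 1 / (1 + x * x), 0.0, 1.0)
```

With 1000 subintervals the Simpson error for this smooth integrand is far below 10^-10.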