Notes

Chapter 10: Processes of Perception and Analysis

Section 9: Statistical Analysis


Estimation of parameters [in probabilistic models]

One way to estimate parameters in simple probabilistic models is to compute the mean and other moments of the data and then to work out what values of the parameters will reproduce these. More general is the maximum likelihood method, in which one finds the values of the parameters that maximize the probability of generating the observed data from the model. (Least squares fits do this for models in which the data exhibits independent Gaussian variations.) Various modifications can be made, involving for example weighting with a risk function before maximizing. If one starts with a priori probability distributions for all parameters, then Bayes's theorem on conditional probabilities allows one to avoid the arbitrariness of methods such as maximum likelihood, and to work out explicitly from the observed data what the probability is for each possible choice of parameters in the model. It is rare in practice, however, to be able to get convincing a priori probability distributions, although when there are physical or other reasons to expect entropy to be maximized, the so-called maximum entropy method may be useful.
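As a concrete illustration of the first two methods, the sketch below (in Python; the gamma model, sample size, seed, and all names are invented for the example, not taken from the text above) compares a moment-based estimate with a maximum likelihood one for a two-parameter gamma distribution, a case in which the two methods genuinely differ:

import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.5, scale=1.2, size=1000)   # synthetic "observed" data

# Method of moments: choose parameters that reproduce the sample moments
# (for a gamma distribution, mean = k*theta and variance = k*theta^2)
m, v = data.mean(), data.var()
shape_mom, scale_mom = m**2 / v, v / m

# Maximum likelihood: maximize the probability of generating the observed data
def neg_log_lik(params):
    k, theta = params
    if k <= 0 or theta <= 0:
        return np.inf
    return -gamma.logpdf(data, a=k, scale=theta).sum()

shape_mle, scale_mle = minimize(neg_log_lik, [shape_mom, scale_mom],
                                method="Nelder-Mead").x

The moment estimates also serve here as a convenient starting point for the numerical maximization of the likelihood.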
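In the same spirit, a minimal sketch of the Bayesian computation mentioned above: given an assumed a priori distribution (here flat, purely for illustration) and observed data, Bayes's theorem yields a probability for each possible value of the parameter. The counts and grid are invented for the example:

import numpy as np

n, k = 10, 7                              # invented data: 7 successes in 10 trials
p_grid = np.linspace(0.001, 0.999, 999)   # candidate values of the success probability

prior = np.ones_like(p_grid)              # flat a priori distribution (an arbitrary choice)
likelihood = p_grid**k * (1 - p_grid)**(n - k)
posterior = prior * likelihood
posterior /= posterior.sum() * (p_grid[1] - p_grid[0])        # normalize as a density

p_map = p_grid[posterior.argmax()]                            # most probable parameter value
p_mean = (p_grid * posterior).sum() * (p_grid[1] - p_grid[0]) # posterior mean

With the flat prior the most probable value coincides with the maximum likelihood estimate, which makes clear that the arbitrariness the text refers to has simply been moved into the choice of a priori distribution.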
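Finally, a sketch of the maximum entropy method in one common setup (a constrained mean over a finite set of outcomes; the die faces and target mean are invented): among all distributions satisfying the constraint, the entropy-maximizing one has exponential form, leaving a single parameter to solve for numerically:

import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)    # outcomes of a six-sided die
target_mean = 4.5          # invented constraint on the expected value

# Among all distributions with this mean, the one maximizing entropy
# has the form p_i ~ exp(lam * i); pick lam so the mean constraint holds.
def mean_given(lam):
    w = np.exp(lam * faces)
    return (w * faces).sum() / w.sum()

lam = brentq(lambda l: mean_given(l) - target_mean, -10.0, 10.0)
weights = np.exp(lam * faces)
p = weights / weights.sum()   # the maximum entropy distribution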




From Stephen Wolfram: A New Kind of Science