Precision (statistics)

Common statistical usage defines precision as the reciprocal of the variance, and the precision matrix as the matrix inverse of the covariance matrix.[1] Some particular statistical models define the term precision differently.

One particular use of the precision matrix is in the context of Bayesian analysis of the multivariate normal distribution: for example, Bernardo & Smith[2] prefer to parameterise the multivariate normal distribution in terms of the precision matrix, rather than the covariance matrix, because of certain simplifications that then arise.
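
For concreteness, the density of a k-dimensional normal distribution with mean \mu can be written directly in terms of the precision matrix \Lambda = \Sigma^{-1} (the symbols k, \mu and \Lambda here are our notation, not necessarily Bernardo & Smith's):

    f(x \mid \mu, \Lambda) = (2\pi)^{-k/2} \, |\Lambda|^{1/2} \exp\!\left( -\tfrac{1}{2} (x - \mu)^{\mathsf{T}} \Lambda (x - \mu) \right).

Written this way, no matrix inversion appears inside the density, and in conjugate Bayesian updating of a normal mean with known precision the precisions simply add; this is the kind of simplification alluded to above.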

In general, statisticians tend to work with the dual notion of variability rather than with precision: variability measures the amount of imprecision, the variance being the reciprocal of the precision.

History

The term precision in this sense (“mensura praecisionis observationum”) first appeared in the works of Gauss (1809), “Theoria motus corporum coelestium in sectionibus conicis solem ambientium” (page 212). Gauss’s h differs from the reciprocal of the standard deviation by a factor of \sqrt{2}: it satisfies h = 1/(\sigma\sqrt{2}), so that h^2 is half the modern precision 1/\sigma^2. He writes, for the density function of a normally distributed error \Delta with precision h,

    \varphi(\Delta) = \frac{h}{\sqrt{\pi}} \, e^{-h^2 \Delta^2}.
Later Whittaker & Robinson (1924) “Calculus of observations” called this quantity the modulus, but this term has dropped out of use.[3]
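
The factor of \sqrt{2} can be checked by matching Gauss’s density against the modern form of the normal density; the algebra below is ours, not the source’s:

    \frac{h}{\sqrt{\pi}} e^{-h^2 \Delta^2} = \frac{1}{\sigma\sqrt{2\pi}} e^{-\Delta^2/(2\sigma^2)}
    \quad\Longleftrightarrow\quad
    h^2 = \frac{1}{2\sigma^2}
    \quad\Longleftrightarrow\quad
    h = \frac{1}{\sigma\sqrt{2}}.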

Definition

In common statistical usage:

  - the precision of a random variable is the reciprocal of its variance, 1/\sigma^2;
  - the precision matrix of a multivariate random variable is the matrix inverse of its covariance matrix \Sigma, that is, \Sigma^{-1}.
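
As a concrete illustration, here is a minimal sketch in Python with NumPy; the synthetic data and variable names are ours and purely illustrative:

    import numpy as np

    # Synthetic sample: 500 observations (rows) of 3 variables (columns).
    rng = np.random.default_rng(0)
    x = rng.normal(size=(500, 3))

    # Precision of a single variable: reciprocal of its (sample) variance.
    variance = x[:, 0].var(ddof=1)
    precision = 1.0 / variance

    # Precision matrix: inverse of the (sample) covariance matrix.
    cov = np.cov(x, rowvar=False)        # rows are observations
    precision_matrix = np.linalg.inv(cov)

    print(precision)
    print(precision_matrix)

Note that the sample precision matrix exists only when the sample covariance matrix is invertible, which in particular requires more observations than variables and no exact linear dependence among them.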

References

  1. Dodge, Y. (2003) The Oxford Dictionary of Statistical Terms, OUP. ISBN 0-19-920613-9
  2. Bernardo, J. M. & Smith, A.F.M. (2000) Bayesian Theory, Wiley ISBN 0-471-49464-X
  3. "Earliest known uses of some of the words in mathematics".