A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713-1935. Springer Science & Business Media, 24 August 2008 - 225 pages. This book offers a detailed history of parametric statistical inference. Covering the period between James Bernoulli and R.A. Fisher, it examines: binomial statistical inference; statistical inference by inverse probability; the central limit theorem and linear minimum variance estimation by Laplace and Gauss; error theory, skew distributions, correlation, sampling distributions; and the Fisherian Revolution. Lively biographical sketches of many of the main characters are featured throughout, including Laplace, Gauss, Edgeworth, Fisher, and Karl Pearson. Also examined are the roles played by De Moivre, James Bernoulli, and Lagrange.
Table of Contents
BINOMIAL STATISTICAL INFERENCE | 9
De Moivre's Normal Approximation to the Binomial | 16
STATISTICAL INFERENCE BY INVERSE PROBABILITY | 30
The Multivariate Posterior Distribution | 67
Criticisms of Inverse Probability | 73
THE CENTRAL LIMIT THEOREM AND LINEAR MINIMUM VARIANCE ESTIMATION | 81
Gauss's Theory of Linear Minimum Variance Estimation | 93
ERROR THEORY, SKEW DISTRIBUTIONS, CORRELATION, SAMPLING DISTRIBUTIONS | 102
The Development of a Frequentist Error Theory | 105
Normal Correlation and Regression | 131
Sampling Distributions Under Normality, 1876-1908 | 149
Fisher's Early Papers, 1912-1921 | 159
The Revolutionary Paper, 1922 | 175
Studentization, the F Distribution, and the Analysis of Variance | 185
The Likelihood Function, Ancillarity, and Conditional Inference | 193
Other editions - View all
A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713 ... Anders Hald. No preview available - 2010
A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713 ... Anders Hald. No preview available - 2006
Common terms and phrases
analysis of variance, ancillary statistics, approximation, arithmetic mean, asymptotically normal, Bayes, Bernoulli, binomial, bivariate normal, central limit theorem, correlation coefficient, degrees of freedom, density, direct probability, Edgeworth, efficiency, error distribution, expansion, frequency curve, frequency function, frequentist, Galton, Gauss, Helmert, History of Statistics, inverse probability, Karl Pearson, Laplace, large-sample, least squares, likelihood function, linear model, mathematical statistics, maximum likelihood estimate, median, method of least squares, Moivre, normal distribution, orthogonal, Poisson, posterior distribution, posterior mode, probable value, R.A. Fisher, random variables, regression, sampling distribution, skew, standard deviation, statistical inference, Statistical Methods, sum of squares, theory, Thiele, transformation