A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713-1935

Springer Science & Business Media, August 24, 2008 - 225 pages

This book offers a detailed history of parametric statistical inference. Covering the period between James Bernoulli and R.A. Fisher, it examines: binomial statistical inference; statistical inference by inverse probability; the central limit theorem and linear minimum variance estimation by Laplace and Gauss; error theory, skew distributions, correlation, and sampling distributions; and the Fisherian Revolution. Lively biographical sketches of many of the main characters are featured throughout, including Laplace, Gauss, Edgeworth, Fisher, and Karl Pearson. Also examined are the roles played by de Moivre, James Bernoulli, and Lagrange.

 


Table of Contents

1. The Three Revolutions in Parametric Statistical Inference ... 1
  1.1 Laplace, 1774-1786 ... 2
  1.2 Gauss and Laplace, 1809-1828 ... 4
  1.3 R.A. Fisher, 1912-1956 ... 6

BINOMIAL STATISTICAL INFERENCE: The Three Pioneers: Bernoulli (1713), de Moivre (1733), and Bayes (1764) ... 9

2. James Bernoulli's Law of Large Numbers for the Binomial, 1713, and Its Generalization ... 11
  2.2 Remarks on Further Developments ... 14
3. De Moivre's Normal Approximation to the Binomial, 1733, and Its Generalization ... 16
  3.2 Lagrange's Multivariate Normal Approximation to the Multinomial and His Confidence Interval for the Binomial Parameter, 1776 ... 22
  3.3 De Morgan's Continuity Correction, 1838 ... 24
4. Bayes's Posterior Distribution of the Binomial Parameter and His Rule for Inductive Inference, 1764 ... 25
  4.2 Bayes's Rule for Inductive Inference, 1764 ... 27

STATISTICAL INFERENCE BY INVERSE PROBABILITY: Inverse Probability from Laplace (1774) and Gauss (1809) to Edgeworth (1909) ... 30

5. Laplace's Theory of Inverse Probability, 1774-1786 ... 31
  5.2 The Principle of Inverse Probability and the Symmetry of Direct and Inverse Probability, 1774 ... 35
  5.3 Posterior Consistency and Asymptotic Normality in the Binomial Case, 1774 ... 38
  5.4 The Predictive Distribution, 1774-1786 ... 40
  5.5 A Statistical Model and a Method of Estimation: The Double Exponential Distribution, 1774 ... 41
  5.6 The Asymptotic Normality of Posterior Distributions, 1785 ... 44
6. A Nonprobabilistic Interlude: The Fitting of Equations to Data, 1750-1805 ... 47
  6.2 The Method of Averages by Mayer, 1750, and Laplace, 1788 ... 48
  6.3 The Method of Least Absolute Deviations by Boscovich, 1757, and Laplace, 1799 ... 50
  6.4 The Method of Least Squares by Legendre, 1805 ... 52
7. Gauss's Derivation of the Normal Distribution and the Method of Least Squares, 1809 ... 55
  7.2 Gauss's Derivation of the Normal Distribution, 1809 ... 57
  7.3 Gauss's First Proof of the Method of Least Squares, 1809 ... 58
  7.4 Laplace's Large-Sample Justification of the Method of Least Squares, 1810 ... 60
8. Credibility and Confidence Intervals by Laplace and Gauss ... 62
  8.2 Laplace's General Method for Constructing Large-Sample Credibility and Confidence Intervals, 1785 and 1812 ... 64
  8.4 Gauss's Rule for Transformation of Estimates and Its Implication for the Principle of Inverse Probability, 1816 ... 65
  8.5 Gauss's Shortest Confidence Interval for the Standard Deviation of the Normal Distribution, 1816 ... 66
9. The Multivariate Posterior Distribution ... 67
  9.2 Pearson and Filon's Derivation of the Multivariate Posterior Distribution, 1898 ... 68
10. Edgeworth's Genuine Inverse Method and the Equivalence of Inverse and Direct Probability in Large Samples, 1908 and 1909 ... 69
  10.3 Edgeworth's Genuine Inverse Method, 1908 and 1909 ... 71
11. Criticisms of Inverse Probability ... 73
  11.2 Poisson ... 75
  11.3 Cournot ... 76
  11.4 Ellis, Boole, and Venn ... 77
  11.5 Bing and von Kries ... 79
  11.6 Edgeworth and Fisher ... 80

THE CENTRAL LIMIT THEOREM AND LINEAR MINIMUM VARIANCE ESTIMATION BY LAPLACE AND GAUSS ... 81

12. Laplace's Central Limit Theorem and Linear Minimum Variance Estimation ... 83
  12.2 Linear Minimum Variance Estimation, 1811 and 1812 ... 85
  12.3 Asymptotic Relative Efficiency of Estimates, 1818 ... 88
  12.4 Generalizations of the Central Limit Theorem ... 90
13. Gauss's Theory of Linear Minimum Variance Estimation ... 93
  13.2 Estimation Under Linear Constraints, 1828 ... 96
  13.3 A Review of Justifications for the Method of Least Squares ... 97
  13.4 The State of Estimation Theory About 1830 ... 98

ERROR THEORY, SKEW DISTRIBUTIONS, CORRELATION, SAMPLING DISTRIBUTIONS ... 102

14. The Development of a Frequentist Error Theory ... 105
  14.2 Hagen's Hypothesis of Elementary Errors and His Maximum Likelihood Argument, 1837 ... 106
  14.3 Frequentist Theory: Chauvenet, 1863, and Merriman, 1884 ... 108
15. Skew Distributions and the Method of Moments ... 110
  15.2 Series Expansions of Frequency Functions: The A and B Series ... 112
  15.3 Biography of Karl Pearson ... 117
  15.4 Pearson's Four-Parameter System of Continuous Distributions, 1895 ... 120
  15.5 Pearson's χ² Test for Goodness of Fit, 1900 ... 123
  15.6 The Asymptotic Distribution of the Moments by Sheppard, 1899 ... 125
  15.7 Kapteyn's Derivation of Skew Distributions, 1903 ... 126
16. Normal Correlation and Regression ... 131
  16.2 Galton's Empirical Investigations of Regression and Correlation, 1869-1890 ... 134
  16.3 The Mathematization of Galton's Ideas by Edgeworth, Pearson, and Yule ... 141
  16.4 Orthogonal Regression: The Orthogonalization of the Linear Model ... 146
17. Sampling Distributions Under Normality, 1876-1908 ... 148
  17.3 Pizzetti's Orthonormal Decomposition of the Sum of Squared Errors in the Linear-Normal Model, 1892 ... 153
  17.4 Student's t Distribution by Gosset, 1908 ... 154

THE FISHERIAN REVOLUTION, 1912-1935 ... 157

18. Fisher's Early Papers, 1912-1921 ... 159
  18.2 Fisher's Absolute Criterion, 1912 ... 163
  18.3 The Distribution of the Correlation Coefficient, 1915, Its Transform, 1921, with Remarks on Later Results on Partial and Multiple Correlation ... 165
  18.4 The Sufficiency of the Sample Variance, 1920 ... 172
19. The Revolutionary Paper, 1922 ... 174
  19.2 Properties of the Maximum Likelihood Estimate ... 178
  19.3 The Two-Stage Maximum Likelihood Method and Unbiasedness ... 182
20. Studentization, the F Distribution, and the Analysis of Variance, 1922-1925 ... 185
  20.2 The F Distribution ... 187
  20.3 The Analysis of Variance ... 188
21. The Likelihood Function, Ancillarity, and Conditional Inference ... 193
  21.2 Ancillarity and Conditional Inference ... 194
  21.4 The Likelihood Function ... 195

References ... 198
Subject Index ... 217
Author Index ... 221

About the author (2008)

ANDERS HALD, now retired, was Professor of Statistics at the University of Copenhagen from 1948 to 1982. His previous books include A History of Probability and Statistics and Their Applications before 1750, the companion volume to this history. He is an honorary Fellow of the Royal Statistical Society of London, the Danish Statistical Society, and the Danish Society for Industrial Quality Control.
