Weighing the Odds: A Course in Probability and Statistics

Cambridge University Press, 2 August 2001. 547 pages.
Statistics do not lie, nor is probability paradoxical; you just need the right intuition. In this lively look at both subjects, David Williams convinces mathematics students of the intrinsic interest of statistics and probability, and statistics students that the language of mathematics can bring real insight and clarity to their subject. He helps students build the intuition needed, in a presentation enriched with examples drawn from all manner of applications, e.g., genetics, filtering, the Black–Scholes option-pricing formula, quantum probability and computing, and classical and modern statistical models. The statistics chapters present both the Frequentist and Bayesian approaches, emphasising Confidence Intervals rather than Hypothesis Tests, and include Gibbs-sampling techniques for the practical implementation of Bayesian methods. A central chapter gives the theory of Linear Regression and ANOVA, and explains how MCMC methods allow greater flexibility in modelling. C or WinBUGS code is provided for computational examples and simulations. Many exercises are included; hints or solutions are often provided.
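The description above notes that the book supplies C or WinBUGS code for its simulations and uses Gibbs sampling to put Bayesian methods into practice. Purely as an illustration of the kind of simulation code involved (a minimal sketch, not taken from the book: the bivariate-Normal target, the correlation value, the iteration count and the Box–Muller generator are all assumptions made here for the example), a toy Gibbs sampler in C might read:

/* Toy Gibbs sampler for a bivariate Normal with correlation rho.
 * Illustrative sketch only; not code from the book. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define TWO_PI 6.283185307179586

/* Standard Normal variate via the Box-Muller transform. */
static double std_normal(void)
{
    double u1 = (rand() + 1.0) / (RAND_MAX + 2.0); /* in (0,1), avoids log(0) */
    double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(TWO_PI * u2);
}

int main(void)
{
    const double rho = 0.8;                   /* assumed target correlation */
    const double sd  = sqrt(1.0 - rho * rho); /* s.d. of each full conditional */
    const int n_iter = 10000;                 /* number of Gibbs sweeps */
    double x = 0.0, y = 0.0;                  /* arbitrary starting point */
    int i;

    srand(2001);
    for (i = 0; i < n_iter; i++) {
        /* Alternate draws from the full conditionals:
           X | Y=y ~ N(rho*y, 1-rho^2),  Y | X=x ~ N(rho*x, 1-rho^2). */
        x = rho * y + sd * std_normal();
        y = rho * x + sd * std_normal();
        printf("%f %f\n", x, y);
    }
    return 0;
}

Each sweep updates x and then y from its conditional distribution given the other; after a burn-in the printed pairs behave approximately like draws from the joint Normal distribution, which is the essence of the Gibbs-sampling idea developed in the book.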
 

Table of Contents

1 Introduction   1
1.2 Sharpening our intuition   18
1.3 Probability as Pure Maths   23
1.4 Probability as Applied Maths   25
1.5 First remarks on Statistics   26
1.6 Use of Computers   31
2 Events and Probabilities   35
2.1 Possible outcome, actual outcome, and Events   36
2.2 Probabilities   39
2.3 Probability and Measure   42
3 Random Variables, Means and Variances   47
3.2 DFs, pmfs and pdfs   50
3.3 Means in the case when Ω is finite   56
3.4 Means in general   59
3.5 Variances and Covariances   65
4 Conditioning and Independence   73
4.2 Independence   96
4.3 Laws of large numbers   103
4.4 … a first look   116
4.5 A simple strong Markov principle   127
4.6 Simulation of IID sequences   130
5 Generating Functions and the Central Limit Theorem   141
5.1 General comments on the use of Generating Functions (GFs)   142
5.2 Probability generating functions (pgfs)   143
5.3 Moment Generating Functions (MGFs)   146
5.4 The Central Limit Theorem (CLT)   156
5.5 Characteristic Functions (CFs)   166
6 Confidence Intervals for one-parameter models   169
6.2 Some common-sense Frequentist CIs   173
6.3 Likelihood; sufficiency; exponential family   181
6.4 Brief notes on Point Estimation   187
6.5 … associated CIs   192
6.6 Bayesian Confidence Intervals   199
6.7 Hypothesis Testing, if you must   222
7 Conditional pdfs and multiparameter Bayesian Statistics   240
7.2 Jacobians   243
7.3 Joint pdfs; transformations   246
7.4 Conditional pdfs   258
7.5 Multiparameter Bayesian Statistics   265
8 Linear Models, ANOVA, etc.   283
8.2 The Orthonormality Principle and the F-test   295
8.3 … the Mathematics   304
8.4 Goodness of fit; robustness; hierarchical models   339
8.5 Multivariate Normal (MVN) Distributions   365
9 Some further Probability   383
9.1 Conditional Expectation   385
9.2 Martingales   406
9.3 Poisson Processes (PPs)   426
10 Quantum Probability and Quantum Computing   440
10.1 … a first look   441
10.2 Foundations of Quantum Probability   448
10.3 … a closer look   463
10.4 Spin and Entanglement   472
10.5 Spin and the Dirac equation   485
10.6 Epilogue   491
Some Prerequisites and Addenda   495
Appendix A3 o-notation   496
Appendix A5 The Axiom of Choice   498
Appendix A6 A non-Borel subset of [0,1]   499
Appendix A8 A non-uniqueness example for moments   500
Appendix A9 Proof of a Two-Envelopes result   501
Discussion of some Selected Exercises   502
Tables   514
Table of the normal distribution function   515
Upper percentage points for t   516
Upper percentage points for χ²   517
Upper 5% points for F   518
A small Sample of the Literature   519
Bibliography   525
Index   539
