An Introduction to Information Theory. McGraw-Hill, 1961. 496 pages.
Table of Contents
PREFACE | 1
Basic Concepts of Probability | 19
Copyright