Elements of Information Theory
Wiley, August 26, 1991 · 576 pages

Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy and mutual information, the AEP, entropy rates of stochastic processes, data compression, and the duality of data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems.
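The early chapters the description mentions center on three quantities: entropy, relative entropy, and mutual information. As an illustrative sketch (not taken from the book's own exercises), the following Python snippet computes all three for small discrete distributions, using the identity I(X;Y) = D(p(x,y) || p(x)p(y)):

```python
import math

def entropy(p):
    # Shannon entropy H(p) in bits; terms with p_i = 0 contribute 0.
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    # Relative entropy D(p||q) in bits; assumes q_i > 0 wherever p_i > 0.
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A hypothetical 2x2 joint distribution of (X, Y), here chosen independent.
joint = [[0.25, 0.25],
         [0.25, 0.25]]
px = [sum(row) for row in joint]            # marginal of X
py = [sum(col) for col in zip(*joint)]      # marginal of Y
flat = [p for row in joint for p in row]    # p(x, y), row-major
prod = [a * b for a in px for b in py]      # p(x) * p(y), same order

# Mutual information I(X;Y) = D(p(x,y) || p(x)p(y))
mi = kl_divergence(flat, prod)

print(entropy([0.5, 0.5]))  # 1.0 bit for a fair coin
print(mi)                   # 0.0, since X and Y are independent here
```

A dependent joint distribution (e.g. probability mass only on the diagonal) would instead give I(X;Y) = H(X), the other extreme.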
Table of contents
Introduction and Preview — 1
Entropy, Relative Entropy and Mutual Information — 12
The Asymptotic Equipartition Property — 50
Copyright
23 other sections not shown
Common terms and phrases
achievable algorithm alphabet average binary symmetric channel broadcast channel calculate capacity region channel capacity Chapter codebook codeword codeword lengths coding theorem conditional entropy Consider convex corresponding data compression defined Definition denote density describe differential entropy doubling rate encoding entropy rate equal ergodic example Fano's inequality feedback finite Gaussian channel given H(X₁ halt Hence Huffman code I(X₁ independent information theory input integer jointly typical Kolmogorov complexity Kraft inequality large numbers Lemma log p(x Markov chain matrix maximizes maximum entropy minimal multiple access channel mutual information node optimal code output P₁ P₂ probability mass function probability of error problem Proof prove R₁ R₂ random variable rate distortion function receiver relative entropy sample sender side information source coding stationary distribution stochastic process string sufficient statistic symbols symmetric channel tree typical set uniquely decodable upper bound X₁ X₂ Y₁ Y₂