Elements of Information Theory
Wiley, August 26, 1991 - 542 pages

Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy and mutual information, the AEP, entropy rates of stochastic processes, data compression, and the duality between data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before the problems themselves are presented.
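As a quick orientation to the quantities named above, the standard definitions are given below. These are not excerpted from the book, though they match the conventions its early chapters use (discrete random variables with probability mass functions p and q):

```latex
\begin{align}
  H(X)    &= -\sum_{x} p(x)\log p(x)
             && \text{entropy} \\
  D(p\|q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}
             && \text{relative entropy} \\
  I(X;Y)  &= \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}
           = H(X) - H(X\mid Y)
             && \text{mutual information}
\end{align}
```

Mutual information is thus the relative entropy between the joint distribution and the product of the marginals, which is the algebraic relationship the description refers to.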
Table of Contents
Entropy, Relative Entropy and Mutual Information | 10 |
The Asymptotic Equipartition Property | 50 |
Entropy Rates of a Stochastic Process | 60 |
Copyright
21 other sections not shown
Common terms and phrases
achievable algorithm average binary symmetric channel bits broadcast channel calculate capacity region channel capacity Chapter codebook codeword codeword lengths coding theorem conditional consider constraint convex corresponding data compression defined Definition denote describe differential entropy doubling rate drawn i.i.d. encoding entropy rate equal ergodic example Fano's inequality feedback finite Gaussian channel halt Hence Huffman code I(X₁ IEEE Trans independent information theory input joint distribution jointly typical Kolmogorov complexity Kraft inequality large numbers law of large Lemma log p(x lower bound Markov chain matrix maximizes maximum entropy minimize multiple access channel mutual information node optimal output P₁ P₂ probability mass function probability of error problem proof prove R₁ R₂ random variable rate distortion function receiver relative entropy satisfies sender Shannon side information source coding stationary string symbols typical sequences typical set uniquely decodable upper bound W₂ X₁ X₂ Y₁ Y₂