Mathematical Foundations of Information Theory
Courier Corporation, January 1, 1957 · 120 pages

The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and fundamental theorems, as well as ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition.
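The entropy concept central to the book can be illustrated with a minimal sketch: the Shannon entropy of a finite scheme (a finite set of outcomes with probabilities summing to one). The function name `entropy` and its interface are illustrative choices, not drawn from the text.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log(p_i)) of a finite scheme.

    Terms with p_i = 0 contribute nothing, by the usual
    convention 0 * log 0 = 0.
    """
    if not math.isclose(sum(probs), 1.0):
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform scheme over 4 outcomes has maximal uncertainty: 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))
# A degenerate scheme (one certain outcome) has zero uncertainty.
print(entropy([1.0, 0.0]))
```

Entropy is maximized by the uniform distribution and vanishes when one outcome is certain, which is the starting point for the coding theorems the book develops.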
Other editions
Mathematical Foundations of Information Theory, Aleksandr I︠A︡kovlevich Khinchin, 1957