An Introduction to Information Theory
Courier Dover Publications, 1994 - 496 pages
Written for an engineering audience, this book has a threefold purpose: (1) to present elements of modern probability theory (discrete, continuous, and stochastic); (2) to present elements of information theory, with emphasis on its basic roots in probability theory; and (3) to present elements of coding theory.
The emphasis throughout the book is on such basic concepts as sets, the probability measure associated with sets, sample space, random variables, information measure, and capacity. The treatment proceeds from set theory to probability theory and then to information and coding theory. No formal prerequisites are required other than the usual undergraduate mathematics included in an engineering or science program. However, since these programs may not include a course in probability, the author provides an introductory treatment of probability for those who wish to pursue the general study of the statistical theory of communications.
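The information measure mentioned above is the Shannon entropy of a discrete source. As an illustrative sketch (not taken from the book), it can be computed in a few lines of Python:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))    # 1.0
print(entropy([0.25] * 4))    # 2.0
print(entropy([0.9, 0.1]))    # about 0.47
```

Entropy is maximized when outcomes are equiprobable, which is why the fair coin attains the full 1 bit.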
The book is divided into four parts: memoryless discrete schemes, the memoryless continuum, schemes with memory, and an outline of some recent developments. An appendix contains notes to help familiarize the reader with the literature in the field, while the inclusion of many reference tables and an extensive bibliography with some 200 entries makes this an excellent resource for any student in the field.