An Introduction to Information Theory