Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing)
Description:
Following a brief introduction and overview, the early chapters cover the basic algebraic relationships of entropy, relative entropy, and mutual information; the asymptotic equipartition property (AEP); entropy rates of stochastic processes; data compression; and the duality between data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems.
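For readers new to the subject, the quantities named above have standard textbook definitions; the following are the conventional forms (stated here for orientation, not quoted from the book's own text):

\[
H(X) = -\sum_{x} p(x)\log p(x) \quad\text{(entropy)}
\]
\[
D(p\,\|\,q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)} \quad\text{(relative entropy)}
\]
\[
I(X;Y) = H(X) - H(X\mid Y) \quad\text{(mutual information)}
\]
\[
C = \max_{p(x)} I(X;Y) \quad\text{(channel capacity)}
\]
\[
C = \tfrac{1}{2}\log\!\left(1 + \frac{P}{N}\right) \quad\text{(Gaussian channel with power constraint } P \text{ and noise variance } N\text{)}
\]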