Elements of Information Theory
Description:
Entropy, relative entropy and mutual information, the asymptotic equipartition property, entropy rates of a stochastic process, data compression, gambling and data compression, Kolmogorov complexity, channel capacity, differential entropy, the Gaussian channel, maximum entropy and spectral estimation, information theory and statistics, rate distortion theory, network information theory, information theory and the stock market, inequalities in information theory.