Elements of Information Theory
Description:
Entropy, relative entropy, and mutual information; the asymptotic equipartition property; entropy rates of a stochastic process; data compression; gambling and data compression; Kolmogorov complexity; channel capacity; differential entropy; the Gaussian channel; maximum entropy and spectral estimation; information theory and statistics; rate distortion theory; network information theory; information theory and the stock market; and inequalities in information theory.
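As a small illustration of the book's opening topic, here is a minimal Python sketch (not taken from the book; the function name and example distributions are illustrative assumptions) that computes the Shannon entropy H(X) = -sum p(x) log2 p(x) of a finite distribution:

    import math

    def entropy(p, base=2):
        # Shannon entropy of a distribution given as a list of probabilities.
        # Terms with p(x) = 0 contribute nothing, by the convention 0 log 0 = 0.
        return -sum(px * math.log(px, base) for px in p if px > 0)

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))  # biased coin: about 0.469 bits

The fair coin attains the maximum entropy for a binary source (1 bit), while the biased coin is more predictable and therefore carries less information per outcome.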