Entropy and Information Theory

ISBN-10: 3540973710
ISBN-13: 9783540973713
Author(s): Robert M. Gray
Publisher: Springer-Verlag
Format: Hardcover

Description:

This text is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, and discrimination (relative entropy), along with the limiting normalized versions of these quantities, such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behaviour of sample information and expected information.
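For readers unfamiliar with the quantities named above, the standard discrete-alphabet definitions are sketched below; these are the conventional forms of entropy, mutual information, relative entropy, and entropy rate, not excerpts from the book, whose development is more general.

H(X) = -\sum_{x} p(x) \log p(x)

I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

D(P \| Q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}

\bar{H}(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \dots, X_n) \quad \text{(entropy rate)}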
