File Exchange


Entropy_vs_Extension

version 1.0.0.0 (826 Bytes) by Abdelrahman Marconi
Simulation of Entropy versus the theoretical and simulated average code length

0 Downloads

Updated 24 Mar 2014


This theorem provides another justification for the definition of entropy rate: it is the expected number of bits per symbol required to describe the process.

Finally, we ask what happens to the expected description length if the code is designed for the wrong distribution. For example, the wrong distribution may be the best estimate that we can make of the unknown true distribution. We consider the Shannon code assignment l(x) = ⌈log 1/q(x)⌉ designed for the probability mass function q(x). Suppose that the true probability mass function is p(x). Then we will not achieve expected length L ≈ H(p) = −Σ p(x) log p(x). We now show that the increase in expected description length due to the incorrect distribution is the relative entropy D(p||q). Thus, D(p||q) has a concrete interpretation as the increase in descriptive complexity due to incorrect information.
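The submission itself is MATLAB (the 826-byte file is not shown here), but the claim above can be checked numerically with a short sketch in any language. The Python snippet below, a minimal illustration rather than the author's code, computes H(p), D(p||q), and the expected Shannon code length L = Σ p(x) ⌈log 1/q(x)⌉ for a hypothetical dyadic p and uniform q, and confirms the bound H(p) + D(p||q) ≤ L < H(p) + D(p||q) + 1.

```python
import math

def entropy(p):
    # H(p) = -sum p(x) log2 p(x), in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    # D(p||q) = sum p(x) log2 (p(x)/q(x)), in bits
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def expected_shannon_length(p, q):
    # Shannon code lengths l(x) = ceil(log2 1/q(x)) designed for q,
    # averaged under the true distribution p
    return sum(pi * math.ceil(math.log2(1 / qi))
               for pi, qi in zip(p, q) if pi > 0)

# Example distributions (assumed for illustration):
# true p is dyadic, the code is mistakenly designed for uniform q
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25, 0.25, 0.25, 0.25]

H = entropy(p)                    # 1.75 bits
D = kl_divergence(p, q)           # 0.25 bits
L = expected_shannon_length(p, q) # 2.0 bits: every codeword is 2 bits long

print(f"H(p) = {H}, D(p||q) = {D}, L = {L}")
```

Here the penalty is exact: designing for the uniform q costs D(p||q) = 0.25 extra bits per symbol over the 1.75-bit entropy of p, and the bound H + D ≤ L < H + D + 1 holds with L = 2.0.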

Comments and Ratings (0)

MATLAB Release Compatibility
Created with R2013b
Compatible with any release
Platform Compatibility
Windows macOS Linux