Solved – What are the units of entropy of a normal distribution

entropy, information-theory, mathematical-statistics, normal-distribution, units

I have a random process that follows a normal distribution. Its parameters are mean = 35 units, std. dev. = 8 units. I've seen from the Wikipedia entry for the normal distribution that there is a formula to calculate its entropy, so plugging in the figures:
$$
\frac{1}{2}\log\left(2\pi e\cdot 8^2\right)
$$

I get a value of 1.52, which I take to be per sample. My question is: what are these units? What thing do I have 1.52 of?
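For reference, the figure of 1.52 is what the formula yields when the logarithm is taken in base 10:
$$
\frac{1}{2}\log_{10}\left(2\pi e\cdot 8^2\right)=\frac{1}{2}\log_{10}(1093.09\ldots)\approx 1.52
$$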

Information entropy is typically measured in units of bits, after Claude Shannon's definition. So can I take it that each sample generates 1.52 bits of entropy? Clearly, recording those samples generates information and therefore occupies a real, discrete amount of storage space; ergo, entropy cannot be unitless.

Best Answer

Shannon entropy is normally given "units" of bits or nats in information theory; which one applies is set by the base of the logarithm (base 2 gives bits, base e gives nats, and base 10 gives hartleys, also called bans). Information theory includes the measurable concept of compression. Define a compression ratio as (ADC sample size) / (Shannon entropy of the sample set); the numerator and the denominator are both a number of bits. The Shannon entropy of the sample set gives the smallest average number of bits per sample that could be achieved by entropy coding the sample set, for example with Huffman coding. This context justifies applying the term "bits" to Shannon entropy. Note that the entropy used in thermodynamics should not be confused with the Shannon entropy used in information theory.
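To make the base-dependence concrete, here is a minimal Python sketch (my illustration, not part of the original answer) that evaluates the normal distribution's differential entropy $\frac{1}{2}\log\left(2\pi e\sigma^2\right)$ for the questioner's σ = 8 in each base:

```python
import math

# Differential entropy of a normal distribution: 0.5 * log(2*pi*e*sigma^2).
# The unit is set by the base of the logarithm; the mean (35 here) does not
# enter the formula at all.
sigma = 8.0
arg = 2 * math.pi * math.e * sigma**2

h_nats = 0.5 * math.log(arg)        # base e  -> nats,     ~3.50
h_bits = 0.5 * math.log2(arg)       # base 2  -> bits,     ~5.05
h_hartleys = 0.5 * math.log10(arg)  # base 10 -> hartleys, ~1.52

print(f"{h_nats:.4f} nats")
print(f"{h_bits:.4f} bits")
print(f"{h_hartleys:.4f} hartleys")
```

The base-10 value reproduces the 1.52 in the question, so that figure is in hartleys rather than bits; computed in base 2, the same distribution has about 5.05 bits of entropy per sample. (If SciPy is available, `scipy.stats.norm(35, 8).entropy()` should return the matching value in nats.)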
