Consider the entropy of continuous random variables. Whereas the ordinary entropy applies to a discrete random variable, the differential entropy applies to a continuous random variable. For a continuous random variable $X$ with density $f(x)$, the differential entropy is defined as
$$h(X) = -\int_{-\infty}^{\infty} f(x)\,\log_2 f(x)\,dx.$$
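The defining integral can be approximated numerically. The sketch below is illustrative only: the function name `differential_entropy`, the integration interval, and the midpoint-rule approximation are all choices made here, not part of the original text.

```python
import math

def differential_entropy(pdf, lo, hi, n=100_000):
    """Approximate h(X) = -integral of f(x) log2 f(x) dx (in bits)
    by a midpoint Riemann sum over [lo, hi], an interval assumed to
    carry essentially all of the probability mass."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx          # midpoint of the i-th subinterval
        f = pdf(x)
        if f > 0:                         # 0 * log 0 is taken as 0
            total -= f * math.log2(f) * dx
    return total
```

For example, a density equal to $1/2$ on $[0,2]$ yields $\log_2 2 = 1$ bit.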
Now consider calculating the differential entropy of some common random variables.
Consider a random variable $X$ uniformly distributed on the interval $[0,a]$ (the original bounds were lost in extraction; generic symbols are used here). Its density is $f(x) = 1/a$ for $0 \le x \le a$, and zero otherwise.
We can then find its differential entropy as follows:
$$h(X) = -\int_0^a \frac{1}{a}\,\log_2\frac{1}{a}\,dx = \log_2 a.$$
Note that, unlike the entropy of a discrete random variable, the differential entropy can be negative, e.g. when $a < 1$.
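The closed form $h(X) = \log_2 a$ can be checked against the integral directly. The value of `a` below is an arbitrary illustrative choice.

```python
import math

a = 8.0                                  # illustrative interval length
h_closed_form = math.log2(a)             # log2(8) = 3 bits

# Midpoint Riemann-sum check of -∫_0^a (1/a) log2(1/a) dx:
# the integrand is constant, so the sum reproduces log2(a).
n = 100_000
dx = a / n
h_numeric = sum(-(1 / a) * math.log2(1 / a) * dx for _ in range(n))
```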
Consider a normally distributed random variable $X$ with mean $\mu$ and variance $\sigma^2$. Its density is
$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(x-\mu)^2/(2\sigma^2)}.$$
We can then find its differential entropy as follows; first calculate $-\log_2 f(x)$:
$$-\log_2 f(x) = \frac{1}{2}\log_2(2\pi\sigma^2) + \frac{(x-\mu)^2}{2\sigma^2}\log_2 e.$$
Taking the expectation of both sides and using $\mathbb{E}\!\left[(X-\mu)^2\right] = \sigma^2$ gives
$$h(X) = \frac{1}{2}\log_2(2\pi\sigma^2) + \frac{1}{2}\log_2 e = \frac{1}{2}\log_2(2\pi e\sigma^2).$$
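The Gaussian result $h(X) = \tfrac{1}{2}\log_2(2\pi e\sigma^2)$ can likewise be verified numerically. The values of `mu` and `sigma` below, and the choice of a $\pm 10\sigma$ integration window, are illustrative assumptions.

```python
import math

mu, sigma = 0.0, 2.0                     # illustrative mean and std. deviation

def f(x):
    """Gaussian density with mean mu and variance sigma^2."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Closed form in bits: (1/2) log2(2*pi*e*sigma^2).
h_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# Midpoint Riemann sum of -f(x) log2 f(x) over mu ± 10 sigma,
# which captures essentially all of the Gaussian's mass.
lo, hi, n = mu - 10 * sigma, mu + 10 * sigma, 200_000
dx = (hi - lo) / n
h_numeric = 0.0
for i in range(n):
    x = lo + (i + 0.5) * dx
    fx = f(x)
    if fx > 0:
        h_numeric -= fx * math.log2(fx) * dx
```

The two values agree to high precision, confirming the derivation above.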
In the next section we list some properties of the differential entropy.