Abstract: In the present communication, we have developed suitable constraints on the given mean codeword length and the measures of entropy. This development proves that Rényi's entropy gives the minimum value of the logarithm of the harmonic mean and the logarithm of the power mean. We have also developed an important relation between the best 1:1 code and the uniquely decipherable code by using different measures of entropy.
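The entropy–codeword-length relation for uniquely decipherable codes rests on the Kraft inequality and the noiseless-coding bound H(P) ≤ L < H(P) + 1. The following is a minimal numerical sketch of that bound; the distribution and the Shannon-code lengths are illustrative assumptions, not taken from the paper:

```python
import math

# Illustrative source distribution (an assumption, not from the paper).
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy H(P) = -sum_i p_i * log2(p_i).
H = -sum(pi * math.log2(pi) for pi in p)

# Shannon-code lengths l_i = ceil(-log2 p_i); these satisfy the
# Kraft inequality, so a uniquely decipherable (indeed prefix) code
# with exactly these lengths exists.
lengths = [math.ceil(-math.log2(pi)) for pi in p]
kraft_sum = sum(2.0 ** -l for l in lengths)

# Mean codeword length L = sum_i p_i * l_i obeys H(P) <= L < H(P) + 1.
L = sum(pi * li for pi, li in zip(p, lengths))

print(f"H = {H:.3f}, L = {L:.3f}, Kraft sum = {kraft_sum:.3f}")
```

For a dyadic distribution such as this one the bound is tight (L = H exactly), which is why it is a convenient sanity check.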
Abstract: The objective of the present communication is to develop new genuine exponentiated mean codeword lengths and to study in depth the problem of correspondence between well-known measures of entropy and mean codeword lengths. With the help of some standard measures of entropy, we have illustrated such a correspondence. In the literature, we frequently come across inequalities used in information theory. Keeping this idea in mind, we have developed such inequalities via a coding-theoretic approach.
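One standard correspondence of this kind is Campbell's: the exponentiated mean codeword length L(t) = (1/t) log2 Σ p_i 2^(t l_i) of any uniquely decipherable code is bounded below by the Rényi entropy of order α = 1/(1+t). The following numerical sketch checks this bound; the distribution, the parameter t, and the Shannon-code lengths are illustrative assumptions, not the paper's own constructions:

```python
import math

# Illustrative distribution and Campbell parameter (assumptions).
p = [0.5, 0.25, 0.125, 0.125]
t = 1.0
alpha = 1.0 / (1.0 + t)  # Renyi order matched to t

# Renyi entropy H_a(P) = (1/(1-a)) * log2 sum_i p_i^a.
renyi = math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

# Shannon-code lengths satisfy Kraft's inequality, hence are
# realizable by a uniquely decipherable code.
lengths = [math.ceil(-math.log2(pi)) for pi in p]

# Exponentiated (Campbell) mean length L(t) = (1/t) log2 sum_i p_i 2^(t l_i).
campbell = math.log2(
    sum(pi * 2.0 ** (t * li) for pi, li in zip(p, lengths))
) / t

print(f"H_alpha = {renyi:.3f} <= L(t) = {campbell:.3f}")
```

As t → 0, L(t) reduces to the ordinary mean codeword length and the Rényi order tends to 1, recovering the Shannon bound as a limiting case.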