
Entropy is a key measure in information theory and its many applications. Campbell was the first to observe that the exponential of Shannon's entropy equals the size of the sample space when the distribution is uniform. This motivates the study of exponentials of Shannon's entropy and of those generalizations of it that involve a logarithmic function, for an arbitrary probability distribution. In this paper we introduce a measure of a sample space, called the 'entropic measure of a sample space', taken with respect to the underlying distribution. We show, in both the discrete and continuous cases, that this new measure depends on the parameters of the distribution on the sample space: the same sample space has different entropic measures under different distributions defined on it. We also note that Campbell's idea applies to Rényi's parametric entropy of a given order. Since parameters provide additional flexibility and extended applications, the paper studies parametric entropic measures of sample spaces as well. Exponential entropies corresponding to Shannon's entropy and to its generalizations that involve logarithmic functions, i.e. the additive entropies, have been studied for wider understanding and application. We propose and study exponential entropies corresponding to the non-additive entropies of type (α, β), which include the Havrda-Charvát entropy as a special case.
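Campbell's observation can be checked numerically. The sketch below (illustrative only; function names are my own) computes the exponential of Shannon's entropy and of Rényi's entropy of order α for a finite distribution, confirming that both equal the sample-space size n for the uniform distribution and fall below n otherwise:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha: (1/(1-alpha)) log(sum p_i^alpha)."""
    assert alpha > 0 and alpha != 1
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def exp_entropy(p):
    """Exponential (Shannon) entropy exp(H(p))."""
    return math.exp(shannon_entropy(p))

def renyi_exp_entropy(p, alpha):
    """Exponential of the Rényi entropy of order alpha."""
    return math.exp(renyi_entropy(p, alpha))

n = 4
uniform = [1 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]

print(exp_entropy(uniform))           # 4.0: equals the sample-space size
print(renyi_exp_entropy(uniform, 2))  # 4.0: same for Rényi of any order
print(exp_entropy(skewed))            # about 2.56: fewer "effective" outcomes
```

The skewed example shows the dependence on the distribution: the same four-point sample space carries a smaller entropic measure when probability mass concentrates on one outcome.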

[1] L. L. Campbell, "Exponential entropy as a measure of extent of distribution," Z. Wahrscheinlichkeitstheorie verw. Gebiete, vol. 5, pp. 217-225, 1966.

[2] J. Havrda and F. Charvát, "Quantification method of classification processes: concept of structural α-entropy," Kybernetika, vol. 3, pp. 30-35, 1967.

[3] S. W. Golomb, "The information generating function of a probability distribution," IEEE Transactions on Information Theory, vol. 12, pp. 75-77, 1966.

[4] J. N. Kapur, Measures of Information and Their Applications, 1st ed., New Delhi: Wiley Eastern Limited, 1994.

[5] T. Koski and L. E. Persson, "Some properties of generalized exponential entropies with applications to data compression," Information Sciences, vol. 62, pp. 103-132, 1992.

[6] S. Nadarajah and K. Zografos, "Formulas for Rényi information and related measures for univariate distributions," Information Sciences, vol. 155, pp. 119-138, 2003.

[7] F. Nielsen and R. Nock, "On Rényi and Tsallis entropies and divergences for exponential families," 2011. http://arxiv.org/abs/1105.3259v1.

[8] A. Rényi, "On measures of entropy and information," in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1961, pp. 547-561.

[9] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379-423; 623-656, 1948.

[10] B. D. Sharma and I. J. Taneja, "Three generalized-additive measures of entropy," E.I.K. (Germany), vol. 13, pp. 419-433, 1977.