

MATHEMATICAL MODELING OF AVERAGE WEIGHTED RENYI'S ENTROPY MEASURE

Savita, Rajeev Kumar

Department of Mathematics, Maharishi Dayanand University, Rohtak. savitanain25@gmail.com, profrajeevmdu@gmail.com

Abstract

A weighted entropy measure of information is provided by a probabilistic experiment whose basic events are described by their objective probabilities and some qualitative (objective or subjective) weights. Weighted entropy has also been applied to balance the amount of information and the degree of homogeneity associated with a partition of data into classes. These measures have tremendous applications and are found to be quite helpful in many fields. In the present paper, a new weighted Renyi's entropy measure is proposed for discrete distributions when probabilities are unknown and weights are known. The various characteristics of the measure are investigated, and the measure is also studied for a particular case. Finally, numerical computation and graphical analysis are carried out. Based on the graphical analysis, it is concluded that the proposed measure varies with the values of the weights and is concave in nature. The developed weighted information measure is useful for discrete distributions when probabilities are unknown and weights are known.

Keywords: Shannon entropy, Renyi entropy, Weighted entropy, Symmetry, Concavity.

I. Introduction

Shannon's [22] entropy is widely prevalent in the study of probabilistic phenomena pertaining to a broad spectrum of problems. It was given by Shannon as a mathematical function to measure the uncertainty involved in a probabilistic experiment. Shannon entropy for a discrete source is defined as follows:

Let $X$ be a probabilistic experiment with sample space $X$ and probability distribution $P$, where $p(x_i)$, or $p_i$, is the probability of the outcome $x_i \in X$. Then the average amount of information is given by

$$H(P) = -\sum_{i=1}^{n} p(x_i)\log p(x_i) = -\sum_{i=1}^{n} p_i \ln p_i \qquad (1)$$
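For illustration, (1) can be evaluated numerically; the following is a minimal Python sketch (the function name and the use of NumPy are illustrative choices, not part of the original formulation):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (1) in nats: H(P) = -sum_i p_i ln p_i."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 ln 0 = 0
    return -np.sum(p * np.log(p))

print(shannon_entropy([0.5, 0.5]))  # ln 2 ~ 0.6931 nats for a fair coin
```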

Renyi [19] introduced a flexible extension of Shannon entropy, a one-parameter generalization of it, which is widely used in the analysis of quantum systems and as a measure of randomness. Renyi defined entropy as

$$H_{\alpha}(P) = \frac{1}{1-\alpha}\,\log\!\left(\frac{\sum_{i=1}^{n} p_i^{\alpha}}{\sum_{i=1}^{n} p_i}\right), \qquad \alpha \neq 1,\ \alpha > 0 \qquad (2)$$


which is also known as Renyi's entropy measure of order $\alpha$. Notice that Shannon's entropy measure (1) is the limiting case of Renyi's entropy measure as $\alpha \to 1$.
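A short sketch (again with illustrative names) evaluates (2) and checks the limiting behaviour numerically:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy (2) of order alpha (alpha != 1, alpha > 0)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p**alpha) / np.sum(p)) / (1.0 - alpha)

p = [0.2, 0.3, 0.5]
print(renyi_entropy(p, 0.5), renyi_entropy(p, 2.0))  # orders 1/2 and 2
# As alpha -> 1 the value approaches the Shannon entropy of p (~1.0297 nats).
print(renyi_entropy(p, 1.0 + 1e-6))
```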

The probabilistic measure of entropy (1) possesses a number of interesting properties. After the development of this measure, researchers recognized its potential as an important quantity in many fields, from probability theory to engineering, ecology, and neuroscience. On this basis, a large number of other information-theoretic measures have also been derived.

Occasionally, standard distributions in statistical modelling are not appropriate for our data and we need to study weighted distributions. This concept has been applied in a variety of statistical fields, including family size analysis, human heredity, wildlife population study, renewal theory, biomedical science and statistical ecology.

Fisher [5] originated the concept of a weighted distribution from his research into the impact of ascertainment methods on frequency estimates. Rao [17,18] expanded on Fisher's core ideas by discussing the need for a unifying concept and identifying several sampling scenarios that can be modelled by what he called weighted distributions. A particular case of the weighted distribution is the size-biased distribution. These distributions naturally appear in real-world situations when observations from a sample are recorded with unequal probability. The utility distribution $W = (w_1, w_2, \ldots, w_n)$, where each $w_i$ is a non-negative real number, was proposed by Belis and Guiasu [1] to measure the utility aspect of the outcomes.

The applications of the measure to the theory of questionnaires were given by Guiasu and Picard [7]. Longo [12] applied this useful measure to coding theory. Moreover, in many situations there is a need for a measure of uncertainty of a distribution whenever the probabilities are unknown but the weights for each value of the random variable are known, i.e., if $X$ is a discrete random variable taking values $x_1, x_2, \ldots, x_n$ with weights $w_1, w_2, \ldots, w_n$ respectively, while $p_1, p_2, \ldots, p_n$ are unknown. Patsakis et al. [16] gave applications of the measure in security quantification.

The suitable generalization of classical entropy is the weighted entropy, which was proposed by Belis and Guiasu [1] and Guiasu [6]. A detailed discussion of weighted entropies has been given by Suhov and Sekeh [24]. Mahdy [13] studied weighted entropy measures and their application in reliability theory and stochastic orders. Using two weighted entropy measures, Singh et al. [23] provided applications of Holder's inequality to coding theory. Some other substantial measures of weighted entropy introduced by Kapur [9] are as under:

$$\bullet \quad H_{\alpha}(P:W) = -\sum_{i=1}^{n} w_i\, p_i^{\alpha} \ln p_i, \qquad \tfrac{1}{2} \le \alpha \le 1 \qquad (3)$$

$$\bullet \quad H_{\alpha}(P:W) = -\sum_{i=1}^{n} w_i\, p_i \ln p_i + \frac{1}{\alpha}\sum_{i=1}^{n} w_i\,(1+\alpha p_i)\ln(1+\alpha p_i) - \frac{w_{\min}}{\alpha}\,(1+\alpha)\ln(1+\alpha)\sum_{i=1}^{n} p_i \qquad (4)$$

where $w_{\min} = \min(w_1, w_2, \ldots, w_n)$.
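A minimal numerical sketch of measure (3) as displayed above (Python; the function name is an illustrative choice):

```python
import numpy as np

def kapur_weighted_entropy(p, w, alpha):
    """Kapur's weighted measure (3): -sum_i w_i p_i^alpha ln p_i."""
    p = np.asarray(p, dtype=float)
    w = np.asarray(w, dtype=float)
    return -np.sum(w * p**alpha * np.log(p))

print(kapur_weighted_entropy([0.2, 0.3, 0.5], [1.0, 2.0, 3.0], 0.75))
```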

Some characterizations and generalizations of the weighted measure have been provided by Longo [12], Hooda and Tuteja [8], Taneja and Hooda [25], Parkash and Taneja [15], Taneja and Tuteja [26], Kapur [9], Kumar et al. [10, 11], Endo and Kudo [4], Mohammadi [14], Savita and Kumar [21], Bhat and Pundir [2], Sahni and Kumar [20], etc. Thus, weighted measures of information find tremendous applications and are quite helpful to researchers in many fields.

II. Average Weighted Entropy Measure

Average weighted Renyi's entropy measure is proposed as under:

$$H_1(w) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{\sum_{i=1}^{n} w_i^{\alpha}}{\sum_{i=1}^{n} w_i}\right) \qquad (5)$$

where $w_i > 0$ and $\sum_{i=1}^{n} w_i = w$ (constant).
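A minimal computational sketch of (5) (Python; the function name h1 is an illustrative choice):

```python
import numpy as np

def h1(w, alpha):
    """Average weighted Renyi entropy (5): ln(sum w_i^alpha / sum w_i) / (1 - alpha)."""
    w = np.asarray(w, dtype=float)
    return np.log(np.sum(w**alpha) / np.sum(w)) / (1.0 - alpha)

print(h1([50, 50, 50], alpha=3))  # -3.9120, the value listed in Table 1 below
```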

The important characteristics of the measure (5) are investigated as under:

• It is a continuous and non-increasing function of $\alpha$.

• It is a permutationally symmetric function of $w_1, w_2, \ldots, w_n$, i.e., it does not change when $w_1, w_2, \ldots, w_n$ are permuted among themselves.

• $H_1(w)$ is non-negative for $\alpha < 0$ and negative for $\alpha > 0$.

• Expansible property: This property is satisfied by the measure (5); it states that the entropy does not change on the addition of a weight with zero value:

$$H_1(w_1, w_2, \ldots, w_n, 0) = H_1(w_1, w_2, \ldots, w_n).$$

Here we use the convention $0^{\alpha} = 0$ for all real values of $\alpha$.
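These properties can be checked numerically; a short sketch (reusing the illustrative h1 function from above) follows:

```python
import numpy as np

# h1 as defined in the sketch above
h1 = lambda w, a: np.log(np.sum(np.asarray(w, float)**a) / np.sum(w)) / (1.0 - a)

w = [5.0, 50.0, 95.0]
print(h1(w, 3), h1([95.0, 5.0, 50.0], 3))       # symmetry: both -4.3936
print(h1(w, 2) >= h1(w, 3))                     # non-increasing in alpha: True
print(np.isclose(h1(w + [0.0], 3), h1(w, 3)))   # expansibility (0**3 = 0): True
```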

• The maximum value of the function can be obtained by considering the following Lagrangian:

$$L = \frac{1}{1-\alpha}\,\ln\!\left(\frac{\sum_{i=1}^{n} w_i^{\alpha}}{\sum_{i=1}^{n} w_i}\right) - \lambda\left(\sum_{i=1}^{n} w_i - w\right)$$

Since $\sum_{i=1}^{n} w_i = w$ is held constant, differentiation with respect to the weights gives

$$\frac{\partial L}{\partial w_1} = \frac{1}{1-\alpha}\,\frac{\alpha\, w_1^{\alpha-1}}{\sum_{i=1}^{n} w_i^{\alpha}} - \lambda, \qquad \frac{\partial L}{\partial w_2} = \frac{1}{1-\alpha}\,\frac{\alpha\, w_2^{\alpha-1}}{\sum_{i=1}^{n} w_i^{\alpha}} - \lambda,$$

and, continuing like this,

$$\frac{\partial L}{\partial w_n} = \frac{1}{1-\alpha}\,\frac{\alpha\, w_n^{\alpha-1}}{\sum_{i=1}^{n} w_i^{\alpha}} - \lambda.$$

Now $\frac{\partial L}{\partial w_i} = 0$ for every $i$ implies $\frac{\partial L}{\partial w_1} = \frac{\partial L}{\partial w_2} = \cdots = \frac{\partial L}{\partial w_n}$, so that

$$w_1^{\alpha-1} = w_2^{\alpha-1} = w_3^{\alpha-1} = \cdots = w_n^{\alpha-1},$$

which is possible only if $w_1 = w_2 = w_3 = \cdots = w_n$. Also, $\sum_{i=1}^{n} w_i = w \Rightarrow n w_1 = w \Rightarrow w_1 = \frac{w}{n}$.

Thus, the maximum value of the function exists at $w_i = \frac{w}{n}$ and is given by

$$H_1(w) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{n\,(w/n)^{\alpha}}{w}\right) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{w^{\alpha-1}}{n^{\alpha-1}}\right) = \ln\frac{n}{w}.$$
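A quick numerical check of this maximum (using the illustrative h1 function from above, with n = 3 and w = 150 as in Table 1):

```python
import numpy as np

h1 = lambda w, a: np.log(np.sum(np.asarray(w, float)**a) / np.sum(w)) / (1.0 - a)

# Maximum at equal weights w_i = w/n, with value ln(n/w):
n, w_total, alpha = 3, 150.0, 3
print(np.log(n / w_total))           # -3.9120
print(h1([w_total / n] * n, alpha))  # -3.9120 (equal weights attain the maximum)
print(h1([5, 50, 95], alpha))        # -4.3936 (an unequal split gives less)
```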

• Additive property: The measure (5) is additive in nature, since if

$$H_1(w) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{\sum_{i=1}^{n} w_i^{\alpha}}{\sum_{i=1}^{n} w_i}\right) \quad \text{and} \quad H_1(w^{*}) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{\sum_{j=1}^{m} (w_j^{*})^{\alpha}}{\sum_{j=1}^{m} w_j^{*}}\right),$$

then

$$H_1(w \ast w^{*}) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{\sum_{i=1}^{n}\sum_{j=1}^{m} (w_i w_j^{*})^{\alpha}}{\sum_{i=1}^{n}\sum_{j=1}^{m} w_i w_j^{*}}\right) = H_1(w) + H_1(w^{*}).$$
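The additive property can be verified numerically on product weights; a minimal sketch:

```python
import numpy as np

h1 = lambda w, a: np.log(np.sum(np.asarray(w, float)**a) / np.sum(w)) / (1.0 - a)

# Additivity over product weights w_i * w*_j:
w, w_star, alpha = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0]), 3
print(h1(np.outer(w, w_star).ravel(), alpha))  # joint measure: -2.4182
print(h1(w, alpha) + h1(w_star, alpha))        # equals the sum of the parts
```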

• Concave property: For the measure (5),

$$H_1(w) = \frac{1}{1-\alpha}\,\ln\!\left(\frac{\sum_{i=1}^{n} w_i^{\alpha}}{\sum_{i=1}^{n} w_i}\right),$$

so, differentiating with respect to $w_i$ (with $\sum_{i=1}^{n} w_i = w$ held constant),

$$\frac{\partial H_1(w)}{\partial w_i} = \frac{1}{1-\alpha}\,\frac{\alpha\, w_i^{\alpha-1}}{\sum_{k=1}^{n} w_k^{\alpha}}, \qquad \frac{\partial^{2} H_1(w)}{\partial w_i^{2}} = \frac{1}{1-\alpha}\left(\frac{\alpha(\alpha-1)\, w_i^{\alpha-2}}{\sum_{k=1}^{n} w_k^{\alpha}} - \frac{\alpha^{2}\, w_i^{2\alpha-2}}{\left(\sum_{k=1}^{n} w_k^{\alpha}\right)^{2}}\right).$$

Thus, the measure (5) is concave in nature for $\alpha > 1$, as the numerical check sketched below also suggests.
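A numerical second-difference check along a slice of Table 1 (an illustrative sketch; the slice and step size are our choices):

```python
import numpy as np

h1 = lambda w, a: np.log(np.sum(np.asarray(w, float)**a) / np.sum(w)) / (1.0 - a)

# Slice with w2 = 50 fixed and w3 = 150 - w1 - 50, as in Table 1 (alpha = 3):
f = lambda w1: h1([w1, 50.0, 100.0 - w1], 3)
eps = 1e-3
print((f(50.0 + eps) - 2 * f(50.0) + f(50.0 - eps)) / eps**2)  # negative
```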

I. Particular Cases

• When $w_i = p_i$, $0 < p_i < 1$, the measure (5) becomes a Renyi-type entropy measure.
• When $w_i = p_i$, $0 < p_i < 1$, and $\alpha \to 1$, the measure (5) becomes a Shannon-type entropy measure.
• When $\alpha \to 1$, the measure (5) reduces to the average weighted Shannon's entropy measure.
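A short sketch illustrating the particular cases (weights taken as probabilities; h1 as above):

```python
import numpy as np

h1 = lambda w, a: np.log(np.sum(np.asarray(w, float)**a) / np.sum(w)) / (1.0 - a)

p = np.array([0.2, 0.3, 0.5])  # weights taken as probabilities, sum = 1
print(h1(p, 2.0))              # Renyi-type value: -ln(sum p_i^2) = 0.9676
print(h1(p, 1.0 + 1e-6))       # alpha -> 1: tends to Shannon entropy 1.0297
```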



III. Numerical Computation and Graphical Analysis

For computation of various values of the measure $H_1(w)$ given in (5), three weights $w_1, w_2$ and $w_3$ such that $w = w_1 + w_2 + w_3$ are considered. The total weight $w$ is assumed to take the values 50, 100 and 150, and $\alpha = 3$. The computed values for these cases are presented in Table 1. Various graphs are plotted for the obtained values of $H_1(w)$ w.r.t. different weights to study the behaviour of the information measure (5).

Table 1: Average weighted entropy measure

          w = 150                      w = 100                      w = 50
 w1   w2   w3    H1(w)        w1   w2   w3    H1(w)        w1   w2   w3    H1(w)
  5   50   95   -4.3936        5   30   65   -4.0061        5   10   35   -3.3900
 10   50   90   -4.3241       10   30   60   -3.8999       10   10   30   -3.1815
 15   50   85   -4.2536       15   30   55   -3.7923       15   10   25   -2.9957
 20   50   80   -4.1832       20   30   50   -3.6889       20   10   20   -2.9145
 25   50   75   -4.1148       25   30   45   -3.5993       25   10   15   -2.9957
 30   50   70   -4.0508       30   30   40   -3.5366       30   10   10   -3.1815
 35   50   65   -3.9948       35   30   35   -3.5139       35   10    5   -3.3900
 40   50   60   -3.9505       40   30   30   -3.5366
 45   50   55   -3.9219       45   30   25   -3.5993
 50   50   50   -3.9120       50   30   20   -3.6889
 55   50   45   -3.9219       55   30   15   -3.7923
 60   50   40   -3.9505       60   30   10   -3.8999
 65   50   35   -3.9948       65   30    5   -4.0061
 70   50   30   -4.0508
 75   50   25   -4.1148
 80   50   20   -4.1832
 85   50   15   -4.2536
 90   50   10   -4.3241
 95   50    5   -4.3936
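The table entries can be reproduced directly from (5); a minimal sketch (h1 as above):

```python
import numpy as np

h1 = lambda w, a: np.log(np.sum(np.asarray(w, float)**a) / np.sum(w)) / (1.0 - a)

# Reproduce sample rows of Table 1 (alpha = 3):
for w in ([5, 50, 95], [50, 50, 50], [10, 30, 60], [20, 10, 20]):
    print(w, round(float(h1(w, 3)), 4))  # -4.3936, -3.912, -3.8999, -2.9145
```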

Figure 1: Information measure $H_1(w)$ w.r.t. $w_1$ for $w = 50$ and $w_2 = 5$.

Figure 2: Information measure $H_1(w)$ w.r.t. $w_1$ for $w = 50$ and $w_2 = 10$.

Figure 1 and Figure 2 depict the graph between $H_1(w)$ and $w_1$ for total weight $w = 50$, with $w_2$ fixed at 5 and 10 respectively. From the graphs, it can be seen that $H_1(w)$ increases as the weight increases up to $w_i = \frac{w}{n}$, and thereafter it decreases.

Figure 3: Information measure $H_1(w)$ w.r.t. $w_1$ for $w = 100$ and $w_2 = 10$.

Figure 4: Information measure $H_1(w)$ w.r.t. $w_1$ for $w = 100$ and $w_2 = 30$.

Figure 3 and Figure 4 depict the graph between $H_1(w)$ and $w_1$ for total weight $w = 100$, with $w_2$ fixed at 10, 20 and 30. From the graphs, it can be seen that $H_1(w)$ increases as the weight increases up to $w_i = \frac{w}{n}$, and thereafter it decreases.

Figure 5: Information measure $H_1(w)$ w.r.t. $w_1$ for $w = 150$ and $w_2 = 20$.

Figure 6: Information measure $H_1(w)$ w.r.t. $w_1$ for $w = 150$ and $w_2 = 40$.

Figure 5 and Figure 6 depict the graph between $H_1(w)$ and $w_1$ for total weight $w = 150$, with $w_2$ fixed at 10, 20, 30, 40 and 50. From the graphs, it can be seen that $H_1(w)$ increases as the weight increases up to $w_i = \frac{w}{n}$, and thereafter it decreases.

IV. Conclusion

The proposed weighted Renyi's measure varies with the values of the weights and is concave in nature. In case the weights of the distribution are their probabilities, the average weighted Renyi entropy measure reduces to Renyi's entropy. In the weighted sense, it is in fact a generalization of the Shannon [22] entropy and the Burg [3] entropy. The developed weighted information measure is useful for discrete distributions when probabilities are unknown and weights are known.

References

[1] Belis, M. and Guiasu, S. (1968). A quantitative-qualitative measure of information in cybernetic systems, IEEE Transactions on Information Theory, 14:593-594. https://doi.org/10.1109/TIT.1968.1054185

[2] Bhat, V.A. and Pundir, S. (2023). Weighted intervened exponential distribution as a lifetime distribution. Reliability: Theory & Applications. 2(73): 24-38. https://doi.org/10.24412/1932-2321-2023-273-253-266

[3] Burg, J.P. (1972). The relationship between maximum entropy spectra and maximum likelihood spectra. In: Childers, D.G. (ed), Modern Spectral Analysis, 130-131.

[4] Endo, T. and Kudo, M. (2013). Weighted naive Bayes classifiers by Renyi entropy. J. Ruiz-Shulcloper and G. Sanniti di Baja (Eds.), Springer-Verlag Berlin Heidelberg, CIARP, Part I, LNCS 8258, 149-156.

[5] Fisher, R.A. (1934). The effects of methods of ascertainment upon the estimation of frequencies. The Annals of Eugenics, 6: 13-25.

[6] Guiasu, S. (1986). Grouping data by using the weighted entropy. Journal of Statistical Planning and Inference, 15:63-69. https://doi.org/10.1016/0378-3758(86)90085-6

[7] Guiasu, S. and Picard, C.F. (1971). Borne inférieure de la longueur utile de certains codes. Comptes Rendus de l'Académie des Sciences Paris, 273: 248-251.

[8] Hooda, D.S. and Tuteja, R.K. (1981). Two generalized measures of useful information. Information Sciences, 23: 11-24.

[9] Kapur, J.N. (1995). Measures of information and their applications. Wiley Eastern, New York.

[10] Kumar, R., Kumar, S. and Kumar, A. (2010). A new measure of probabilistic entropy and its properties. Applied Mathematical Sciences, 4(28): 1387-1394.

[11] Kumar, R., Kumar, S. and Kumar, A. (2010). Some new information theoretic measures based upon statistical constants. Aryabhatta Journal of Mathematics and Informatics, 2(2): 329-338.

[12] Longo, G. (1972). Quantitative-qualitative measure of information. Springer-Verlag, New York.

[13] Mahdy, M. (2018). Weighted entropy measure: a new measure of information with its properties in reliability theory and stochastic orders. Journal of Statistical Theory and Applications, 17(4): 703-718. DOI: 10.2991/jsta.2018.4.17.11

[14] Mohammadi, U. (2015). Weighted entropy function as an extension of the Kolmogorov-Sinai entropy. U.P.B. Sci. Bull., Series A, 77(4): 117-122.

[15] Parkash, O. and Taneja, H.C. (1986). Characterization of quantitative-qualitative measure of inaccuracy for discrete generalized probability distributions. Communications in Statistics - Theory and Methods, 15(12): 3763-3772.

[16] Patsakis, C., Mermigas, D., Pirounias, S. and Chondrokoukis, G. (2013). The role of weighted entropy in security quantification. International Journal of Information and Electronics Engineering, 3(2):156-159.

[17] Rao, C.R. (1965). On discrete distributions arising out of methods of ascertainment, in Classical and Contagious Discrete Distributions. G.P. Patil, ed., Pergamon Press and Statistical Publishing Society, Calcutta, 320-332.

[18] Rao, C.R. (1985). Weighted distributions arising out of methods of ascertainment, in A Celebration of Statistics, A.C. Atkinson & S.E. Fienberg, eds, Springer-Verlag, New York, Chapter 24:543-569.

[19] Renyi, A. (1961). On measures of entropy and information. Proceedings IV Berkeley Symposium on Mathematical Statistics and Probability, Berkeley, 20-30 June 1961, 1: 547-561.

[20] Sahni, P. and Kumar, R. (2023). Huntsberger type shrinkage entropy estimator for variance of normal distribution under LINEX loss function. Reliability: Theory & Applications, 1(72): 491-499. https://doi.org/10.24412/1932-2321-2023-172-491-499

[21] Savita and Kumar, R. (2019). On a new weighted entropy measure. Journal of Advanced Research in Dynamical and Control Systems, 11(01)-Special Issue: 1264-1271.

[22] Shannon, C.E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27: 379-423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x

[23] Singh, R.P., Kumar, R. and Tuteja, R.K. (2003). Applications of Hölder's inequality in information theory. Information Sciences, 152: 145-154.

[24] Suhov, Y. and Sekeh, S.Y. (2015). Weighted cumulative entropies: An extension of CRE and CE. arXiv:1507.07051 [cs.IT]. https://doi.org/10.48550/arXiv.1507.07051

[25] Taneja, H.C. and Hooda, D.S. (1983). On characterization of generalized measure of useful information. Soochow Journal of Mathematics, 9: 221-230.

[26] Taneja, H.C. and Tuteja, R.K. (1986). Characterization of quantitative-qualitative measure of inaccuracy, Kybernetika, 22: 393-402.
