
QUANTILE RESIDUAL ENTROPY FOR SOME LIFE TIME DISTRIBUTIONS

Javid Gani Dar1, Mohammad Younus Bhat2, Shahid Tamboli3, Shaikh Sarfaraj4, Aafaq A. Rather5*, Maryam Mohiuddin6, Showkat Ahmad Dar7

1Department of Applied Sciences, Symbiosis Institute of Technology, Symbiosis International (Deemed University) (SIU), Lavale, Pune, Maharashtra, India
2Department of Mathematical Sciences, Islamic University of Science and Technology, Kashmir, India
3,4Mechanical Engineering Department, Symbiosis Institute of Technology, Symbiosis International (Deemed University) (SIU), Lavale, Pune, Maharashtra, India
5,*Symbiosis Statistical Institute, Symbiosis International (Deemed University), Pune-411004, India
6Department of Mathematics, National Institute of Technology, Calicut, Kerala, India
7Department of Statistics, University of Kashmir, J&K, India
1javid.dar@sitpune.edu.in, 2younis.bat@islamicuniversity.edu.in, 3shahidt@sitpune.edu.in, 4sarfarajs@sitpune.edu.in, 5*aafaq7741@gmail.com, 6masmariam7@gmail.com, 7darshowkat2429@gmail.com

Abstract

This study explores the concept of residual entropy as an alternative approach to traditional entropy measures. The field of information theory, built upon Shannon's entropy, has been instrumental in understanding the dynamics of systems. However, existing literature has recognized the limitations of applying traditional entropy measures to systems that have already been in existence for a certain duration. This study delves into the concept of residual entropy, acknowledging the need for a more suitable approach for such systems. Specifically, we investigate the characteristics of residual entropy using a quantile-based framework. By deriving the quantile residual entropy function for various lifetime models, we gain insights into the reordering and ageing phenomena captured by the quantile version of the residual entropy equation. Our findings contribute to an enhanced understanding of residual entropy and provide a novel perspective on analyzing and interpreting the behavior of established systems.

Keywords: Shannon entropy, Residual entropy, Quantile function, Quantile residual entropy.

1. Introduction

Physicists first developed the idea of entropy in the context of equilibrium thermodynamics, and the concept was later broadened with the development of statistical mechanics. Claude Shannon devised a mathematical notion called entropy to describe the stochastic nature of missing data in phone-line transmissions [19]. The differential entropy of a continuous random variable X having density function f_X(x) is given by

H(X) = -\int_0^\infty f_X(x) \log f_X(x)\,dx \qquad (1)

In order to generalize the Shannon entropy measure, Mathai and Haubold [12] introduced a new generalized entropy measure associated with a random variable X having pdf f(x), expressed as

M_\alpha(X) = \frac{1}{\alpha-1}\left(\int_0^\infty [f(x)]^{2-\alpha}\,dx - 1\right), \quad 0 < \alpha < 2,\ \alpha \neq 1 \qquad (2)

As \alpha \to 1, (2) tends to the Shannon entropy measure defined in (1).
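To see why, note that at \alpha = 1 both the bracketed term and the factor \frac{1}{\alpha-1} in (2) degenerate, since \int_0^\infty f(x)\,dx = 1; applying L'Hopital's rule in \alpha gives

\lim_{\alpha \to 1} M_\alpha(X) = \lim_{\alpha \to 1} \frac{\int_0^\infty [f(x)]^{2-\alpha}\,dx - 1}{\alpha - 1} = \lim_{\alpha \to 1}\left(-\int_0^\infty [f(x)]^{2-\alpha}\log f(x)\,dx\right) = -\int_0^\infty f(x)\log f(x)\,dx = H(X).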

One feature of (2), as described in [11] and [20], is that combining the maximum entropy principle with normalization and power constraints leads to the widely recognized pathway model [12]. It should be highlighted that the pathway model incorporates several well-known statistical distributions as special cases.

In reliability and life testing, the data are frequently truncated, so equation (1) is not an appropriate measure in such cases. Thus, when information about the current age of a system is available and is to be used in determining its degree of unpredictability, Shannon's entropy is inapplicable. Ebrahimi [7] proposes a more realistic measure that takes ageing into consideration, defined as

H(X; t) = -\int_t^\infty \frac{f(x)}{\bar F(t)} \log \frac{f(x)}{\bar F(t)}\,dx \qquad (3)

where \bar F(t) is the survival function. For t = 0, (3) reduces to (1).

The residual Mathai-Haubold entropy (Dar and Al-Zahrani [5]) of a non-negative random variable X, representing the lifespan of a unit, at time t is given by

M_\alpha(X; t) = \frac{1}{\alpha-1}\left[\int_t^\infty \left(\frac{f(x)}{\bar F(t)}\right)^{2-\alpha}dx - 1\right], \quad 0 < \alpha < 2,\ \alpha \neq 1 \qquad (4)

Theoretical studies and applications that utilize the aforementioned measures depend upon the distribution function; however, they may not be appropriate in cases where the distribution is not analytically tractable. An alternative technique is to employ quantile functions, defined by

Q(u) = F^{-1}(u) = \inf\{x : F(x) \geq u\}, \quad 0 < u < 1 \qquad (5)

Gilchrist [9] has presented an alternative to the distribution function in statistical data analysis and modeling known as the quantile function (QF). The QF is often favored since it provides a straightforward analysis with less influence from extreme observations. For detailed studies of the properties and usefulness of the QF in identifying models, see Nair and Sankaran [14], Sunoj et al. [17], and related sources. Researchers have become increasingly interested in using quantile-based entropy measures as an alternative approach to measuring the uncertainty of a random variable. These measures possess unique properties compared to the distribution-function approach. Sunoj and Sankaran [17] have recently explored the quantile version of Shannon entropy and its residual form as

H = \int_0^1 \log q(p)\,dp \qquad (6)

and

H(u) = H(X; Q(u)) = \log(1-u) + \frac{1}{1-u}\int_u^1 \log q(p)\,dp \qquad (7)

respectively, where q(u) = Q'(u) is the quantile density function and f Q(u) = f(Q(u)) is the density quantile function. Differentiating F(Q(u)) = u yields

q(u) f(Q(u)) = 1 \qquad (8)
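The identity (8) is easy to verify symbolically for a specific model. The following is a minimal sketch using SymPy, with the exponential distribution as the example (our choice of model, made only for illustration):

```python
import sympy as sp

u, lam = sp.symbols('u lam', positive=True)

# Exponential model: F(x) = 1 - exp(-lam*x), so Q(u) = F^{-1}(u).
Q = -sp.log(1 - u) / lam        # quantile function
q = sp.diff(Q, u)               # quantile density q(u) = Q'(u)
fQ = lam * sp.exp(-lam * Q)     # density quantile function fQ(u) = f(Q(u))

print(sp.simplify(q * fQ))      # prints 1, i.e. q(u) f(Q(u)) = 1 as in (8)
```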

Further, Nanda et al. [13] introduced a quantile-based Renyi entropy function and studied the properties and applications of the proposed measure.

In this paper, we introduce a novel quantile-based version of the generalized residual entropy function and examine its key properties. We first demonstrate that the proposed measure uniquely determines the quantile distribution function, and we derive the entropy function for specific quantile functions that lack an explicit form for their distribution functions. Additionally, we define ordering and ageing properties based on the quantile residual entropy function and analyze their characteristics. Lastly, we present a characterization of certain lifetime models that are valuable in analyzing lifetime data.

This paper is organized as follows. Section 2 outlines the development of our new quantile-based residual entropy measure and explores its properties. Section 3 delves into the ageing and ordering properties of the quantile-based residual entropy. Finally, Section 4 presents a set of characterization theorems based on the quantile residual entropy measure.

2. Quantile based M-H Residual Entropy

From equation (5), we can see that for a continuous distribution function F, the composite function F(Q(u)) equals u. We define the density quantile function as f Q(u) = f(Q(u)) and the quantile density function as q(u) = Q'(u), where the prime denotes differentiation. Equations (4) and (8) together yield the expression for the quantile version of the M-H residual entropy:

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\int_u^1 \left(\frac{f(Q(p))}{1-u}\right)^{2-\alpha} q(p)\,dp - 1\right], \quad 0 < \alpha < 2,\ \alpha \neq 1 \qquad (9)

which, on applying (8), simplifies to

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\frac{\int_u^1 (q(p))^{\alpha-1}\,dp}{(1-u)^{2-\alpha}} - 1\right] \qquad (10)

The expression (10) is known as the quantile M-H residual entropy; it quantifies the average uncertainty about an outcome of X given that it has survived beyond the 100u% point of the distribution.
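When q(u) is available but the integral in (10) has no convenient closed form, the measure can be evaluated by numerical quadrature. Below is a minimal sketch (the function name and interface are ours, introduced only for illustration):

```python
from scipy.integrate import quad

def mh_quantile_residual_entropy(q, u, alpha):
    """Evaluate Eq. (10) for a quantile density q on (0, 1).

    q     -- callable quantile density, q(p) = Q'(p)
    u     -- point in [0, 1) at which the residual entropy is taken
    alpha -- order of the measure, 0 < alpha < 2, alpha != 1
    """
    integral, _ = quad(lambda p: q(p) ** (alpha - 1.0), u, 1.0)
    return (integral / (1.0 - u) ** (2.0 - alpha) - 1.0) / (alpha - 1.0)

# Uniform model q(p) = b - a: the numerical value matches the closed form (15) derived below.
a, b, u, alpha = 0.0, 3.0, 0.5, 0.5
print(mh_quantile_residual_entropy(lambda p: b - a, u, alpha))
print((((b - a) * (1.0 - u)) ** (alpha - 1.0) - 1.0) / (alpha - 1.0))
```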

The following theorem shows that the quantile version of the residual entropy is unique.

Theorem 2.1: The quantile version of the residual entropy determines the underlying distribution uniquely.

Proof: Differentiating (10) with respect to u, we get

(q(u))^{\alpha-1} = (1-u)^{1-\alpha}\left\{(2-\alpha)\left[(\alpha-1)M_\alpha(X;Q(u)) + 1\right] - (1-u)(\alpha-1)M'_\alpha(X;Q(u))\right\} \qquad (11)

This equation establishes a direct connection between the quantile density function q(u) and M_\alpha(X;Q(u)), which shows that the quantile version of the M-H residual entropy determines the underlying distribution uniquely.
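As a quick sanity check of (11), consider the exponential model (Section 2.6), for which M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\frac{\lambda^{1-\alpha}}{2-\alpha} - 1\right] is free of u, so that M'_\alpha(X; Q(u)) = 0 and (11) reduces to

(q(u))^{\alpha-1} = (1-u)^{1-\alpha}(2-\alpha)\cdot\frac{\lambda^{1-\alpha}}{2-\alpha} = (\lambda(1-u))^{1-\alpha},

recovering q(u) = \frac{1}{\lambda(1-u)}, the exponential quantile density.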

The next theorem gives bounds for M_\alpha(X; Q(u)); the proof follows by using (10) and (11).

Theorem 2.2: If M_\alpha(X; Q(u)) is increasing in u, then

q(u) \leq (\geq)\ \frac{1}{1-u}\left[(2-\alpha)\left\{(\alpha-1)M_\alpha(X; Q(u)) + 1\right\}\right]^{\frac{1}{\alpha-1}} \quad \text{for } 0 < \alpha < 1\ (1 < \alpha < 2) \qquad (12)

If M_\alpha(X; Q(u)) is decreasing in u, then

q(u) \geq (\leq)\ \frac{1}{1-u}\left[(2-\alpha)\left\{(\alpha-1)M_\alpha(X; Q(u)) + 1\right\}\right]^{\frac{1}{\alpha-1}} \quad \text{for } 0 < \alpha < 1\ (1 < \alpha < 2) \qquad (13)

In the following subsections, we derive the quantile version of the residual entropy for several lifetime models.

2.1 Govindarajulu's Distribution

Firstly, we will consider Govindarajulu's distribution, for which the quantile function and quantile density function are given by:

Q(u) = a\{(b+1)u^b - bu^{b+1}\} and q(u) = ab(b+1)(1-u)u^{b-1}, 0 < u < 1; a, b > 0.

The quantile-based M-H residual entropy for the Govindarajulu distribution is

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\frac{(ab(b+1))^{\alpha-1}}{(1-u)^{2-\alpha}}\left\{\beta\big((b-1)(\alpha-1)+1,\ \alpha\big) - \beta_u\big((b-1)(\alpha-1)+1,\ \alpha\big)\right\} - 1\right] \qquad (14)

where \beta_u(a, b) is the incomplete beta function.
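The incomplete-beta form of (14) can be checked numerically against direct quadrature of (10); a sketch with arbitrary parameter values of our own choosing:

```python
from scipy.integrate import quad
from scipy.special import beta as B, betainc

a, b, u, alpha = 1.0, 2.0, 0.3, 1.5
q = lambda p: a * b * (b + 1.0) * (1.0 - p) * p ** (b - 1.0)   # Govindarajulu q(u)

# Direct evaluation of Eq. (10)
num, _ = quad(lambda p: q(p) ** (alpha - 1.0), u, 1.0)
lhs = (num / (1.0 - u) ** (2.0 - alpha) - 1.0) / (alpha - 1.0)

# Closed form (14); betainc is the regularized incomplete beta, so we rescale by B(m, n)
m, n = (b - 1.0) * (alpha - 1.0) + 1.0, alpha
tail = B(m, n) * (1.0 - betainc(m, n, u))                       # beta(m,n) - beta_u(m,n)
rhs = ((a * b * (b + 1.0)) ** (alpha - 1.0) * tail
       / (1.0 - u) ** (2.0 - alpha) - 1.0) / (alpha - 1.0)
print(lhs, rhs)   # the two values agree
```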

2.2 Uniform Distribution

Q(u) = a + (b-a)u and q(u) = (b-a), 0 < u < 1; a < b.

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left\{(b-a)^{\alpha-1}(1-u)^{\alpha-1} - 1\right\} \qquad (15)

2.3 Pareto-I Distribution

Q(u) = b(1-u)^{-1/a} and q(u) = \frac{b}{a}(1-u)^{-\left(\frac{1}{a}+1\right)}, 0 < u < 1; a, b > 0.

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\left(\frac{b}{a}\right)^{\alpha-1}\frac{a(1-u)^{\frac{1-\alpha}{a}}}{a(2-\alpha)+(1-\alpha)} - 1\right] \qquad (16)

2.4 Generalized Pareto Distribution

Q(u) = \frac{b}{a}\left[(1-u)^{-\frac{a}{a+1}} - 1\right] and q(u) = \frac{b}{a+1}(1-u)^{-\left(\frac{a}{a+1}+1\right)}, 0 < u < 1; a, b > 0.

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\left(\frac{b}{a+1}\right)^{\alpha-1}\frac{(a+1)(1-u)^{\frac{a(1-\alpha)}{a+1}}}{(1+a)(2-\alpha)+a(1-\alpha)} - 1\right] \qquad (17)

2.5 Re-Scaled Beta Distribution

Q(u) = R\left[1 - (1-u)^{1/c}\right] and q(u) = \frac{R}{c}(1-u)^{\frac{1}{c}-1}, 0 < u < 1; R, c > 0.

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\left(\frac{R}{c}\right)^{\alpha-1}\frac{c(1-u)^{\frac{\alpha-1}{c}}}{c(2-\alpha)+(\alpha-1)} - 1\right] \qquad (18)

2.6 Exponential Distribution

Q(u) = \frac{-\log(1-u)}{\lambda} and q(u) = \frac{1}{\lambda(1-u)}, 0 < u < 1; \lambda > 0.

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\frac{\lambda^{1-\alpha}}{2-\alpha} - 1\right] \qquad (19)

which is free of u.

2.7 Power Distribution

Q(u) = au^{1/c} and q(u) = \frac{a}{c}u^{\frac{1}{c}-1}, 0 < u < 1; a, c > 0.

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left[\left(\frac{a}{c}\right)^{\alpha-1}\frac{c\left(1 - u^{\frac{c(2-\alpha)+(\alpha-1)}{c}}\right)}{(1-u)^{2-\alpha}\left(c(2-\alpha)+(\alpha-1)\right)} - 1\right] \qquad (20)

The following tables and figures give computed values and comparisons of the quantile M-H residual entropy for different parameter values for some lifetime distributions.

[Figure 1 comprises panels plotting the quantile residual entropy of the exponential distribution against \alpha (from 0.5 to 1.5) for \lambda = 0.1 and \lambda = 0.9.]

Fig. 1: Quantile residual entropy plot of the exponential distribution

Table 1: Quantile M-H residual entropy values for the exponential distribution

\alpha \ \lambda     0.1         0.6         0.9         1.5         3.5         9.4
0.1           1.03749     0.74184     0.57922     0.26877    -0.69467    -3.28247
0.3           1.26090     0.84087     0.64798     0.31244    -0.59119    -2.60456
0.5           1.57836     0.96720     0.73509     0.36701    -0.49444    -2.08792
0.7           2.04824     1.13355     0.84901     0.43757    -0.40051    -1.68863
0.9           2.77883     1.36182     1.00437     0.53291    -0.30420    -1.37418
1.1           3.98806     1.69344     1.22880     0.66961    -0.19719    -1.11935
1.3           6.16792     2.21721     1.58149     0.88318    -0.06324    -0.90201
1.5          10.64911     3.16398     2.21637     1.26599     0.13809    -0.69534
1.7          22.43749     5.38030     3.69781     2.15666     0.55265    -0.43639
1.9          87.14758    16.48519    11.10518     6.60281     2.48718     0.36780

[Figure 2 comprises panels plotting the quantile residual entropy of the Pareto-I distribution against \alpha (from 0.5 to 1.5) for (a=0.5, b=0.9, u=0.9) and (a=0.5, b=5.3, u=0.9).]

Fig. 2: Quantile residual entropy plot for the Pareto-I distribution

Table 2: Quantile residual entropy values for the Pareto-I distribution

\alpha    a=0.5,b=0.3,u=0.5   a=1.4,b=0.3,u=0.5   a=2.5,b=0.3,u=0.5   a=0.5,b=0.9,u=0.9   a=0.5,b=2.3,u=0.9   a=5.3,b=0.3,u=0.9
0.1            1.759               4.442               7.484               0.655               0.281               0.133
0.3            1.912               3.821               5.670               0.945               0.490               0.273
0.5            2.066               3.157               4.075               1.431               0.895               0.590
0.7            2.366               2.695               3.002               2.246               1.695               1.320
0.9            3.029               2.445               2.322               3.813               3.472               3.194
1.1            4.955               2.446               1.929               8.073               8.867               9.639
1.3           30.698               2.973               1.805              63.668              84.366             108.375
1.5           -5.453               7.239               2.217              -4.245             -22.773             -34.569
1.7           -2.207              -3.718              21.216              -7.214             -13.912             -24.956
1.99          -1.275              -0.988              -1.045              -5.188             -12.070             -25.585

Table 3: Quantile residual entropy values for the uniform distribution

\alpha    a=0.9,b=2.5,u=0.5   a=1.4,b=2.5,u=0.5   a=2.3,b=2.5,u=0.5   a=0.5,b=0.6,u=0.9   a=0.5,b=0.9,u=0.9   a=0.5,b=1.5,u=0.9
0.1           -0.247              -0.792              -7.715             -68.995             -19.022              -7.715
0.3           -0.242              -0.742              -5.731             -34.456             -12.169              -5.731
0.5           -0.236              -0.697              -4.325             -18.000              -8.000              -4.325
0.7           -0.231              -0.655              -3.318              -9.937              -5.422              -3.318
0.9           -0.226              -0.616              -2.589              -5.849              -3.797              -2.589
1.1           -0.221              -0.580              -2.057              -3.690              -2.752              -2.057
1.3           -0.216              -0.547              -1.663              -2.496              -2.064              -1.663
1.5           -0.211              -0.517              -1.368              -1.800              -1.600              -1.368
1.7           -0.207              -0.489              -1.144              -1.372              -1.278              -1.144
1.9           -0.202              -0.462              -0.971              -1.094              -1.050              -0.971

The quantile residual entropy plots for the exponential, Pareto-I, and uniform distributions are shown in Figs. 1, 2, and 3. In the case of the exponential distribution, the entropy plot increases with increasing values of the parameter \alpha. For various parameter combinations, the entropy plot under the Pareto-I distribution exhibits both increasing and decreasing behavior. For various parameter values, the entropy plot under the uniform distribution also exhibits an increasing trend. Entropy values for the exponential, Pareto-I, and uniform distributions are shown in Tables 1, 2, and 3, which depict the same behavior as described for the graphical displays.
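As a cross-check, the entries of Tables 1 and 3 can be reproduced from the closed forms (19) and (15); a short sketch (reading the table rows as values of \alpha and the columns as the distribution parameters):

```python
# Exponential, Eq. (19): independent of u.
for alpha, lam, expected in [(0.1, 0.1, 1.03749), (0.9, 0.1, 2.77883), (0.1, 9.4, -3.28247)]:
    value = (lam ** (1.0 - alpha) / (2.0 - alpha) - 1.0) / (alpha - 1.0)
    print(round(value, 5), expected)   # the pairs agree

# Uniform, Eq. (15).
for alpha, a, b, u, expected in [(0.5, 0.5, 1.5, 0.9, -4.325), (0.1, 0.9, 2.5, 0.5, -0.247)]:
    value = (((b - a) * (1.0 - u)) ** (alpha - 1.0) - 1.0) / (alpha - 1.0)
    print(round(value, 3), expected)   # the pairs agree
```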

3. Ageing and Ordering Properties of Quantile based M-H Residual Entropy

Nonparametric classes of life distributions can be defined using the quantile M-H residual entropy.

Definition 3.1: X is said to have increasing (decreasing) M-H residual quantile entropy, IMHRQE (DMHRQE), if M_\alpha(X; Q(u)) is increasing (decreasing) in u \geq 0.

Definition 3.2: X is said to be smaller than Y in the M-H quantile entropy order, denoted X \leq_{MHQE} Y, if

M_\alpha(Q_X(u)) \leq M_\alpha(Q_Y(u)) \quad \text{for all } u \in [0, 1] \qquad (21)

Definition 3.3: X is said to be smaller than Y in the M-H quantile failure rate order, denoted X \leq_{MHQFR} Y, if H_X(u) \geq H_Y(u) for all u \geq 0.

The following lemma is useful in proving the monotonicity results for M_\alpha(X; Q(u)).

Lemma 3.1: Let f(u, x): \mathbb{R}_+^2 \to \mathbb{R}_+ and g: \mathbb{R}_+ \to \mathbb{R}_+ be two functions such that the integrals below exist. If \int_u^\infty f(u, x)\,dx is increasing in u and g(u) is increasing (decreasing) in u, then \int_u^\infty f(u, x) g(x)\,dx is increasing (decreasing) in u.

Theorem 3.1: Let X be a non-negative continuous random variable with quantile function Q_X(.) and quantile density function q_X(.). Define Y = \phi(X), where \phi(.) is a non-negative, increasing, convex (concave) function. (i) For 1 < \alpha < 2, whenever M_\alpha(X; Q_X(u)) is increasing (decreasing) in u, M_\alpha(Y; Q_Y(u)) is also increasing (decreasing) in u. (ii) For 0 < \alpha < 1, whenever M_\alpha(X; Q_X(u)) is increasing (decreasing) in u, M_\alpha(Y; Q_Y(u)) is also increasing (decreasing) in u.

Proof: (i) By the stated condition, M_\alpha(X; Q_X(u)) = \frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_X(p))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] is increasing (decreasing) in u, where q_X(.) denotes the quantile density function of X. Since Q_Y(u) = \phi(Q_X(u)), the quantile density function of Y is q_Y(u) = q_X(u)\phi'(Q_X(u)), so that

M_\alpha(Y; Q_Y(u)) = \frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_Y(p))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \qquad (22)

= \frac{1}{\alpha-1}\left[\frac{\int_u^1 [q_X(p)\phi'(Q_X(p))]^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \qquad (23)

= \frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_X(p))^{\alpha-1}(\phi'(Q_X(p)))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \qquad (24)

Since 1 < \alpha < 2 and \phi is a non-negative, increasing (decreasing), convex (concave) function, (\phi'(Q_X(p)))^{\alpha-1} is increasing (decreasing) and non-negative; therefore, by Lemma 3.1, (24) is increasing (decreasing) in u, which proves (i) of the theorem.

Similarly, when 0 < \alpha < 1, (\phi'(Q_X(p)))^{\alpha-1} is increasing (decreasing) in p when \phi is increasing and concave (convex). As a result, M_\alpha(Y; Q_Y(u)) is increasing (decreasing) in u, demonstrating the second part of the theorem.
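Before turning to an analytical application, here is a small numerical illustration of part (i), under assumptions of our own choosing (X standard exponential, \phi(x) = x^2, so that Y is Weibull with shape 1/2):

```python
import numpy as np
from scipy.integrate import quad

def mh_qre(q, u, alpha):
    # Quantile M-H residual entropy, Eq. (10)
    integral, _ = quad(lambda p: q(p) ** (alpha - 1.0), u, 1.0)
    return (integral / (1.0 - u) ** (2.0 - alpha) - 1.0) / (alpha - 1.0)

# X ~ Exp(1) has Q_X(u) = -log(1-u) and M_alpha(X; Q(u)) free of u (Eq. 19).
# Y = phi(X) = X**2 with phi increasing and convex, so Q_Y(u) = (-log(1-u))**2
# and q_Y(u) = Q_Y'(u) = 2*(-log(1-u))/(1-u).
q_Y = lambda p: 2.0 * (-np.log1p(-p)) / (1.0 - p)

alpha = 1.5  # a value in (1, 2), the range covered by part (i)
print([round(mh_qre(q_Y, u, alpha), 3) for u in (0.1, 0.3, 0.5, 0.7, 0.9)])
# The sequence increases with u, consistent with Theorem 3.1(i).
```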

The preceding theorem can be applied immediately as follows. Assume Y = X^{1/a}, a > 0, where X has an exponential distribution with failure rate \lambda. Then Y follows the Weibull distribution with quantile function Q(u) = \lambda^{-1/a}(-\log(1-u))^{1/a}. The non-negative increasing function \phi(x) = x^{1/a} is convex (concave) for a \leq 1 (a \geq 1). By Theorem 3.1, the Weibull distribution therefore has increasing (decreasing) M-H quantile residual entropy accordingly. We now present a lemma that establishes the closure of the MHQE order under increasing convex transformations.

Lemma 3.2: Let f(u, x): [0,1] \times \mathbb{R}_+ \to \mathbb{R}_+ be such that \int_u^1 f(u, x)\,dx \geq 0 for all u \in [0,1], and let g(x) be any non-negative function of x. Then \int_u^1 f(u, x) g(x)\,dx \geq 0 for all u \in [0,1].

Theorem 3.2: Let X and Y be two random variables such that X \leq_{MHQE} Y. Then \phi(X) \leq_{MHQE} \phi(Y) for any non-negative increasing convex function \phi.

Proof: To show that \phi(X) \leq_{MHQE} \phi(Y), it is enough to show that

\frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_X(p))^{\alpha-1}(\phi'(Q_X(p)))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \leq \frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_Y(p))^{\alpha-1}(\phi'(Q_Y(p)))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \quad \forall\, u \in [0,1] \qquad (25)

Two cases arise:

(i) Consider the case 1 < \alpha < 2. Since X \leq_{MHQE} Y, we have, for all u \in [0,1],

\frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_X(p))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \leq \frac{1}{\alpha-1}\left[\frac{\int_u^1 (q_Y(p))^{\alpha-1}dp}{(1-u)^{2-\alpha}} - 1\right] \qquad (26)

which is equivalent to

\int_u^1 (q_X(p))^{\alpha-1}dp \leq \int_u^1 (q_Y(p))^{\alpha-1}dp \qquad (27)

From (27) we deduce that q_X(u) \leq q_Y(u) for all u \in [0,1], and consequently Q_X(u) \leq Q_Y(u). Meanwhile, \phi'(Q_X(u)) \leq \phi'(Q_Y(u)), since \phi'(.) is increasing because \phi(.) is convex. Thus inequality (25) follows from (27) and Lemma 3.2.

(ii) For the case 0 < \alpha < 1, the proof develops along the same lines as case (i).

Theorem 3.3: If X \leq_{MHQFR} Y, then X \leq_{MHQE} Y.

Proof: It is simple and hence omitted.

4. Characterization Theorems

The hazard quantile function, the quantile counterpart of the well-known hazard function, is a significant quantile measure that is valuable in reliability analysis. It is defined as

H(u) = h(Q(u)) = \frac{f Q(u)}{1-u} = \frac{1}{(1-u)q(u)} \qquad (28)
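For instance, for the exponential model q(u) = \frac{1}{\lambda(1-u)}, and (28) gives H(u) = \lambda, a constant hazard, consistent with the memoryless property.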

Next, we state characterization results for some well-known lifetime distributions based on the quantile residual entropy.

Theorem 4.1: Let X be a non-negative random variable with M-H quantile residual entropy of the form

M_\alpha(X; Q(u)) = \frac{1}{\alpha-1}\left\{C\,(H(u))^{1-\alpha} - 1\right\}, \quad 0 < \alpha < 2,\ \alpha \neq 1 \qquad (29)

if and only if:

(i) C = \frac{1}{2-\alpha}: X has an exponential distribution.

(ii) C < \frac{1}{2-\alpha}: X has a Pareto distribution, with quantile density function q(u) = \frac{b}{a}(1-u)^{-\left(\frac{1}{a}+1\right)}, 0 < u < 1; a, b > 0.

(iii) C > \frac{1}{2-\alpha}: X has a finite range distribution, with quantile density function q(u) = \frac{b}{a}(1-u)^{\frac{1}{a}-1}, 0 < u < 1; b > 0, a > 1.

(iv) C = 1: X has a uniform distribution.

Proof: The necessary part follows from Section 2.

For the converse part, let (29) hold. Then, using (9), we have

\int_u^1 q^{\alpha-1}(p)\,dp = C\,(H(u))^{1-\alpha}(1-u)^{2-\alpha} \qquad (30)

Now, using H(u) = \frac{1}{(1-u)q(u)} in (30) and differentiating both sides with respect to u, we obtain

\frac{q'(u)}{q(u)} = \left(\frac{C-1}{C\alpha - C}\right)\frac{1}{1-u} \qquad (31)

This gives

q(u) = A(1-u)^{-\frac{C-1}{C(\alpha-1)}} \qquad (32)

where A is a constant. As a result, if C = \frac{1}{2-\alpha}, C < \frac{1}{2-\alpha}, C > \frac{1}{2-\alpha}, or C = 1, then X has the exponential, Pareto, finite range, or uniform distribution, respectively.
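A quick numerical check of the exponential case of Theorem 4.1 (the parameter values below are arbitrary choices of ours):

```python
from scipy.integrate import quad

lam, u, alpha = 0.7, 0.4, 1.3
q = lambda p: 1.0 / (lam * (1.0 - p))                    # exponential quantile density

integral, _ = quad(lambda p: q(p) ** (alpha - 1.0), u, 1.0)
M = (integral / (1.0 - u) ** (2.0 - alpha) - 1.0) / (alpha - 1.0)   # Eq. (10)

H = 1.0 / ((1.0 - u) * q(u))                             # hazard quantile, Eq. (28); equals lam
C = 1.0 / (2.0 - alpha)
rhs = (C * H ** (1.0 - alpha) - 1.0) / (alpha - 1.0)     # Eq. (29) with C = 1/(2 - alpha)
print(M, rhs)                                            # the two printed values coincide
```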

5. Conclusion

This study has shed light on the concept of residual entropy and its relevance for systems that have already been in existence for a specific duration. While Shannon's entropy serves as the foundation of information theory, the notion of residual entropy has emerged due to the inadequacy of the former for such systems. By adopting a quantile-based approach, we have explored the characteristics of residual entropy in detail.

Through the derivation of the quantile residual entropy function for various lifetime models, we have provided a novel perspective on analyzing and understanding the dynamics of established systems. Our investigation has further allowed us to delve into the reordering and ageing aspects inherent in the quantile version of the residual entropy equation.

By extending the application of entropy measures to incorporate quantiles, we have bridged the gap in assessing the behavior of systems with a history of existence. This has opened up new avenues for research in information theory and its practical implications.

Overall, our study emphasizes the importance of considering residual entropy based on quantiles, offering a more comprehensive understanding of system dynamics and enabling more accurate analyses in various domains. Further research can build upon these findings to explore additional applications and refine the quantile-based approach to residual entropy.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

References

[1] Arnold, B. C., Balakrishnan, N. and Nagaraja, H. N. (1992). A First Course in Order Statistics. John Wiley and Sons, New York.

[2] Baratpour, S., Ahmadi, J. and Arghami, N. R. (2007). Some characterizations based on entropy of order statistics and record values. Communications in Statistics - Theory and Methods, (36), 47-57.

[3] Behboodian, J. and Tahmasebi, S. (2008). Some properties of entropy for the exponentiated Pareto distribution (EPD) based on order statistics. Journal of Mathematical Extension, (3), 43-53.

[4] Di Crescenzo, A. and Longobardi, M. (2004). Entropy-based measure of uncertainty in past lifetime distributions. Journal of Applied Probability, (39), 434-440.

[5] Dar, J. G. and Al-Zahrani, B. (2013). On some characterization results of life time distributions using Mathai-Haubold residual entropy. IOSR Journal of Mathematics, (5), 56-60.

[6] David, H. A. (1981). Order Statistics. Second Edition, John Wiley and Sons, New York.

[7] Ebrahimi, N. (1996). How to measure uncertainty in the residual lifetime distribution. Sankhya Ser. A, (58), 48-56.

[8] Ebrahimi, N. et al. (2004). Information properties of order statistics and spacings. IEEE Trans. Information Theory, (50), 177-183.

[9] Gilchrist, W. (2000). Statistical Modelling with Quantile Functions. Chapman and Hall/CRC, Boca Raton, FL.

[10] Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. Ann. Math. Stat., (22), 79-86.

[11] Mathai, A. (2005). A pathway to matrix-variate gamma and normal densities. Linear Algebra and its Applications, (396), 317-328.

[12] Mathai, A. and Haubold, H. (2007). On generalized entropy measures and pathways. Physica A: Statistical Mechanics and its Applications, 385(2), 493-500.

[13] Nanda, A. K., Sankaran, P. G. and Sunoj, S. M. (2014). Renyi's residual entropy: A quantile approach. Statistics and Probability Letters, (85), 114-121.

[14] Nair, N. U. and Sankaran, P. G. (2009). Quantile based reliability analysis. Communications in Statistics - Theory and Methods, (38), 222-232.

[15] Park, S. (1995). The entropy of consecutive order statistics. IEEE Trans. Inform. Theory, (41), 2003-2007.

[16] Paul, J. and Yageen Thomas, P. (2019). On some properties of Mathai-Haubold entropy of record values. Journal of the Indian Society for Probability and Statistics.

[17] Sunoj, S. M. and Sankaran, P. G. (2012). Quantile based entropy function. Statistics and Probability Letters, (82), 1049-1053.

[18] Samuel, P. and Thomas, P. Y. (2000). An improved form of a recurrence relation on the product moment of order statistics. Communications in Statistics - Theory and Methods, (29), 1559-1564.

[19] Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, (27), 379-423 and 623-656.

[20] Sebastian, N. (2015). Generalized pathway entropy and its applications in diffusion entropy analysis and fractional calculus. Communications in Applied and Industrial Mathematics, 6(2), 1-20.

[21] Wong, K. M. and Chen, S. (1990). The entropy of ordered sequences and order statistics. IEEE Trans. Information Theory, (36), 276-284.
