
DOI: 10.17516/1997-1397-2021-14-3-301-312 UDC 519.2

Limits of Risks Ratios of Shrinkage Estimators under the Balanced Loss Function

Mekki Terbeche*

University of Sciences and Technology, Mohamed Boudiaf, Oran Laboratory of Analysis and Application of Radiation (LAAR), USTO-MB

Oran, Algeria

Abdelkader Benkhaled†

Mustapha Stambouli University of Mascara, Laboratory of Geomatics, Ecology and Environment (LGEO2E)

Mascara, Algeria

Abdenour Hamdaoui‡

University of Sciences and Technology, Mohamed Boudiaf, Oran Laboratory of Statistics and Random Modelisations (LSMA) of Tlemcen University

Oran, Algeria

Received 10.12.2020, received in revised form 04.02.2021, accepted 02.03.2021
Abstract. In this paper we study the estimation of a multivariate normal mean under the balanced loss function. We present a class of shrinkage estimators which generalizes the James-Stein estimator, and we are interested in establishing the asymptotic behaviour of the risk ratios of these estimators to the maximum likelihood estimator (MLE). Thus, in the case where the dimension of the parameter space and the sample size are large, we determine sufficient conditions for the estimators cited above to be minimax.

Keywords: balanced loss function, James-Stein estimator, multivariate Gaussian random variable, non-central chi-square distribution, shrinkage estimators.

Citation: M. Terbeche, A. Benkhaled, A.Hamdaoui, Limits of Risks Ratios of Shrinkage Estimators under the Balanced Loss Function, J. Sib. Fed. Univ. Math. Phys., 2021, 14(3), 301-312. DOI: 10.17516/1997-1397-2021-14-3-301-312.

Introduction

Multivariate analysis plays an essential role in statistical data analysis, and the estimation of the mean parameters of the multivariate Gaussian distribution is of interest to many users. Stein [1] showed the inadmissibility of the usual estimator when the dimension of the parameter space is greater than or equal to three, by exhibiting an alternative estimator with uniformly smaller risk, the improvement being substantial when the mean is close to the origin. A central focus is the general technique of shrinkage estimation, which is systematically applied to improve on the MLE of the mean parameters. A large amount of research has been carried

*mekki.terbeche@gmail.com †benkhaled08@yahoo.fr

‡abdenour.hamdaoui@yahoo.fr; abdenour.hamdaoui@univ-usto.dz
© Siberian Federal University. All rights reserved

out to develop the properties of shrinkage estimators and to compare them with the MLE. For a selected review of the subject matter of shrinkage estimation, interested readers may refer to Stein [1], James and Stein [2] and Efron and Morris [3].

When the dimension of the parameter space and the sample size are large, Benmansour and Hamdaoui [4] considered the model $X \sim N_p(\theta, \sigma^2 I_p)$, where the parameter $\sigma^2$ is unknown and estimated by $S^2$ ($S^2 \sim \sigma^2\chi_n^2$). The authors established results analogous to those obtained by Casella and Hwang [5]. Benkhaled and Hamdaoui [6] considered the same model, namely $X \sim N_p(\theta, \sigma^2 I_p)$ with $\sigma^2$ unknown. They studied two different forms of shrinkage estimators of $\theta$: estimators of the form $\delta_\psi = \left(1-\psi(S^2,\|X\|^2)S^2/\|X\|^2\right)X$, and estimators of Lindley type given by $\delta_\varphi = \left(1-\varphi(S^2,T^2)S^2/T^2\right)(X-\bar{X})+\bar{X}$, which shrink the components of the MLE $X$ toward the random variable $\bar{X}$. The authors showed that if the shrinkage function $\psi$ (respectively $\varphi$) satisfies conditions different from the known results in the literature, then the estimator $\delta_\psi$ (respectively $\delta_\varphi$) is minimax. When the sample size and the dimension of the parameter space tend to infinity, they studied the behaviour of the risk ratios of these estimators to the MLE. Hamdaoui et al. [7] treated the minimaxity and the limits of risk ratios of shrinkage estimators of a multivariate normal mean in the Bayesian case. The authors considered the model $X \sim N_p(\theta, \sigma^2 I_p)$ with $\sigma^2$ unknown and took the prior law $\theta \sim N_p(\upsilon, \tau^2 I_p)$. They constructed a modified Bayes estimator $\delta_B$ and an empirical modified Bayes estimator $\delta_{EB}$. When $n$ and $p$ are finite, they showed that the estimators $\delta_B$ and $\delta_{EB}$ are minimax. The authors were also interested in studying the limits of the risk ratios of these estimators to the MLE $X$ when $n$ and $p$ tend to infinity. The majority of these authors considered the quadratic loss function for computing the risk.

Zellner [8] proposed a balanced loss function that takes both the error of estimation and the goodness of fit into account; it weights a goodness-of-fit term and a precision-of-estimation term. For estimation under the balanced loss function we also cite, for example, Guikai et al. [9] and Karamikabir et al. [10]. Sanjari Farsipour and Asgharzadeh [11] considered the model where $X_1,\dots,X_n$ is a random sample from $N_p(\theta,\sigma^2)$ with $\sigma^2$ known, the aim being to estimate the parameter $\theta$; they studied the admissibility of estimators of the form $aX+b$ under the balanced loss function. Selahattin and Issam [12] introduced and derived the optimal extended balanced loss function (EBLF) estimators and predictors and discussed their performance. Under the balanced loss function, Hamdaoui et al. [13] studied the behaviour of the risk ratios of the James-Stein estimator and of its positive part to the MLE, when the dimension of the parameter space tends to infinity and the sample size is fixed, and when the dimension of the parameter space and the sample size tend simultaneously to infinity. They showed that these risk ratios tend to values less than 1; thus the authors established the stability of the minimaxity property of the James-Stein estimator and of its positive part for large values of the dimension of the parameter space $p$ and the sample size $n$.

In this work, we deal with the model $X \sim N_p(\theta, \sigma^2 I_p)$, where the parameter $\sigma^2$ is unknown and estimated by $S^2$ ($S^2 \sim \sigma^2\chi_n^2$). Our aim is to estimate the unknown parameter $\theta$ by shrinkage estimators deduced from the MLE. The criterion adopted to compare two estimators is the risk associated with the balanced loss function. The paper is organized as follows. In Section 1, we recall some preliminaries that are useful for our main results. In Section 2, we present the main results. Under the balanced loss function, we consider the general class of shrinkage estimators $\delta_\varphi = \left(1-\varphi(S^2,\|X\|^2)S^2/\|X\|^2\right)X$, which contains the James-Stein estimator, and we study the behaviour of the risk ratio of these estimators to the MLE. Thus we generalize some results obtained in our published papers for the case where the risk functions are calculated relative to the quadratic loss function.

1. Preliminaries

We recall that if $X$ is a multivariate Gaussian random variable $N_p(\theta,\sigma^2 I_p)$ in $\mathbb{R}^p$, then $\frac{\|X\|^2}{\sigma^2}\sim\chi_p^2(\lambda)$, where $\chi_p^2(\lambda)$ denotes the non-central chi-square distribution with $p$ degrees of freedom and non-centrality parameter $\lambda=\frac{\|\theta\|^2}{2\sigma^2}$. We also recall the following definition, given in formula (1.2) of Arnold [14]. It will be used to calculate the expectation of functions of a non-central chi-square random variable.

Definition 1. Let $U\sim\chi_p^2(\lambda)$ be a non-central chi-square random variable with $p$ degrees of freedom and non-centrality parameter $\lambda$. The density function of $U$ is given by

$$f(x)=\sum_{k=0}^{+\infty}\frac{e^{-\lambda}\lambda^k}{k!}\,\frac{x^{\frac p2+k-1}e^{-\frac x2}}{\Gamma\!\left(\frac p2+k\right)2^{\frac p2+k}},\qquad 0<x<+\infty.$$

The right hand side (RHS) of this equality is none other than the formula

$$\sum_{k=0}^{+\infty}\frac{e^{-\lambda}\lambda^k}{k!}\,\chi^2_{p+2k}(x),$$

where $\chi^2_{p+2k}$ denotes the density of the central $\chi^2$ distribution with $p+2k$ degrees of freedom.

From this definition we deduce that if $U \sim \chi_p^2(\lambda)$, then for any function $f:\mathbb R^+\to\mathbb R$ that is $\chi_p^2(\lambda)$-integrable, we have

$$E[f(U)]=E_{\chi_p^2(\lambda)}[f(U)]=\int_0^{+\infty}f(x)\,\chi_p^2(\lambda)(x)\,dx=\sum_{k=0}^{+\infty}\left(\int_0^{+\infty}f(x)\,\chi^2_{p+2k}(x)\,dx\right)P(\lambda;dk),\qquad(1)$$

where $P(\lambda;dk)$ is the Poisson distribution with parameter $\lambda$ and $\chi^2_{p+2k}$ is the central chi-square distribution with $p+2k$ degrees of freedom.

Using the Definition 1 and the Lemma 1 in Benmansour and Hamdaoui [4], we deduce that if $X \sim N_p(\theta,\sigma^2 I_p)$, then

$$\frac{\sigma^2}{p-2+\frac{\|\theta\|^2}{\sigma^2}}\;\le\;E\!\left(\frac{\sigma^4}{\|X\|^2}\right)=\sigma^2E\!\left(\frac{1}{p-2+2K}\right)\;\le\;\frac{\sigma^2\,p}{(p-2)\left(p+\frac{\|\theta\|^2}{\sigma^2}\right)},\qquad(2)$$

where $K\sim\mathcal P\!\left(\frac{\|\theta\|^2}{2\sigma^2}\right)$.
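To make formula (1) and the bounds (2) concrete, here is a small numerical sketch (ours, not part of the paper) that checks them by Monte Carlo for the test function $f(x)=1/x$; it assumes NumPy and SciPy are available, and the values of $p$, $\sigma^2$ and $\theta$ are arbitrary illustrations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
p, sigma2 = 10, 2.0
theta = np.full(p, 0.7)
lam = theta @ theta / (2 * sigma2)   # lambda = ||theta||^2 / (2 sigma^2)

# Left side of (1) with f(x) = 1/x: Monte Carlo over X ~ N_p(theta, sigma^2 I_p),
# i.e. E[ f(||X||^2 / sigma^2) ] = E[ sigma^2 / ||X||^2 ]
X = rng.normal(theta, np.sqrt(sigma2), size=(200_000, p))
lhs = np.mean(sigma2 / (X**2).sum(axis=1))

# Right side of (1): Poisson(lambda) mixture of central chi-square expectations,
# using E[1 / chi^2_{p+2k}] = 1 / (p - 2 + 2k)
k = np.arange(0, 200)
rhs = np.sum(stats.poisson.pmf(k, lam) / (p - 2 + 2 * k))

# Bounds of (2), divided by sigma^2 (recall 2*lambda = ||theta||^2 / sigma^2)
lower = 1 / (p - 2 + 2 * lam)
upper = p / ((p - 2) * (p + 2 * lam))
print(f"Monte Carlo E[sigma^2/||X||^2] ~ {lhs:.5f}, Poisson-mixture value {rhs:.5f}")
print(f"bounds from (2): {lower:.5f} <= {rhs:.5f} <= {upper:.5f}")
```

The two estimates of $E(\sigma^2/\|X\|^2)$ should agree up to Monte Carlo error and lie between the two bounds.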

We recall the following lemma, given by Stein [15], which will be used often in what follows.

Lemma 1. Let $X\sim N(\mu,\sigma^2)$ be a real random variable and let $f:\mathbb R\to\mathbb R$ be an indefinite integral of the Lebesgue measurable function $f'$, essentially the derivative of $f$. Suppose also that $E|f'(X)|<+\infty$. Then

$$E\!\left(\frac{X-\mu}{\sigma^2}\,f(X)\right)=E\big(f'(X)\big).$$
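As a quick illustration of Lemma 1 (ours, not from the paper), the following minimal sketch checks Stein's identity by simulation for the test function $f(x)=x^3$, assuming NumPy; the constants are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2 = 1.5, 0.8
x = rng.normal(mu, np.sqrt(sigma2), size=2_000_000)

# f(x) = x^3, hence f'(x) = 3 x^2; Lemma 1 asserts E[((X - mu)/sigma^2) f(X)] = E[f'(X)]
lhs = np.mean((x - mu) / sigma2 * x**3)
rhs = np.mean(3 * x**2)
print(f"E[((X-mu)/sigma^2) f(X)] ~ {lhs:.4f},  E[f'(X)] ~ {rhs:.4f}")
print("exact value of E[3 X^2]:", 3 * (sigma2 + mu**2))
```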

In what follows, assume that $X \sim N_p(\theta,\sigma^2 I_p)$, where $\sigma^2$ is unknown and estimated by $S^2$ ($S^2 \sim \sigma^2\chi_n^2$). Our aim is to estimate the unknown parameter $\theta$ under the balanced loss function defined, for any estimator $\delta$ of $\theta$, by

$$L_u(\delta,\theta)=u\,\|\delta-\delta_0\|^2+(1-u)\,\|\delta-\theta\|^2,$$

where $0<u<1$ and $\delta_0$ is a target estimator. We associate to this balanced loss function the risk function defined by

$$R_u(\delta,\theta)=E\big(L_u(\delta,\theta)\big).$$

In this model, the target estimator is the MLE $\delta_0=X$, and its risk function is $(1-u)p\sigma^2$. Indeed,

$$R_u(X,\theta)=u\,E\big(\|X-X\|^2\big)+(1-u)\,E\big(\|X-\theta\|^2\big)=(1-u)\,E\big(\|X-\theta\|^2\big).$$

As $X\sim N_p(\theta,\sigma^2 I_p)$, then $\frac{X-\theta}{\sigma}\sim N_p(0,I_p)$, thus $\frac{\|X-\theta\|^2}{\sigma^2}\sim\chi_p^2$. Hence

$$E\big(\|X-\theta\|^2\big)=E\big(\sigma^2\chi_p^2\big)=\sigma^2 p.$$

It is well known that $\delta_0=X$ is minimax and inadmissible for $p\ge3$; thus any estimator dominating it is also minimax.

Now, we consider the shrinkage estimator

$$\delta_\varphi=\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)X.\qquad(3)$$

In the special case where $\varphi(S^2,\|X\|^2)=a$, i.e. $\delta_a=\left(1-a\frac{S^2}{\|X\|^2}\right)X$, where $a$ is a real constant that may depend on $n$ and $p$, it is easy to show that a sufficient condition for $\delta_a$ to dominate the MLE, and thus to be minimax, is

$$0<a<\frac{2(p-2)(1-u)}{n+2}.$$

For $a=\hat a=\frac{(1-u)(p-2)}{n+2}$, we obtain the estimator that minimizes the risk function among the estimators $\delta_a$; it is called the James-Stein estimator and is given by

$$\delta_{JS}=\delta_{\hat a}=\left(1-\hat a\frac{S^2}{\|X\|^2}\right)X=\left(1-\frac{(1-u)(p-2)}{n+2}\,\frac{S^2}{\|X\|^2}\right)X.\qquad(4)$$

Using the Definition 1 and the Lemma 1, one can prove that the risk function of $\delta_{JS}$ is

$$R_u(\delta_{JS},\theta)=(1-u)p\sigma^2-(1-u)^2(p-2)^2\,\frac{n}{n+2}\,\sigma^2\,E\!\left(\frac{1}{p-2+2K}\right),\qquad(5)$$

where $K\sim\mathcal P\!\left(\frac{\|\theta\|^2}{2\sigma^2}\right)$.

From formula (5), it is clear that the James-Stein estimator $\delta_{JS}$ dominates the MLE, thus it is minimax. Furthermore, Theorem 4.1 of Hamdaoui et al. [13] shows that, if $\lim_{p\to+\infty}\frac{\|\theta\|^2}{p\sigma^2}=c$,

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_{JS},\theta)}{R_u(X,\theta)}=\frac{u+c}{1+c}.\qquad(6)$$

Then one can deduce that the James-Stein estimator dominates the MLE for large values of $n$ and $p$.
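The following Monte Carlo sketch (again ours, written under the model of this section) estimates the balanced-loss risk ratio of $\delta_{JS}$ to the MLE for one moderately large pair $(n,p)$ and compares it with the limit $(u+c)/(1+c)$ of (6); the vector $\theta$ is chosen so that $\|\theta\|^2/(p\sigma^2)=c$ exactly, and all numerical settings are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
p, n, sigma2, u, c = 200, 200, 1.0, 0.4, 0.5
theta = np.full(p, np.sqrt(c * sigma2))      # so that ||theta||^2 / (p sigma^2) = c
reps = 20_000

X = rng.normal(theta, np.sqrt(sigma2), size=(reps, p))
S2 = sigma2 * rng.chisquare(n, size=reps)
a_hat = (1 - u) * (p - 2) / (n + 2)
shrink = 1.0 - a_hat * S2 / (X**2).sum(axis=1)
delta = shrink[:, None] * X

# balanced loss L_u(delta, theta) = u ||delta - X||^2 + (1 - u) ||delta - theta||^2
loss_js = u * ((delta - X)**2).sum(axis=1) + (1 - u) * ((delta - theta)**2).sum(axis=1)
loss_mle = (1 - u) * ((X - theta)**2).sum(axis=1)   # for delta_0 = X the first term vanishes

ratio = loss_js.mean() / loss_mle.mean()
print(f"empirical risk ratio  : {ratio:.4f}")
print(f"limit (u + c)/(1 + c) : {(u + c)/(1 + c):.4f}")
```

For $n$ and $p$ of this order the empirical ratio should already be close to the limiting value.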

2. Main results

We first need the following lemma, which gives an explicit formula for the difference between the risk function of the estimator $\delta_\varphi$ given in (3) and that of the James-Stein estimator; this formula helps us to compute the limit of the risk ratio.


Lemma 2. Consider the estimator $\delta_\varphi$ given in (3). Then

$$\Delta_{\varphi,JS}:=R_u(\delta_\varphi,\theta)-R_u(\delta_{JS},\theta)=E\!\left(\Big((d-\varphi(S^2,\|X\|^2))^2-2d\,(d-\varphi(S^2,\|X\|^2))\Big)\frac{(S^2)^2}{\|X\|^2}\right)$$
$$+\;2(1-u)\,E\!\left((d-\varphi(S^2,\|X\|^2))S^2-2\lambda\,(d-\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2}(\lambda)))\,\frac{\sigma^2\chi^2_n}{\chi^2_{p+2}(\lambda)}\right),$$

where $\delta_{JS}=\left(1-d\,\frac{S^2}{\|X\|^2}\right)X$, $d=\frac{(1-u)(p-2)}{n+2}$ and $\lambda=\frac{\|\theta\|^2}{2\sigma^2}$.

Proof.

$$R_u(\delta_\varphi,\theta)=u\,E(\|\delta_\varphi-X\|^2)+(1-u)\,E(\|\delta_\varphi-\theta\|^2)$$
$$=u\,E(\|\delta_\varphi-\delta_{JS}+\delta_{JS}-X\|^2)+(1-u)\,E(\|\delta_\varphi-\delta_{JS}+\delta_{JS}-\theta\|^2)$$
$$=u\big\{E(\|\delta_\varphi-\delta_{JS}\|^2)+E(\|\delta_{JS}-X\|^2)+2E\langle\delta_\varphi-\delta_{JS},\delta_{JS}-X\rangle\big\}+(1-u)\big\{E(\|\delta_\varphi-\delta_{JS}\|^2)+E(\|\delta_{JS}-\theta\|^2)+2E\langle\delta_\varphi-\delta_{JS},\delta_{JS}-\theta\rangle\big\}$$
$$=R_u(\delta_{JS},\theta)+E(\|\delta_\varphi-\delta_{JS}\|^2)+2E\langle\delta_\varphi-\delta_{JS},\delta_{JS}-X\rangle+2(1-u)E\langle\delta_\varphi-\delta_{JS},X-\theta\rangle,$$

where the last line uses $\delta_{JS}-\theta=(\delta_{JS}-X)+(X-\theta)$. As

$$E\langle\delta_\varphi-\delta_{JS},X-\theta\rangle=E\!\left((d-\varphi(S^2,\|X\|^2))\frac{S^2}{\|X\|^2}\langle X,X-\theta\rangle\right)=E\big((d-\varphi(S^2,\|X\|^2))S^2\big)-E\!\left(\langle X,\theta\rangle(d-\varphi(S^2,\|X\|^2))\frac{S^2}{\|X\|^2}\right)$$
$$=E\big((d-\varphi(S^2,\|X\|^2))S^2\big)-2\lambda\,E\!\left((d-\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2}(\lambda)))\,\frac{\sigma^2\chi^2_n}{\chi^2_{p+2}(\lambda)}\right),\qquad(7)$$

$$E(\|\delta_\varphi-\delta_{JS}\|^2)=E\!\left((d-\varphi(S^2,\|X\|^2))^2\,\frac{(S^2)^2}{\|X\|^2}\right),\qquad(8)$$

$$E\langle\delta_\varphi-\delta_{JS},\delta_{JS}-X\rangle=-E\!\left(d\,(d-\varphi(S^2,\|X\|^2))\,\frac{(S^2)^2}{\|X\|^2}\right).\qquad(9)$$

The last equality in (7) comes from the conditional expectation and formula (2.7) given in Benmansour and Mourid [16]. Using the formulas (7)-(9), we deduce the desired result. $\Box$

Theorem 1. Consider the estimator $\delta_\varphi$ given in (3), where $\varphi$ satisfies the conditions

(H1) $\varphi\ge\dfrac{(1-u)^{1/2}(p-2)}{n+2}$;

(H2) $|d-\varphi|\le g(S^2)$ a.s., where $E\big[(g^2(S^2))^{1+\gamma}\big]=O\!\left(\dfrac{1}{n^{2(1+\gamma)}}\right)$ for some $\gamma>0$, in the neighbourhood of $+\infty$.

If $\lim_{p\to+\infty}\dfrac{\|\theta\|^2}{p\sigma^2}=c$, then

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}=\frac{u+c}{1+c}.$$

Proof. From (H2) we have

$$\Delta_{\varphi,JS}\le E\!\left(\Big((d-\varphi(S^2,\|X\|^2))^2+2d\,|d-\varphi(S^2,\|X\|^2)|\Big)\frac{(S^2)^2}{\|X\|^2}\right)+2(1-u)\,E\!\left(|d-\varphi(S^2,\|X\|^2)|\,S^2+2\lambda\,|d-\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2}(\lambda))|\,\frac{\sigma^2\chi^2_n}{\chi^2_{p+2}(\lambda)}\right)$$
$$\le E\!\left(g^2(S^2)\frac{(S^2)^2}{\|X\|^2}\right)+2d\,E\!\left(g(S^2)\frac{(S^2)^2}{\|X\|^2}\right)+2(1-u)\,E\big(g(S^2)S^2\big)+4(1-u)\lambda\,E\!\left(g(S^2)\frac{\sigma^2\chi^2_n}{\chi^2_{p+2}(\lambda)}\right).$$

From the independence between $\|X\|^2$ and $S^2$ and the Hölder inequality, we get

$$\Delta_{\varphi,JS}\le E^{\frac{1}{1+\gamma}}\big((g(S^2))^{2(1+\gamma)}\big)\,E^{\frac{\gamma}{1+\gamma}}\big((S^2)^{\frac{2(1+\gamma)}{\gamma}}\big)\,E\!\left(\frac{1}{\|X\|^2}\right)+2d\,E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\,E^{\frac{1+2\gamma}{2(1+\gamma)}}\big((S^2)^{\frac{4(1+\gamma)}{1+2\gamma}}\big)\,E\!\left(\frac{1}{\|X\|^2}\right)$$
$$+2(1-u)\,E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\,E^{\frac{1+2\gamma}{2(1+\gamma)}}\big((S^2)^{\frac{2(1+\gamma)}{1+2\gamma}}\big)+4(1-u)\lambda\,E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\,E^{\frac{1+2\gamma}{2(1+\gamma)}}\big((S^2)^{\frac{2(1+\gamma)}{1+2\gamma}}\big)\,E\!\left(\frac{1}{\chi^2_{p+2}(\lambda)}\right).$$

As

$$E\!\left(\frac{1}{\|X\|^2}\right)=\frac{1}{\sigma^2}E\!\left(\frac{1}{p-2+2K}\right)\le\frac{1}{\sigma^2(p-2)},\qquad E\!\left(\frac{1}{\chi^2_{p+2}(\lambda)}\right)=E\!\left(\frac{1}{p+2K}\right)\le\frac{1}{p-2},\qquad d=\frac{(1-u)(p-2)}{n+2},$$

we obtain

$$\frac{\Delta_{\varphi,JS}}{R_u(X,\theta)}\le\frac{4}{(1-u)p(p-2)}\,E^{\frac{1}{1+\gamma}}\big((g(S^2))^{2(1+\gamma)}\big)\left(\frac{\Gamma\!\left(\frac n2+\frac{2(1+\gamma)}{\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{\gamma}{1+\gamma}}+\frac{8}{p(n+2)}\,E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\left(\frac{\Gamma\!\left(\frac n2+\frac{4(1+\gamma)}{1+2\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{1+2\gamma}{2(1+\gamma)}}$$
$$+\frac4p\,E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\left(\frac{\Gamma\!\left(\frac n2+\frac{2(1+\gamma)}{1+2\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{1+2\gamma}{2(1+\gamma)}}+\frac{4}{p-2}\,\frac{\|\theta\|^2}{p\sigma^2}\,E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\left(\frac{\Gamma\!\left(\frac n2+\frac{2(1+\gamma)}{1+2\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{1+2\gamma}{2(1+\gamma)}}.$$

Now, from Stirling's formula, which states that in the neighbourhood of $+\infty$

$$\Gamma(y+1)\sim\sqrt{2\pi}\,y^{y+\frac12}e^{-y},$$

and the fact that

$$\lim_{n\to+\infty}\left(1+\frac an\right)^n=e^a,$$

we have, in the neighbourhood of $+\infty$,

$$\left(\frac{\Gamma\!\left(\frac n2+\frac{2(1+\gamma)}{\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{\gamma}{1+\gamma}}\sim\left(\frac n2\right)^2,\qquad\left(\frac{\Gamma\!\left(\frac n2+\frac{4(1+\gamma)}{1+2\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{1+2\gamma}{2(1+\gamma)}}\sim\left(\frac n2\right)^2,\qquad\left(\frac{\Gamma\!\left(\frac n2+\frac{2(1+\gamma)}{1+2\gamma}\right)}{\Gamma\!\left(\frac n2\right)}\right)^{\frac{1+2\gamma}{2(1+\gamma)}}\sim\frac n2.$$

Using the condition $E\big((g(S^2))^{2(1+\gamma)}\big)=E\big((g^2(S^2))^{1+\gamma}\big)=O\!\left(\frac{1}{n^{2(1+\gamma)}}\right)$, there exists $M>0$ such that, in the neighbourhood of $+\infty$,

$$E^{\frac{1}{1+\gamma}}\big((g(S^2))^{2(1+\gamma)}\big)\le\frac{M^{\frac{1}{1+\gamma}}}{n^2},\qquad E^{\frac{1}{2(1+\gamma)}}\big((g(S^2))^{2(1+\gamma)}\big)\le\frac{M^{\frac{1}{2(1+\gamma)}}}{n}.$$

Hence

$$\lim_{n,p\to+\infty}\frac{\Delta_{\varphi,JS}}{R_u(X,\theta)}\le\lim_{n,p\to+\infty}\left\{\frac{M^{\frac{1}{1+\gamma}}}{(1-u)p(p-2)}+\frac{2n\,M^{\frac{1}{2(1+\gamma)}}}{p(n+2)}+\frac{2M^{\frac{1}{2(1+\gamma)}}}{p}+\frac{2M^{\frac{1}{2(1+\gamma)}}}{p-2}\,\frac{\|\theta\|^2}{p\sigma^2}\right\}=0,$$

since $\lim_{p\to+\infty}\|\theta\|^2/(p\sigma^2)=c$ is finite. From formula (6), we get

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}\le\lim_{n,p\to+\infty}\frac{R_u(\delta_{JS},\theta)}{R_u(X,\theta)}=\frac{u+c}{1+c}.\qquad(11)$$

On the other hand,

$$R_u(\delta_\varphi,\theta)=u\,E\!\left(\varphi^2(S^2,\|X\|^2)\frac{(S^2)^2}{\|X\|^2}\right)+(1-u)\,E\left\|\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)X-\theta\right\|^2,$$

and

$$E\left\|\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)X-\theta\right\|^2=E\left\{\sum_{i=1}^p\left[\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)X_i-\theta_i\right]^2\right\}$$
$$=E\left\{\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)^2\|X\|^2\right\}+\sum_{i=1}^p\theta_i^2-2E\left\{\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)\sum_{i=1}^pX_i\theta_i\right\}.$$

Using (b) of Lemma 3.1 in Hamdaoui and Benmansour [17], we obtain

$$E\left\|\left(1-\varphi(S^2,\|X\|^2)\frac{S^2}{\|X\|^2}\right)X-\theta\right\|^2=\sigma^2E\left\{\left(1-\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2K})\frac{\chi^2_n}{\chi^2_{p+2K}}\right)^2\chi^2_{p+2K}+2K-2\left(1-\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2K})\frac{\chi^2_n}{\chi^2_{p+2K}}\right)2K\right\}$$
$$=\sigma^2E\left\{\left(\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2K})\frac{\chi^2_n}{\chi^2_{p+2K}}\sqrt{\chi^2_{p+2K}}-\frac{\chi^2_{p+2K}-2K}{\sqrt{\chi^2_{p+2K}}}\right)^2\right\}+\sigma^2E\left\{p-\frac{(\chi^2_{p+2K}-2K)^2}{\chi^2_{p+2K}}\right\},$$

where $K\sim\mathcal P(\lambda)$. Using the conditional expectation with respect to $K$, we get

$$E\left\{p-\frac{(\chi^2_{p+2K}-2K)^2}{\chi^2_{p+2K}}\right\}=E\left\{E\left(p-\frac{(\chi^2_{p+2K}-2K)^2}{\chi^2_{p+2K}}\,\Big|\,K\right)\right\}=E\left\{p-(p+2K)+4K-\frac{4K^2}{p-2+2K}\right\}=E\left\{p-2-\frac{(p-2)^2}{p-2+2K}\right\}.$$

From the hypothesis (H1), the independence between $S^2$ and $\|X\|^2$, and dropping the first (non-negative) term of the last decomposition, we deduce that

$$R_u(\delta_\varphi,\theta)\ge u\,\frac{(1-u)(p-2)^2n\,\sigma^2}{n+2}\,E\!\left(\frac{1}{p-2+2K}\right)+(1-u)\,\sigma^2E\left\{p-2-\frac{(p-2)^2}{p-2+2K}\right\}.$$

Using the last formula and the formula (2), we have

$$\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}\ge\frac{u(p-2)^2n}{p(n+2)\left(p-2+\frac{\|\theta\|^2}{\sigma^2}\right)}+\frac1p\left(p-2-\frac{(p-2)^2\,p}{(p-2)\left(p+\frac{\|\theta\|^2}{\sigma^2}\right)}\right).$$

From the condition $\lim_{p\to+\infty}\frac{\|\theta\|^2}{p\sigma^2}=c$, we obtain

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}\ge\frac{u}{1+c}+1-\frac{1}{1+c}=\frac{u+c}{1+c}.\qquad(12)$$

The formulas (11) and (12) give the desired result. $\Box$

The following proposition gives the same result as Theorem 1 for a particular class of shrinkage functions $\varphi$. Indeed, we now only require that $g$ belong to $L^2$, and not to $L^{2(1+\gamma)}$, but with the additional constraint that the function $g$ is monotone non-increasing.

Proposition 1. Consider the estimator $\delta_\varphi$ given in (3), where $\varphi$ satisfies the conditions

(H1) $\varphi\ge\dfrac{(1-u)^{1/2}(p-2)}{n+2}$;

(H2) $|d-\varphi|\le g(S^2)$ a.s., where $g$ is monotone non-increasing and $E\big(g^2(S^2)\big)=O\!\left(\dfrac{1}{n^2}\right)$ in the neighbourhood of $+\infty$.

If $\lim_{p\to+\infty}\dfrac{\|\theta\|^2}{p\sigma^2}=c$, then

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}=\frac{u+c}{1+c}.$$

Proof. From (H2) we have

$$\Delta_{\varphi,JS}\le E\!\left(\Big((d-\varphi(S^2,\|X\|^2))^2+2d\,|d-\varphi(S^2,\|X\|^2)|\Big)\frac{(S^2)^2}{\|X\|^2}\right)+2(1-u)\,E\!\left(|d-\varphi(S^2,\|X\|^2)|\,S^2+2\lambda\,|d-\varphi(\sigma^2\chi^2_n,\sigma^2\chi^2_{p+2}(\lambda))|\,\frac{\sigma^2\chi^2_n}{\chi^2_{p+2}(\lambda)}\right)$$
$$\le E\!\left(g^2(S^2)\frac{(S^2)^2}{\|X\|^2}\right)+2d\,E\!\left(g(S^2)\frac{(S^2)^2}{\|X\|^2}\right)+2(1-u)\,E\big(g(S^2)S^2\big)+4(1-u)\lambda\,E\!\left(g(S^2)\frac{\sigma^2\chi^2_n}{\chi^2_{p+2}(\lambda)}\right).$$

As $g$ is monotone non-increasing, the covariance of two functions of $S^2$, one non-decreasing and the other non-increasing, is non-positive, so that $E\big(g^2(S^2)(S^2)^2\big)\le E\big(g^2(S^2)\big)E\big((S^2)^2\big)$, $E\big(g(S^2)(S^2)^2\big)\le E\big(g(S^2)\big)E\big((S^2)^2\big)$ and $E\big(g(S^2)S^2\big)\le E\big(g(S^2)\big)E(S^2)$. Using also $E\!\left(\frac{1}{\|X\|^2}\right)=\frac{1}{\sigma^2}E\!\left(\frac{1}{p-2+2K}\right)$ and $d=\frac{(1-u)(p-2)}{n+2}$, we obtain

$$\Delta_{\varphi,JS}\le E\big(g^2(S^2)\big)\,\sigma^2n(n+2)\,E\!\left(\frac{1}{p-2+2K}\right)+\frac{2(1-u)(p-2)}{n+2}\,E\big(g(S^2)\big)\,\sigma^2n(n+2)\,E\!\left(\frac{1}{p-2+2K}\right)+2n(1-u)\,E\big(g(S^2)\big)\left(\sigma^2+2\lambda\sigma^2E\!\left(\frac{1}{p+2K}\right)\right).$$

Then

$$\frac{\Delta_{\varphi,JS}}{R_u(X,\theta)}\le\frac{n(n+2)}{(1-u)p}\,E\big(g^2(S^2)\big)E\!\left(\frac{1}{p-2+2K}\right)+\frac{2n(p-2)}{p}\,E\big(g(S^2)\big)E\!\left(\frac{1}{p-2+2K}\right)+\frac{2n}{p}\,E\big(g(S^2)\big)+2n\,\frac{\|\theta\|^2}{p\sigma^2}\,E\big(g(S^2)\big)E\!\left(\frac{1}{p+2K}\right).$$

From the condition $E\big(g^2(S^2)\big)=O\!\left(\frac{1}{n^2}\right)$ and using the Schwarz inequality, when $n$ is in the neighbourhood of $+\infty$ we obtain

$$E\big(g(S^2)\big)\le E^{1/2}\big(g^2(S^2)\big)\le\frac{\sqrt M}{n},$$

where $M$ is a strictly positive real number. Then, using $E\!\left(\frac{1}{p-2+2K}\right)\le\frac{1}{p-2}$ and $E\!\left(\frac{1}{p+2K}\right)\le\frac{1}{p-2}$, when $n$ is in the neighbourhood of $+\infty$ we have

$$\frac{\Delta_{\varphi,JS}}{R_u(X,\theta)}\le\frac{M(n+2)}{n(1-u)p(p-2)}+\frac{2\sqrt M}{p}+\frac{2\sqrt M}{p}+\frac{2\sqrt M}{p-2}\,\frac{\|\theta\|^2}{p\sigma^2}.$$

As $\lim_{p\to+\infty}\frac{\|\theta\|^2}{p\sigma^2}=c$, then

$$\lim_{n,p\to+\infty}\frac{\Delta_{\varphi,JS}}{R_u(X,\theta)}\le0,$$

thus

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}\le\lim_{n,p\to+\infty}\frac{R_u(\delta_{JS},\theta)}{R_u(X,\theta)}=\frac{u+c}{1+c}.$$

The proof of

$$\lim_{n,p\to+\infty}\frac{R_u(\delta_\varphi,\theta)}{R_u(X,\theta)}\ge\frac{u+c}{1+c}$$

is the same as that given in Theorem 1. $\Box$

Conclusion

In this work, we studied the estimation of the mean of the multivariate normal distribution $X \sim N_p(\theta,\sigma^2 I_p)$ under the balanced loss function. We considered the class of estimators defined by $\delta_\varphi=\left(1-\varphi(S^2,\|X\|^2)S^2/\|X\|^2\right)X$, which are not necessarily minimax and which contains the James-Stein estimator $\delta_{JS}$, and we established sufficient conditions under which the estimators $\delta_\varphi$ dominate the MLE $X$ in the case where the dimension of the parameter space $p$ and the sample size $n$ are large. If the ratio $\|\theta\|^2/(p\sigma^2)$ tends to a constant $c>0$ when $p$ tends to infinity, we showed that the risk ratio $R_u(\delta_\varphi,\theta)/R_u(X,\theta)$ tends to $(u+c)/(1+c)$ ($0<u<1$) when $n$ and $p$ tend simultaneously to infinity. Thus the estimators $\delta_\varphi$, which are not necessarily minimax, dominate the MLE $X$ even when the dimension of the parameter space $p$ and the sample size $n$ tend simultaneously to infinity. An extension of this work would be to obtain similar results in the case where the model has a spherically symmetric distribution.

The authors would like to thank the editor and the referees for their comments, insightful suggestions, and careful reading of the manuscript. This work was supported by the Thematic Research Agency in Science and Technology (ATRST-Algeria).

References

[1] C.Stein, Inadmissibility of the usual estimator for the mean of a multivariate normal distribution, Proc. 3rd Berkeley Symp. Math. Statist. Prob., Univ. of California Press, Berkeley, 1(1956), 197-206.

[2] W.James, C.Stein, Estimation with quadratic loss, Proc. 4th Berkeley Symp. Math. Statist. Prob., Univ. of California Press, Berkeley, 1(1961), 361-379.

[3] B.Efron, C.N.Morris, Stein's estimation rule and its competitors: An empirical Bayes approach, J. Amer. Statist. Assoc., 68(1973), 117-130 .

[4] D.Benmansour, A.Hamdaoui, Limit of the ratio of risks of James-Stein estimators with unknown variance, Far East J. Theo. Stat., 36(2011), no. 1, 31-53.

[5] G.Casella, J.T.Hwang, Limit expressions for the risk of the James-Stein estimators, Canad. J.Statist., 4(1982), 305-309.

[6] A.Benkhaled, A.Hamdaoui, General classes of shrinkage estimators for the multivariate normal mean with unknown variance: minimaxity and limit of risks ratios, Kragujevac J. Math, 46(2019), no. 2, 193-213.

[7] A.Hamdaoui, A.Benkhaled, N.Mezouar, Minimaxity and limits of risks ratios of shrinkage estimators of a multivariate normal mean in the bayesian case, Stat., Optim. Inf. Comput., 8(2020), 507-520.

[8] A.Zellner, Bayesian and non-Bayesian estimation using balanced loss functions, In: J.O.Berger, S.S.Gupta (eds.), Statistical Decision Theory and Methods, Vol. 7, 1994, Springer, New York, 337-390.

[9] H.Guikai, L.Qingguo, and Y.Shenghua, Risk Comparison of Improved Estimators in a Linear Regression Model with Multivariate t Errors under Balanced Loss Function, J. Appl. Math., 354(2014), 1-7.

[10] H.Karamikabir, M.Afshari, M.Arashi, Shrinkage estimation of non-negative mean vector with unknown covariance under balance loss, J. Inequal. Appl, (2018), 1-11.

[11] N.Sanjari Farsipour, A.Asgharzadeh, Estimation of a normal mean relative to balanced loss functions, Statistical Papers, 45(2004), 279-286.

[12] K.Selahattin, D.Issam, The optimal extended balanced loss function estimators, J. Comput. Appl. Math., 345(2019), 86-98.

[13] A.Hamdaoui, A.Benkhaled, M.Terbeche, On Minimaxity and limit of risks ratios of JamesStein estimator under the balanced loss function, Kragujevac J. Math., 47(2020), no. 3, 459-479.

[14] S.F.Arnold, The Theory of Linear Models and Multivariate Analysis, John Wiley and Sons, Inc., New York, 1981, 9-10.

[15] C.Stein, Estimation of the mean of a multivariate normal distribution, Ann. Statis., 9(1981), no. 6, 1135-1151.

[16] D.Benmansour, T.Mourid, Etude d'une classe d'estimateurs avec retrecisseur de la moyenne d'une loi gaussienne, Ann. I.S.U.P, 51(2007), 83-106.

[17] A.Hamdaoui, D.Benmansour, Asymptotic properties of risks ratios of shrinkage estimators, Hacet. J. Math. Stat., 44(2015), no. 5, 1181-1195.
