
ISSN 2074-1871. Ufa Mathematical Journal. Vol. 13. No. 4 (2021). P. 126-133.

LAW OF LARGE NUMBERS FOR WEAKLY DEPENDENT RANDOM VARIABLES WITH VALUES IN D [0,1]

O.Sh. SHARIPOV, A.F. NORJIGITOV

Abstract. Limit theorems in Banach spaces are important, in particular, because of applications in functional data analysis. This paper is devoted to the law of large numbers for random variables with values in the space D[0,1]. This space is not separable if we consider it with the supremum norm, and it is difficult to prove limit theorems in this space. The law of large numbers is well studied for sequences of independent D[0,1]-valued random variables. It is known that in the case of independent and identically distributed random variables with values in D[0,1] the existence of the first moment of the norm of the random functions is a necessary and sufficient condition for the strong law of large numbers. The law of large numbers for sequences of independent and not necessarily identically distributed random variables with values in D[0,1] was proved as well. Our main goal is to prove the law of large numbers for weakly dependent random variables with values in the space D[0,1]. Namely, we consider sequences of mixing random variables with values in D[0,1]. Mixing conditions for D[0,1]-valued random variables can be introduced in several ways. One can assume that the random functions themselves satisfy mixing conditions. We consider a slightly different condition. In fact, we assume that the finite-dimensional projections of the D[0,1]-valued random variables satisfy a mixing condition. This is a weaker condition than assuming that the random functions themselves satisfy a mixing condition. In the paper the law of large numbers for ρ_m-mixing sequences of D[0,1]-valued random variables is proved.

Keywords: Law of large numbers, mixing sequence, D [0,1] space. Mathematics Subject Classification: 60F05, 60D05

1. Introduction

The law of large numbers for sequences of random variables with values in Banach spaces has been investigated by many authors, see [1]-[8]. It is known that the validity of the law of large numbers (and also of the strong law of large numbers) depends mainly on the geometry of Banach spaces, see [1], [2], [5], [6] and references therein. The aim of this note is to prove the law of large numbers for weakly dependent random variables with values in D[0,1], which is the space of all real-valued functions on [0,1] that are right continuous and have left limits, equipped with the norm

$$\|x\| = \sup_{t\in[0,1]} |x(t)|.$$

We note that the space D[0,1] is not separable and does not belong to the known types of Banach spaces, see [5].

O.Sh. Sharipov, A.F. Norjigitov, Law of large numbers for weakly dependent random variables with values in D [0, 1].

© Sharipov O.Sh., Norjigitov A.F. 2021.

The research of the first author was supported by the Collaborative Research Grant SFB 823 "Statistical modelling of nonlinear dynamic processes" (Germany). Submitted September 27, 2020.

Let {X_n(t), t ∈ [0,1], n ≥ 1} be a sequence of D[0,1]-valued random variables. We say that the sequence {X_n(t), t ∈ [0,1], n ≥ 1} with EX_k(t) = 0 satisfies the law of large numbers if

$$\frac{1}{n}\bigl(X_1(t) + \dots + X_n(t)\bigr) \to 0 \quad\text{in probability}$$

as n → ∞ in D[0,1], and we say that the sequence satisfies the strong law of large numbers if the above convergence holds almost surely.
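As a toy numerical illustration of this definition (not part of the paper; the grid, the step-function example and all names below are our own choices), one can represent càdlàg paths on a fine grid, average them, and watch the approximate sup-norm of the average shrink. A minimal Python sketch with independent centred random step functions X_k(t) = 1{t ≥ U_k} − t:

    import numpy as np

    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 1.0, 1001)            # discretization of [0, 1]

    def centred_step_path(u):
        """Cadlag path t -> 1{t >= u} - t; its mean function is identically zero."""
        return (grid >= u).astype(float) - grid

    for n in (10, 100, 1000, 10000):
        jumps = rng.uniform(0.0, 1.0, size=n)
        mean_path = np.mean([centred_step_path(u) for u in jumps], axis=0)
        print(n, np.max(np.abs(mean_path)))       # approximate sup-norm of (X_1 + ... + X_n)/n

The printed sup-norms decrease with n (roughly like n^(-1/2) in this i.i.d. example); the results below concern the dependent case.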

The laws of large numbers in D[0,1] for sequences of independent random elements were studied in [9]-[12]. The following theorems were proved in [9]-[11].

Theorem 1.1. [9]. Let {X_k} be a sequence of independent and identically distributed random variables with values in D[0,1]. Then {X_k} satisfies the strong law of large numbers in D[0,1] if and only if E‖X_1‖ < ∞ and EX_1 = 0.

Theorem 1.2. [10]. Let {X_n} be a sequence of independent convex tight random elements in D[0,1] satisfying

$$\sup_{n} E\|X_n\|^r \le C,$$

where r > 1 and C is a constant. Then, almost surely,

$$d\Bigl(n^{-1}\sum_{k=1}^{n}X_k,\; n^{-1}\sum_{k=1}^{n}EX_k\Bigr) \to 0,$$

where d(x, y) is Skorokhod's metric.

Theorem 1.3. [11]. Let {X_k} be a sequence of independent random variables with values in D[0,1]. Suppose that there exist nondecreasing continuous functions φ and ψ on [0,1] such that for all 0 ≤ s ≤ t ≤ u ≤ 1 the following conditions hold:

$$EX_k^2(t) < \infty \ \text{ for all } t\in[0,1],\ k=1,2,\dots,$$

$$\frac{1}{n}\sum_{k=1}^{n}E\left\{|X_k(s)-X_k(t)|^2 \wedge |X_k(t)-X_k(u)|^2\right\} \le \varphi^2(u-s), \qquad \int^{\infty}\varphi(x^{-2})\,dx < \infty,$$

$$\frac{1}{n}\sum_{k=1}^{n}E\left\{|X_k(u)-X_k(s)|^2\right\} \le \psi^2(u-s), \qquad \int^{\infty}\psi(x^{-4})\,dx < \infty,$$

$$\sum_{k=1}^{\infty}\frac{E\|X_k\|^p}{k^p} < \infty \ \text{ for some } 1\le p\le 2,$$

where a ∧ b = min{a, b}. Then {X_n(t), t ∈ [0,1], n ≥ 1} satisfies the law of large numbers in D[0,1].

Our aim is to establish the law of large numbers for mixing sequences. Mixing coefficients for a given sequence {X_n(t), t ∈ [0,1], n ≥ 1} of D[0,1]-valued random variables are defined as follows:

$$\rho(n) = \sup\left\{ \frac{|E(\xi - E\xi)(\eta - E\eta)|}{E^{1/2}(\xi - E\xi)^2\, E^{1/2}(\eta - E\eta)^2} :\ \xi \in L_2(F_1^k),\ \eta \in L_2(F_{n+k}^{\infty}),\ k \in \mathbb{N} \right\},$$

$$\rho_m(n) = \sup_{\Pi_m}\sup_{k\in\mathbb{N}}\left\{ \frac{|E(\xi - E\xi)(\eta - E\eta)|}{E^{1/2}(\xi - E\xi)^2\, E^{1/2}(\eta - E\eta)^2} :\ \xi \in L_2(F_1^k(m)),\ \eta \in L_2(F_{n+k}^{\infty}(m)) \right\},$$

where F_a^b is the σ-field generated by the random processes X_a(t), ..., X_b(t), F_a^b(m) is the σ-field generated by the random processes Π_m X_a(t), ..., Π_m X_b(t), Π_m : D[0,1] → R^m is a projection operator from D[0,1] to R^m, and L_2(F) is the space of all square integrable and F-measurable random variables.

We say that {X_n(t), t ∈ [0,1], n ≥ 1} is a ρ_m-mixing sequence if ρ_m(n) → 0 as n → ∞ for each m = 1, 2, ..., and we say that {X_n(t), t ∈ [0,1], n ≥ 1} is ρ-mixing if ρ(n) → 0 as n → ∞. A ρ-mixing sequence is always ρ_m-mixing but, in general, a ρ_m-mixing sequence may not be ρ-mixing.

We note that we in fact require that the m-dimensional projections of the sequence {X_n(t), t ∈ [0,1], n ≥ 1}, which can be denoted by {(X_n(t_1), ..., X_n(t_m)), n ≥ 1}, satisfy the mixing conditions.
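The coefficient ρ_m(n) is a maximal correlation over all square-integrable functionals of the projected values, so it cannot be computed exactly from data; still, correlations of the projected coordinates themselves give a simple lower-bound proxy. The following Python sketch (our own illustration, not from the paper; the AR(1) driving noise, the profile sin(2πt), the evaluation points and all names are assumptions) estimates such lag-n cross-correlations for a simulated D[0,1]-valued sequence and shows them decaying in n:

    import numpy as np

    rng = np.random.default_rng(1)

    def ar1(size, phi=0.6):
        """Gaussian AR(1) noise: a standard scalar rho-mixing sequence."""
        xi = np.empty(size)
        xi[0] = rng.normal()
        for k in range(1, size):
            xi[k] = phi * xi[k - 1] + rng.normal()
        return xi

    # X_k(t) = sin(2*pi*t) * xi_k: all dependence is carried by the scalar factor xi_k
    t_points = np.array([0.1, 0.4, 0.8])                # points t_1, ..., t_m defining Pi_m
    xi = ar1(50_000)
    proj = np.outer(xi, np.sin(2 * np.pi * t_points))   # row k is Pi_m X_k = (X_k(t_1), ..., X_k(t_m))

    for lag in (1, 2, 4, 8, 16):
        blocks = np.hstack([proj[:-lag], proj[lag:]])   # coordinates of Pi_m X_k and Pi_m X_{k+lag}
        corr = np.corrcoef(blocks.T)[:3, 3:]            # cross-correlations between the two groups
        print(lag, np.max(np.abs(corr)))                # lower-bound proxy for rho_m(lag)

For this example the printed values decay geometrically (roughly like 0.6^lag), which is consistent with the summability condition on ρ_m(2^k) appearing in the theorems below.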

2. Main results

The main aim of this work is to prove the following theorems.

Theorem 2.1. Let {X_n(t), t ∈ [0,1], n ≥ 1} be a sequence of random variables with values in D[0,1]. Suppose that there exists a nondecreasing continuous function H(t) on [0,1] such that for all 0 ≤ s ≤ u ≤ 1, n ≥ 1 and some ε > 0 the following conditions hold:

$$EX_k(t)=0,\quad E|X_k(t)|^2 < A \ \text{ for some } A>0,\quad t\in[0,1],\ k=1,2,\dots,$$

$$\frac{1}{n}\sum_{k=1}^{n}E\bigl(X_k(u)-X_k(s)\bigr)^2 \le \bigl(H(u)-H(s)\bigr)\log^{-(3+\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr),$$

$$\sum_{k=1}^{\infty}\rho_m(2^k) < \infty, \quad m=1,2,\dots$$

Then {X_n(t), t ∈ [0,1], n ≥ 1} satisfies the law of large numbers in D[0,1].

Theorem 2.2. Let {X_n(t), t ∈ [0,1], n ≥ 1} be a sequence of random variables with values in D[0,1]. Suppose that there exists a nondecreasing continuous function H(t) on [0,1] such that for all 0 ≤ s ≤ u ≤ 1, n ≥ 1 and some ε > 0 the following conditions hold:

$$EX_k(t)=0,\quad E|X_k(t)|^{2+\varepsilon} < A \ \text{ for some } A>0,\quad t\in[0,1],\ k=1,2,\dots,$$

$$n^{-\frac{2+\varepsilon}{2}}\max_{1\le k\le n} E|X_k(u)-X_k(s)|^{2+\varepsilon} \le \bigl(H(u)-H(s)\bigr)\log^{-(3+2\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr),$$

$$\sum_{k=1}^{\infty}\rho_m(2^k) < \infty, \quad m=1,2,\dots$$

Then {X_n(t), t ∈ [0,1], n ≥ 1} satisfies the law of large numbers in D[0,1].

The following corollaries are immediate consequences of Theorems 2.1, 2.2.

Corollary 2.1. Let {X_n(t), t ∈ [0,1], n ≥ 1} be a sequence of random variables with values in D[0,1]. Suppose that there exists a nondecreasing continuous function H(t) on [0,1] such that for all 0 ≤ s ≤ u ≤ 1 and some ε > 0 the following conditions hold:

$$EX_k(t)=0,\quad E|X_k(t)|^{2} < A \ \text{ for some } A>0,\quad t\in[0,1],\ k=1,2,\dots,$$

$$E\bigl(X_k(u)-X_k(s)\bigr)^2 \le \bigl(H(u)-H(s)\bigr)\log^{-(3+\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr),$$

$$\sum_{k=1}^{\infty}\rho_m(2^k) < \infty, \quad m=1,2,\dots$$

Then {X_n(t), t ∈ [0,1], n ≥ 1} satisfies the law of large numbers in D[0,1].

Corollary 2.2. Let {X_n(t), t ∈ [0,1], n ≥ 1} be a sequence of random variables with values in D[0,1]. Suppose that there exists a nondecreasing continuous function H(t) on [0,1] such that for all 0 ≤ s ≤ u ≤ 1 and some ε > 0 the following conditions hold:

$$EX_k(t)=0,\quad E|X_k(t)|^{2+\varepsilon} < A \ \text{ for some } A>0,\quad t\in[0,1],\ k=1,2,\dots,$$

$$E|X_k(u)-X_k(s)|^{2+\varepsilon} \le \bigl(H(u)-H(s)\bigr)\log^{-(3+2\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr),$$

$$\sum_{k=1}^{\infty}\rho_m(2^k) < \infty, \quad m=1,2,\dots$$

Then {X_n(t), t ∈ [0,1], n ≥ 1} satisfies the law of large numbers in D[0,1].
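As a sanity check of the kind of situation Corollary 2.1 covers, consider the toy example X_k(t) = sin(2πt)·ξ_k with a Gaussian AR(1) sequence ξ_k (our own example, not taken from the paper). The second moments are bounded, the increments satisfy E(X_k(u) − X_k(s))² ≤ C(u − s)², which is small enough for a suitable choice of H, and the mixing coefficients of a Gaussian AR(1) process decay geometrically, so the summability condition is plausible for this model. A short Python sketch then shows the sup-norm of the average tending to zero:

    import numpy as np

    rng = np.random.default_rng(2)
    grid = np.linspace(0.0, 1.0, 501)
    profile = np.sin(2 * np.pi * grid)            # fixed smooth profile carrying the t-dependence

    def ar1(size, phi=0.5):
        """Gaussian AR(1) sequence xi_k = phi * xi_{k-1} + innovation."""
        xi = np.empty(size)
        xi[0] = rng.normal()
        for k in range(1, size):
            xi[k] = phi * xi[k - 1] + rng.normal()
        return xi

    for n in (100, 1000, 10000, 100000):
        xi = ar1(n)
        # S_n(t) = n^{-1} sum_k X_k(t) with X_k(t) = profile(t) * xi_k,
        # so sup_t |S_n(t)| = max_t |profile(t)| * |mean of xi|
        print(n, np.abs(profile).max() * abs(xi.mean()))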

3. Proof of results

We are going to prove the weak convergence S_n(t) ⇒ 0 as n → ∞, where

$$S_n(t) = \frac{1}{n}\sum_{k=1}^{n}X_k(t).$$

This will imply the convergence in probability, since in the limit we have a degenerate distribution. We employ the approaches from [14]-[15] used there in the proof of the central limit theorem.

First let us prove that the family of distributions P_{S_n} is tight. The proof is based on the following lemmata.

Lemma 3.1. [14]. Let X_1(t), X_2(t), ..., X_n(t), ... be random variables with values in D[0,1]. Assume that there exist a nondecreasing continuous function H on [0,1], a constant C > 0 and positive numbers θ, ε such that for all λ > 0 and 0 ≤ s ≤ t ≤ u ≤ 1

$$P\bigl(|X_n(t)-X_n(s)| \wedge |X_n(u)-X_n(t)| > \lambda\bigr) \le C\lambda^{-\theta}\, g_{\theta+1+\varepsilon}\bigl(H(u)-H(s)\bigr),$$

where g_p(u) = u|log u|^{-p}, p > 0. Then the family of probability measures P_{X_n} is tight.

Lemma 3.2. [13]. Let {X_i, i ≥ 1} be a ρ-mixing sequence of real-valued random variables such that for some q ≥ 2

$$EX_i = 0, \quad E|X_i|^q < \infty, \quad \sum_{k=1}^{\infty}\rho^{2/q}(2^k) < \infty.$$

Then there exists a constant K such that the inequality

$$E|X_1 + \dots + X_n|^q \le K\Bigl(n^{q/2}\max_{1\le i\le n}\bigl(EX_i^2\bigr)^{q/2} + n\max_{1\le i\le n}E|X_i|^q\Bigr)$$

holds true.
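Numerically, the content of Lemma 3.2 is that for a ρ-mixing sequence the q-th absolute moment of the partial sums grows at most like n^(q/2), up to the individual moment factors. Below is a quick Monte Carlo check of that growth rate for a Gaussian AR(1) sequence with q = 4 (our own illustration; the model, sample sizes and names are assumptions, and no attempt is made to estimate the constant K):

    import numpy as np

    rng = np.random.default_rng(3)
    q, phi, reps = 4, 0.5, 2000

    def moment_of_partial_sum(n):
        """Monte Carlo estimate of E|X_1 + ... + X_n|^q for a centred Gaussian AR(1) sequence."""
        innov = rng.normal(size=(reps, n))
        x = np.empty_like(innov)
        x[:, 0] = innov[:, 0]
        for k in range(1, n):
            x[:, k] = phi * x[:, k - 1] + innov[:, k]
        s = x.sum(axis=1)
        return np.mean(np.abs(s) ** q)

    for n in (100, 200, 400, 800):
        m = moment_of_partial_sum(n)
        print(n, m, m / n ** (q / 2))     # the last column should stay roughly constant

The roughly constant ratio in the last column is the n^(q/2) growth that the lemma converts into the increment bounds used in the proofs below.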

Proof of Theorem 2.1. The proof is based on Lemma 3.1. It is sufficient to show that

$$P\bigl(|S_n(t)-S_n(s)| \wedge |S_n(u)-S_n(t)| > \lambda\bigr) \le P\bigl(|S_n(t)-S_n(s)|^2 \wedge |S_n(u)-S_n(t)|^2 > \lambda^2\bigr)$$
$$\le P\bigl(|S_n(t)-S_n(s)|\,|S_n(u)-S_n(t)| > \lambda^2\bigr) \le C\lambda^{-2} g_{3+\varepsilon}\bigl(H(u)-H(s)\bigr),$$

where λ ∈ (0,1], 0 ≤ s ≤ t ≤ u ≤ 1, ε > 0.

Note that we can assume H(u) − H(s) ≤ 1/4. In what follows, by C we denote various constants, possibly depending on different parameters, that may be different even in the same chain of inequalities. We have

$$J = |S_n(t)-S_n(s)|\,|S_n(u)-S_n(t)| = \left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right| \left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(u)-X_k(t)\bigr)\right|$$
$$\le C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^2 + C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(u)-X_k(t)\bigr)\right|^2 = J_1 + J_2.$$

In the last line, we have used the following notation:

$$J_1 = C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^2, \qquad J_2 = C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(u)-X_k(t)\bigr)\right|^2.$$

We have

$$P(J > \lambda^2) \le P\Bigl(J_1 > \frac{1}{2}\lambda^2\Bigr) + P\Bigl(J_2 > \frac{1}{2}\lambda^2\Bigr).$$

Let us estimate each of these terms separately. Using the Markov inequality and Lemma 3.2, we get

$$P\Bigl(J_1 > \frac{1}{2}\lambda^2\Bigr) = P\left(\frac{C}{n^2}\left|\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^2 > \frac{1}{2}\lambda^2\right) \le 2C\lambda^{-2}\,\frac{1}{n^2}\,E\left|\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^2$$
$$\le 2C\lambda^{-2}\,\frac{1}{n^2}\sum_{k=1}^{n}E|X_k(t)-X_k(s)|^2 \le 2C\lambda^{-2}\bigl(H(t)-H(s)\bigr)\log^{-(3+\varepsilon)}\Bigl(1+\bigl(H(t)-H(s)\bigr)^{-1}\Bigr).$$

In the same way for J_2 we get

$$P\Bigl(J_2 > \frac{1}{2}\lambda^2\Bigr) = P\left(\frac{C}{n^2}\left|\sum_{k=1}^{n}\bigl(X_k(u)-X_k(t)\bigr)\right|^2 > \frac{1}{2}\lambda^2\right) \le 2C\lambda^{-2}\bigl(H(u)-H(t)\bigr)\log^{-(3+\varepsilon)}\Bigl(1+\bigl(H(u)-H(t)\bigr)^{-1}\Bigr).$$

Hence,

$$P(J > \lambda^2) \le 4C\lambda^{-2}\bigl(H(u)-H(s)\bigr)\log^{-(3+\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr).$$

By the assumptions of Theorem 2.1 and by the inequality

$$\log^{-1}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr) \le 2\bigl|\log\bigl(H(u)-H(s)\bigr)\bigr|^{-1}$$

for H(u) − H(s) ≤ 1/4, we have

$$P\bigl(|S_n(t)-S_n(s)| \wedge |S_n(u)-S_n(t)| > \lambda\bigr) \le 4C\lambda^{-2}\bigl(H(u)-H(s)\bigr)\log^{-(3+\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr)$$
$$\le 4C\lambda^{-2} g_{3+\varepsilon}\bigl(H(u)-H(s)\bigr).$$

In order to complete the proof of the theorem, it remains to prove the convergence of the finite-dimensional distributions of S_n(t). In view of the Cramér-Wold theorem [16], to establish this, it is sufficient to prove the law of large numbers for

$$y_n(t_1,\dots,t_k) = \sum_{i=1}^{k} p_i X_n(t_i)$$

for each p_i ∈ R, i = 1, 2, ..., and t_1, ..., t_k ∈ [0,1].

The sequence {y_n(t_1,...,t_k), n ≥ 1} satisfies the ρ-mixing condition, and by the Chebyshev inequality, the assumptions of Theorem 2.1 and Lemma 3.2 we have

$$P\left(\left|\frac{1}{n}\sum_{i=1}^{n} y_i(t_1,\dots,t_k)\right| > \varepsilon\right) \le \frac{1}{n^2\varepsilon^2}\, E\left|\sum_{i=1}^{n} y_i(t_1,\dots,t_k)\right|^2 \le \frac{C}{n\varepsilon^2} \to 0$$

as n → ∞, for all t_1, ..., t_k ∈ [0,1], ε > 0, k = 1, 2, ... with some constant C. The proof is complete. □

Proof of Theorem 2.2. We follow the lines of the previous proof. It follows from Lemma 3.1 that it is sufficient to prove

$$P\bigl(|S_n(t)-S_n(s)| \wedge |S_n(u)-S_n(t)| > \lambda\bigr) \le P\Bigl(|S_n(t)-S_n(s)|^{\frac{2+\varepsilon}{2}}\,|S_n(u)-S_n(t)|^{\frac{2+\varepsilon}{2}} \ge \lambda^{2+\varepsilon}\Bigr) \le C\lambda^{-(2+\varepsilon)} g_{3+2\varepsilon}\bigl(H(u)-H(s)\bigr),$$

where λ ∈ (0,1], 0 ≤ s ≤ t ≤ u ≤ 1, ε > 0.

We note that we can assume that H(u) − H(s) ≤ 1/4. We have:

$$J = |S_n(t)-S_n(s)|^{\frac{2+\varepsilon}{2}}\,|S_n(u)-S_n(t)|^{\frac{2+\varepsilon}{2}} \le C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^{2+\varepsilon} + C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(u)-X_k(t)\bigr)\right|^{2+\varepsilon} = J_1 + J_2.$$

Here we have used the following notation:

$$J_1 = C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^{2+\varepsilon}, \qquad J_2 = C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(u)-X_k(t)\bigr)\right|^{2+\varepsilon}.$$

We get:

$$P\bigl(J \ge \lambda^{2+\varepsilon}\bigr) \le P\Bigl(J_1 > \frac{1}{2}\lambda^{2+\varepsilon}\Bigr) + P\Bigl(J_2 > \frac{1}{2}\lambda^{2+\varepsilon}\Bigr).$$

We shall estimate each of these terms separately. Using the Markov inequality and Lemma 3.2, we obtain

$$P\Bigl(J_1 \ge \frac{1}{2}\lambda^{2+\varepsilon}\Bigr) = P\left(C\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^{2+\varepsilon} \ge \frac{1}{2}\lambda^{2+\varepsilon}\right) \le 2C\lambda^{-(2+\varepsilon)}\, E\left|\frac{1}{n}\sum_{k=1}^{n}\bigl(X_k(t)-X_k(s)\bigr)\right|^{2+\varepsilon}$$
$$\le 4C\lambda^{-(2+\varepsilon)}\bigl(H(t)-H(s)\bigr)\log^{-(3+2\varepsilon)}\Bigl(1+\bigl(H(t)-H(s)\bigr)^{-1}\Bigr).$$

In the same way, for J_2 we get

$$P\Bigl(J_2 > \frac{1}{2}\lambda^{2+\varepsilon}\Bigr) \le 4C\lambda^{-(2+\varepsilon)}\bigl(H(u)-H(t)\bigr)\log^{-(3+2\varepsilon)}\Bigl(1+\bigl(H(u)-H(t)\bigr)^{-1}\Bigr),$$

and hence

$$P\bigl(J > \lambda^{2+\varepsilon}\bigr) \le 8C\lambda^{-(2+\varepsilon)}\bigl(H(u)-H(s)\bigr)\log^{-(3+2\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr).$$

By the assumptions of Theorem 2.2 and by the inequality

$$\log^{-1}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr) \le 2\bigl|\log\bigl(H(u)-H(s)\bigr)\bigr|^{-1}$$

for H(u) − H(s) ≤ 1/4, as in the proof of the previous theorem, we find:

$$P\Bigl(|S_n(t)-S_n(s)|^{\frac{2+\varepsilon}{2}}\,|S_n(u)-S_n(t)|^{\frac{2+\varepsilon}{2}} \ge \lambda^{2+\varepsilon}\Bigr) \le 8C\lambda^{-(2+\varepsilon)}\bigl(H(u)-H(s)\bigr)\log^{-(3+2\varepsilon)}\Bigl(1+\bigl(H(u)-H(s)\bigr)^{-1}\Bigr)$$
$$\le C\lambda^{-(2+\varepsilon)} g_{3+2\varepsilon}\bigl(H(u)-H(s)\bigr).$$

This proves the required tightness. The convergence of the finite-dimensional distributions of S_n(t) follows from the Cramér-Wold theorem [16] as in the proof of Theorem 2.1. The proof is complete. □

Acknowledgments

The authors thank an anonymous referee for a valuable report.

REFERENCES

1. D. Li, Y. Qi and A. Rosalsky. A refinement of the Kolmogorov-Marcinkiewicz-Zygmund strong law of large numbers // J. Theor. Prob. 24:4, 1130-1156 (2011).
2. A. De Acosta. Inequalities for B-valued random vectors with applications to the law of large numbers // Ann. Prob. 9:1, 157-161 (1981).
3. F. Hechner and B. Heinkel. The Marcinkiewicz-Zygmund LLN in Banach spaces. A generalized martingale approach // J. Theor. Prob. 23:2, 509-522 (2010).
4. J. Hoffmann-Jorgensen and G. Pisier. The law of large numbers and the central limit theorem in Banach spaces // Ann. Prob. 4:4, 587-599 (1976).
5. M. Ledoux and M. Talagrand. Probability in Banach spaces: Isoperimetry and processes. Springer, Berlin (1991).
6. G. Pisier. Probabilistic methods in the geometry of Banach spaces // In: "Probability and Analysis". Lecture Notes in Math. 1206, Springer, Berlin, 167-241 (1986).
7. J. Rosinski. Remarks on Banach spaces of stable type // Prob. Math. Stat. 1:1, 67-71 (1980).
8. R.L. Taylor. Stochastic convergence of weighted sums of random elements in linear spaces. Springer, Berlin (1978).
9. R. Ranga Rao. The law of large numbers for D[0,1]-valued random variables // Theor. Prob. Appl. 8:1, 70-74 (1963).
10. P.Z. Daffer and R.L. Taylor. Laws of large numbers for D[0,1] // Ann. Prob. 7:1, 85-95 (1979).
11. P. Bezandry. Laws of large numbers for D[0,1] // Stat. Prob. Lett. 76:10, 981-985 (2006).
12. J. Kuelbs and J. Zinn. Some stability results for vector valued random variables // Ann. Prob. 7:1, 75-84 (1979).
13. Q. Shao. Maximal inequalities for partial sums of ρ-mixing sequences // Ann. Prob. 23:2, 948-965 (1995).
14. M. Bloznelis and V. Paulauskas. On the central limit theorem in the space D[0,1] // Stat. Prob. Lett. 17:2, 105-111 (1993).
15. V. Paulauskas and Ch. Stieve. On the central limit theorem in D[0,1] and D([0,1],H) // Liet. Matem. Rink. 30, 567-579 (1990).
16. P. Billingsley. Convergence of probability measures. Wiley, New York (1968).

Olimjon Shukurovich Sharipov,
National University of Uzbekistan named after Mirzo Ulugbek,
100174 University street 4,
Almazar district, Tashkent, Uzbekistan
E-mail: osharipov@yahoo.com

Anvar Fayzullayevich Norjigitov,
V.I. Romanovskiy Institute of Mathematics,
Academy of Sciences of Uzbekistan,
100174 University street 4b,
Almazar district, Tashkent, Uzbekistan
E-mail: anvar2383@mail.ru
