
Valentin V. Iliev, RT&A, No 2 (78), Volume 19, June 2024

ON AN INTERNAL DEPENDENCE OF SIMULTANEOUS MEASUREMENTS

Valentin Vankov Iliev

Institute of Mathematics and Informatics, Bulgarian Academy of Sciences, Sofia, Bulgaria
viliev@math.bas.bg

Abstract

In this paper we show that there exists an internal dependence between the simultaneous measurements made by the two pairs of linear polarizers operated in each leg of the apparatus in Aspect's version of the Einstein-Podolsky-Rosen Gedankenexperiment. The corresponding Shannon-Kolmogorov information flow linking a polarizer from one leg to a polarizer from the other leg is proportional to the absolute value of this function of dependence. It turns out that if Bell's inequality is violated, then this information flow is strictly positive, that is, the experiment performed at one leg is informationally dependent on the experiment at the other leg. By dropping the absolute value, we define the signed information flow linking a polarizer from one leg to a polarizer from the other leg, which, in turn, reproduces the probabilities of the four outcomes of the simultaneous measurements predicted by quantum mechanics. We make an attempt to illustrate the seemingly random relation between the total information flow, the total signed information flow, and the violation of Bell's inequality in terms of a kind of uncertainty principle.

Keywords: EPR thought experiment, Aspect's optical version, Informational dependence, Bell's inequality.

1. Introduction, Notation

1.1. Introduction

In the context of the bipartite quantum system that describes Aspect's optical version of the Einstein-Podolsky-Rosen Gedankenexperiment (see [1] and [5]), we consider the pairs of linear polarizers operated in each leg of the apparatus as pairs of self-adjoint linear operators

$$A_{\mu_i} = \begin{pmatrix} \cos\mu_i & \sin\mu_i \\ \sin\mu_i & -\cos\mu_i \end{pmatrix}, \qquad B_{\nu_j} = A_{\nu_j},$$

where $\mu_i, \nu_j \in [0, \pi]$, $i, j = 1, 2$, are the angles of the polarizers. Note that each pair has a time switch which interchanges the polarizers, the switching time being shorter than the time necessary for a light signal to travel from one pair of polarizers to the other (Einstein's locality assumption for independence).

Each pair of operators $A_{\mu_1}, A_{\mu_2}$ and $B_{\nu_1}, B_{\nu_2}$ acts on the state space of the corresponding quantum subsystem (a unitary plane). By tensoring with the identity operator on the other plane, we obtain two pairs of self-adjoint linear operators $A_{\mu_1}, A_{\mu_2}$ and $B_{\nu_1}, B_{\nu_2}$ with spectrum $\{1, -1\}$ on the state space of the whole quantum system (the tensor product of the two unitary planes). Moreover, for each $i, j = 1, 2$ the operators $A_{\mu_i}$ and $B_{\nu_j}$ commute because the state space of the whole quantum system has an orthonormal frame consisting of eigenvectors of both operators. In this case the corresponding measurements are said to be simultaneous.

In accord with the axiom of quantum mechanics about observables, after fixing an initial state we can consider the members of this frame as outcomes of a sample space whose probability assignment consists of the probabilities predicted by this axiom. Moreover, with an abuse of language, we can also consider the operators as random variables with range $\{1, -1\}$ on this sample space. Under the condition that the singlet state is initial, any one of these random variables has probability distribution $(\tfrac{1}{2}, \tfrac{1}{2})$. Moreover, if $\mu \in \{\mu_1, \mu_2\}$ and $\nu \in \{\nu_1, \nu_2\}$, then

$$\operatorname{pr}((A_\mu = 1) \cap (B_\nu = 1)) = \operatorname{pr}((A_\mu = -1) \cap (B_\nu = -1)) = \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right),$$
$$\operatorname{pr}((A_\mu = 1) \cap (B_\nu = -1)) = \operatorname{pr}((A_\mu = -1) \cap (B_\nu = 1)) = \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right).$$

Therefore, the product of random variables $A_\mu B_\nu$ has probability distribution $\left(\sin^2\left(\tfrac{\mu-\nu}{2}\right), \cos^2\left(\tfrac{\mu-\nu}{2}\right)\right)$ and expected value $E(A_\mu B_\nu) = -\cos(\mu - \nu)$.
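These formulas are easy to check numerically. The following sketch (Python with NumPy; the helper name `joint_probs` is ours, not the paper's) tabulates the four joint probabilities and verifies the expected value:

```python
import numpy as np

def joint_probs(mu, nu):
    """Joint probabilities of (A_mu, B_nu) in the singlet state.

    Rows index A_mu = +1, -1; columns index B_nu = +1, -1.
    """
    d = (mu - nu) / 2.0
    s = 0.5 * np.sin(d) ** 2  # equal outcomes
    c = 0.5 * np.cos(d) ** 2  # opposite outcomes
    return np.array([[s, c], [c, s]])

mu, nu = np.pi / 3, np.pi / 4
P = joint_probs(mu, nu)
assert abs(P.sum() - 1.0) < 1e-12  # a genuine probability distribution
# expected value of the product A_mu * B_nu over the four outcomes
values = np.array([[1.0, -1.0], [-1.0, 1.0]])
assert abs((P * values).sum() + np.cos(mu - nu)) < 1e-12
```

The signs of `values` follow from the fact that the product equals $+1$ on the two "equal" outcomes and $-1$ on the two "opposite" ones.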

On the other hand, the joint experiment (see [7, Part I, Section 6]) of the binary trials $A_\mu = (A_\mu = 1) \cup (A_\mu = -1)$ and $B_\nu = (B_\nu = 1) \cup (B_\nu = -1)$ produces the probability distribution
$$\left(\tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right),\ \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right),\ \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right),\ \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right)\right)$$

with Boltzmann-Shannon entropy $E(\theta_{\mu,\nu})$, where $E(\theta) = -2\theta\ln\theta - 2\left(\tfrac{1}{2}-\theta\right)\ln\left(\tfrac{1}{2}-\theta\right)$ and $\theta_{\mu,\nu} = \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right)$. We extend the function $E(\theta)$, $\theta \in (0, \tfrac{1}{2})$, (see [6, 4.1, 5.1]) by continuity to the closed interval $[0, \tfrac{1}{2}]$.

By modifying the entropy function $E(\theta)$, we obtain the strictly increasing degree of dependence function $\varepsilon\colon [0, \tfrac{1}{2}] \to [-1, 1]$, which mimics the regression coefficient (see [6, 5.2]).

It turns out that the average quantity of information $I(A_\mu, B_\nu)$ (see [3, §1]) of one of the experiments $A_\mu$ and $B_\nu$ relative to the other can be found by the formula $I(A_\mu, B_\nu) = |\varepsilon(\theta_{\mu,\nu})| \ln 2$. We can consider $I(A_\mu, B_\nu)$ as a measure of the flow carrying information between these two binary trials (see [6, 5.3]). Since $\varepsilon$ is an invertible function, the corresponding signed information flow $I^{(s)}(A_\mu, B_\nu)(\theta) = \varepsilon(\theta)\ln 2$ replicates the probability distribution (2) produced by quantum mechanics.

In terms of Aspect's experiment, the sum $I(A, B) = \sum_{i,j=1}^{2} I(A_{\mu_i}, B_{\nu_j})$ (called the total information flow) can be thought of as a measure of the flow carrying information between the two pairs of polarizers. In his paper [2] John Bell deduced, under the assumptions of "locality" and "realism", that if measurements are performed independently (Einstein's locality assumption for independence) on the two separated particles (photons in Aspect's experiment) of an entangled pair, then the assumption that the outcomes depend upon "hidden variables" implies a constraint called Bell's inequality (see Subsection 4.1). It turns out that if Bell's inequality is violated, then the total information flow is strictly positive. In other words, in this case there exists an informational dependence between the two legs of the apparatus.

At the end of the paper we discuss the relation between the information flow $I(A, B)$ and the violation of Bell's inequality. Using Examples 1 and the Java program from the link given there, we note that this relation is subject to a kind of uncertainty principle.

1.2. Notation

$H$: 2-dimensional unitary space with inner product $\langle x|y\rangle$ which is linear in the second slot and anti-linear in the first slot;

$I = I_H$: the identity linear operator on $H$;

$H^{\otimes 2} = H \otimes H$: the unitary tensor square with inner product $\langle x_1 \otimes x_2 | y_1 \otimes y_2\rangle = \langle x_1|y_1\rangle\langle x_2|y_2\rangle$;

$\mathbb{S}^{\otimes 2}$: the unit sphere in $H^{\otimes 2}$;

$\operatorname{Spec}(A)$: the real spectrum of a self-adjoint linear operator $A$ on $H$ with trace zero, having the form $\operatorname{Spec}(A) = \{\lambda_1^{(A)}, \lambda_2^{(A)}\}$, $\lambda_1^{(A)} + \lambda_2^{(A)} = 0$;

$u^{(A)} = \{u_1^{(A)}, u_2^{(A)}\}$: the orthonormal frame for $H$ formed by the corresponding eigenvectors of $A$;

$H_i^{(A)}$: the eigenspaces $\mathbb{C}u_i^{(A)}$ of $A$, $i = 1, 2$.

2. Self-Adjoint Operators on H

2.1. Two Special Commuting Operators

We fix an orthonormal frame $h = \{h_1, h_2\}$ for $H$ and identify the self-adjoint operators with their matrices with respect to $h$. For any $\mu \in [0, \pi]$ we denote by $A_\mu$ the self-adjoint operator
$$A_\mu = \begin{pmatrix} \cos\mu & \sin\mu \\ \sin\mu & -\cos\mu \end{pmatrix}.$$

We have $\lambda_1^{(A_\mu)} = 1$, $\lambda_2^{(A_\mu)} = -1$, and
$$u_1^{(A_\mu)} = \left(\cos\tfrac{\mu}{2}\right)h_1 + \left(\sin\tfrac{\mu}{2}\right)h_2, \qquad u_2^{(A_\mu)} = \left(-\sin\tfrac{\mu}{2}\right)h_1 + \left(\cos\tfrac{\mu}{2}\right)h_2.$$

For any $\nu \in [0, \pi]$ we set $B_\nu = A_\nu$.

Note that $\{h_1 \otimes h_1, h_1 \otimes h_2, h_2 \otimes h_1, h_2 \otimes h_2\}$ and $u^{(A_\mu)} \otimes u^{(B_\nu)} = \{u_1^{(A_\mu)} \otimes u_1^{(B_\nu)}, u_1^{(A_\mu)} \otimes u_2^{(B_\nu)}, u_2^{(A_\mu)} \otimes u_1^{(B_\nu)}, u_2^{(A_\mu)} \otimes u_2^{(B_\nu)}\}$ are orthonormal frames for $H^{\otimes 2}$.

Let us set $A_\mu = A_\mu \otimes I$, $B_\nu = I \otimes B_\nu$ (we keep the same letters for the tensored operators). It is a straightforward check that the last two linear operators on $H^{\otimes 2}$ are also self-adjoint with $\lambda_1^{(A_\mu)} = \lambda_1^{(B_\nu)} = 1$, $\lambda_2^{(A_\mu)} = \lambda_2^{(B_\nu)} = -1$; the $\lambda_i^{(A_\mu)}$-eigenspace $H_i^{(A_\mu)} \otimes H$ has orthonormal frame $\{u_i^{(A_\mu)} \otimes u_1^{(B_\nu)}, u_i^{(A_\mu)} \otimes u_2^{(B_\nu)}\}$, and the $\lambda_i^{(B_\nu)}$-eigenspace $H \otimes H_i^{(B_\nu)}$ has orthonormal frame $\{u_1^{(A_\mu)} \otimes u_i^{(B_\nu)}, u_2^{(A_\mu)} \otimes u_i^{(B_\nu)}\}$, $i = 1, 2$. Since $u^{(A_\mu)} \otimes u^{(B_\nu)}$ is an orthonormal frame of $H^{\otimes 2}$ consisting of eigenvectors of both $A_\mu$ and $B_\nu$, the last two operators commute.

Let $\psi \in \mathbb{S}^{\otimes 2}$ and let $S(\psi; A_\mu, B_\nu)$ be the sample space with set of outcomes $u^{(A_\mu)} \otimes u^{(B_\nu)} = \{u_i^{(A_\mu)} \otimes u_j^{(B_\nu)} \mid i, j = 1, 2\}$ and probability assignment $\{p_{11}, p_{12}, p_{21}, p_{22}\}$ with $p_{ij} = |\langle u_i^{(A_\mu)} \otimes u_j^{(B_\nu)} | \psi\rangle|^2$, $i, j = 1, 2$. With an abuse of language, we consider the observable $A_\mu$ as a random variable $A_\mu\colon u^{(A_\mu)} \otimes u^{(B_\nu)} \to \mathbb{R}$, $A_\mu(u_i^{(A_\mu)} \otimes u_j^{(B_\nu)}) = \lambda_i^{(A_\mu)}$, $i, j = 1, 2$, on the sample space $S(\psi; A_\mu, B_\nu)$ with probability distribution $p_{A_\mu}(\lambda_i^{(A_\mu)}) = |\langle u_i^{(A_\mu)} \otimes u_1^{(B_\nu)} | \psi\rangle|^2 + |\langle u_i^{(A_\mu)} \otimes u_2^{(B_\nu)} | \psi\rangle|^2$, $i = 1, 2$, and $p_{A_\mu}(\lambda) = 0$ for $\lambda \notin \operatorname{Spec}(A_\mu)$. Identifying the event $\{u_i^{(A_\mu)} \otimes u_1^{(B_\nu)}, u_i^{(A_\mu)} \otimes u_2^{(B_\nu)}\}$ with the "event" $A_\mu = \lambda_i^{(A_\mu)}$, we have $\operatorname{pr}(A_\mu = \lambda_i^{(A_\mu)}) = p_{A_\mu}(\lambda_i^{(A_\mu)})$, $i = 1, 2$. We also consider the observable $B_\nu$ as a random variable $B_\nu\colon u^{(A_\mu)} \otimes u^{(B_\nu)} \to \mathbb{R}$, $B_\nu(u_j^{(A_\mu)} \otimes u_i^{(B_\nu)}) = \lambda_i^{(B_\nu)}$, $i, j = 1, 2$, with probability distribution $p_{B_\nu}(\lambda_i^{(B_\nu)}) = |\langle u_1^{(A_\mu)} \otimes u_i^{(B_\nu)} | \psi\rangle|^2 + |\langle u_2^{(A_\mu)} \otimes u_i^{(B_\nu)} | \psi\rangle|^2$, $i = 1, 2$, and $p_{B_\nu}(\lambda) = 0$ for $\lambda \notin \operatorname{Spec}(B_\nu)$. Identifying the event $\{u_1^{(A_\mu)} \otimes u_i^{(B_\nu)}, u_2^{(A_\mu)} \otimes u_i^{(B_\nu)}\}$ with the "event" $B_\nu = \lambda_i^{(B_\nu)}$, we have $\operatorname{pr}(B_\nu = \lambda_i^{(B_\nu)}) = p_{B_\nu}(\lambda_i^{(B_\nu)})$, $i = 1, 2$.

In particular, let us set $\psi = \tfrac{1}{\sqrt{2}}(h_1 \otimes h_2 - h_2 \otimes h_1)$. We have
$$\operatorname{pr}(A_\mu = \lambda_i^{(A_\mu)}) = \tfrac{1}{2}\left|\langle u_i^{(A_\mu)}|h_1\rangle\langle u_1^{(B_\nu)}|h_2\rangle - \langle u_i^{(A_\mu)}|h_2\rangle\langle u_1^{(B_\nu)}|h_1\rangle\right|^2 + \tfrac{1}{2}\left|\langle u_i^{(A_\mu)}|h_1\rangle\langle u_2^{(B_\nu)}|h_2\rangle - \langle u_i^{(A_\mu)}|h_2\rangle\langle u_2^{(B_\nu)}|h_1\rangle\right|^2$$
and
$$\operatorname{pr}(B_\nu = \lambda_i^{(B_\nu)}) = \tfrac{1}{2}\left|\langle u_1^{(A_\mu)}|h_1\rangle\langle u_i^{(B_\nu)}|h_2\rangle - \langle u_1^{(A_\mu)}|h_2\rangle\langle u_i^{(B_\nu)}|h_1\rangle\right|^2 + \tfrac{1}{2}\left|\langle u_2^{(A_\mu)}|h_1\rangle\langle u_i^{(B_\nu)}|h_2\rangle - \langle u_2^{(A_\mu)}|h_2\rangle\langle u_i^{(B_\nu)}|h_1\rangle\right|^2.$$

Taking into account the form of the eigenvectors of the matrices $A_\mu$ and $B_\nu$, we obtain
$$\operatorname{pr}(A_\mu = \lambda_i^{(A_\mu)}) = \operatorname{pr}(B_\nu = \lambda_j^{(B_\nu)}) = \tfrac{1}{2}, \qquad i, j = 1, 2.$$

We identify the intersection $(A_\mu = \lambda_i^{(A_\mu)}) \cap (B_\nu = \lambda_j^{(B_\nu)})$ with the event $\{u_i^{(A_\mu)} \otimes u_j^{(B_\nu)}\}$, $i, j = 1, 2$, in the sample space $S(\psi; A_\mu, B_\nu)$ and obtain
$$\operatorname{pr}((A_\mu = \lambda_i^{(A_\mu)}) \cap (B_\nu = \lambda_j^{(B_\nu)})) = \tfrac{1}{2}\left|\langle u_i^{(A_\mu)}|h_1\rangle\langle u_j^{(B_\nu)}|h_2\rangle - \langle u_i^{(A_\mu)}|h_2\rangle\langle u_j^{(B_\nu)}|h_1\rangle\right|^2.$$

In particular, we have
$$\operatorname{pr}((A_\mu = \lambda_1^{(A_\mu)}) \cap (B_\nu = \lambda_1^{(B_\nu)})) = \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right), \quad \operatorname{pr}((A_\mu = \lambda_1^{(A_\mu)}) \cap (B_\nu = \lambda_2^{(B_\nu)})) = \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right),$$
$$\operatorname{pr}((A_\mu = \lambda_2^{(A_\mu)}) \cap (B_\nu = \lambda_1^{(B_\nu)})) = \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right), \quad \operatorname{pr}((A_\mu = \lambda_2^{(A_\mu)}) \cap (B_\nu = \lambda_2^{(B_\nu)})) = \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right).$$

The random variable $A_\mu B_\nu$ has probability distribution
$$p_{A_\mu B_\nu}(1) = \sin^2\left(\tfrac{\mu-\nu}{2}\right), \qquad p_{A_\mu B_\nu}(-1) = \cos^2\left(\tfrac{\mu-\nu}{2}\right),$$
and $p_{A_\mu B_\nu}(\lambda) = 0$ for $\lambda \neq \pm 1$. The expected value of this random variable is $E(A_\mu B_\nu) = -\cos(\mu - \nu)$.
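These joint probabilities can also be reproduced directly from the explicit eigenvectors and the singlet state. The sketch below (Python with NumPy; the helper name `eigvecs` is ours) computes the amplitudes $|\langle u_i \otimes u_j | \psi\rangle|^2$ and compares them with the closed forms $\tfrac{1}{2}\sin^2$, $\tfrac{1}{2}\cos^2$:

```python
import numpy as np

def eigvecs(angle):
    """Eigenvectors of [[cos a, sin a], [sin a, -cos a]] in the frame {h1, h2},
    for the eigenvalues +1 and -1 (half-angle form)."""
    h = angle / 2.0
    return (np.array([np.cos(h), np.sin(h)]),    # eigenvalue +1
            np.array([-np.sin(h), np.cos(h)]))   # eigenvalue -1

mu, nu = 0.9, 2.1
# singlet state (h1⊗h2 - h2⊗h1)/sqrt(2) in the basis h1⊗h1, h1⊗h2, h2⊗h1, h2⊗h2
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
uA, uB = eigvecs(mu), eigvecs(nu)
d = (mu - nu) / 2.0
for i in range(2):
    for j in range(2):
        p = (np.kron(uA[i], uB[j]) @ psi) ** 2  # |<u_i ⊗ u_j | psi>|^2
        closed = 0.5 * np.sin(d) ** 2 if i == j else 0.5 * np.cos(d) ** 2
        assert abs(p - closed) < 1e-12
```

Here `np.kron` builds the tensor-product vector in the same basis order as `psi`, so the plain dot product gives the (real) amplitude.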

3. Entropy and Degree of Dependence

3.1. Entropy

Now, we combine the terminology and notation of this paper with those of [6]. Let us set
$$A = (A_\mu = \lambda_1^{(A_\mu)}), \quad B = (B_\nu = \lambda_1^{(B_\nu)}), \quad A^c = (A_\mu = \lambda_2^{(A_\mu)}), \quad B^c = (B_\nu = \lambda_2^{(B_\nu)}),$$
$$\alpha = \operatorname{pr}(A) = \tfrac{1}{2}, \qquad \beta = \operatorname{pr}(B) = \tfrac{1}{2}.$$

The pair $(A, B)$ of events in the sample space $S(\psi; A_\mu, B_\nu)$ produces an experiment
$$J = (A \cap B) \cup (A \cap B^c) \cup (A^c \cap B) \cup (A^c \cap B^c) \qquad (1)$$
(cf. [3, I, §5]) and the probabilities of its results:
$$\xi_1 = \operatorname{pr}(A \cap B) = \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right), \quad \xi_2 = \operatorname{pr}(A \cap B^c) = \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right),$$
$$\xi_3 = \operatorname{pr}(A^c \cap B) = \tfrac{1}{2}\cos^2\left(\tfrac{\mu-\nu}{2}\right), \quad \xi_4 = \operatorname{pr}(A^c \cap B^c) = \tfrac{1}{2}\sin^2\left(\tfrac{\mu-\nu}{2}\right). \qquad (2)$$

The probability distribution $(\xi_1, \xi_2, \xi_3, \xi_4)$ satisfies the linear system [6, 4.1.(3)] whose solutions form a straight line with parametric representation $\xi_1 = \theta$, $\xi_2 = \tfrac{1}{2} - \theta$, $\xi_3 = \tfrac{1}{2} - \theta$, $\xi_4 = \theta$ in the hyperplane $\xi_1 + \xi_2 + \xi_3 + \xi_4 = 1$. Note that the parameter $\theta = \xi_1$ runs within the closed interval $[0, \tfrac{1}{2}]$. The entropy of $(\xi_1, \xi_2, \xi_3, \xi_4)$ is $E(\theta) = -\sum_{k=1}^{4} \xi_k(\theta)\ln(\xi_k(\theta)) = -2\theta\ln\theta - 2(\tfrac{1}{2} - \theta)\ln(\tfrac{1}{2} - \theta)$, and the function $E(\theta)$ can be extended by continuity to the interval $[0, \tfrac{1}{2}]$. It strictly increases on $[0, \tfrac{1}{4}]$, strictly decreases on $[\tfrac{1}{4}, \tfrac{1}{2}]$, and has a global maximum at $\theta = \tfrac{1}{4}$. In particular, $\max_{\theta \in [0, \frac{1}{2}]} E(\theta) = E(\tfrac{1}{4}) = 2\ln 2$. Since $\min_{\theta \in [0, \frac{1}{4}]} E(\theta) = E(0) = \ln 2 = E(\tfrac{1}{2}) = \min_{\theta \in [\frac{1}{4}, \frac{1}{2}]} E(\theta)$, we obtain $\min_{\theta \in [0, \frac{1}{2}]} E(\theta) = \ln 2$.
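A short sketch (plain Python; the function name `E` follows the text, with the continuous extension $0 \cdot \ln 0 = 0$) confirms the extrema just listed:

```python
import math

def E(theta):
    """Entropy of (theta, 1/2-theta, 1/2-theta, theta) on [0, 1/2],
    with the continuous extension 0*ln(0) = 0 at the endpoints."""
    plogp = lambda p: 0.0 if p == 0.0 else p * math.log(p)
    return -2.0 * plogp(theta) - 2.0 * plogp(0.5 - theta)

assert abs(E(0.25) - 2.0 * math.log(2.0)) < 1e-12  # global maximum 2 ln 2
assert abs(E(0.0) - math.log(2.0)) < 1e-12         # minimum ln 2 at theta = 0
assert abs(E(0.5) - math.log(2.0)) < 1e-12         # ... and at theta = 1/2
# strictly increasing on [0, 1/4], strictly decreasing on [1/4, 1/2]
grid = [k / 1000.0 for k in range(0, 251)]
assert all(E(a) < E(b) for a, b in zip(grid, grid[1:]))
shifted = [0.25 + g for g in grid]
assert all(E(a) > E(b) for a, b in zip(shifted, shifted[1:]))
```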

3.2. Degree of Dependence

It is more useful to modify the entropy function, thus obtaining the strictly increasing degree of dependence function $\varepsilon\colon [0, \tfrac{1}{2}] \to [-1, 1]$,
$$\varepsilon(\theta) = \begin{cases} \dfrac{E(\theta) - E(\frac{1}{4})}{E(\frac{1}{4}) - E(0)} & \text{if } 0 \le \theta \le \tfrac{1}{4}, \\[2ex] \dfrac{E(\frac{1}{4}) - E(\theta)}{E(\frac{1}{4}) - E(\frac{1}{2})} & \text{if } \tfrac{1}{4} \le \theta \le \tfrac{1}{2}. \end{cases}$$
Taking into account the extremal values of the entropy function, we obtain
$$\varepsilon(\theta) = \begin{cases} -2 + \dfrac{E(\theta)}{\ln 2} & \text{if } 0 \le \theta \le \tfrac{1}{4}, \\[2ex] 2 - \dfrac{E(\theta)}{\ln 2} & \text{if } \tfrac{1}{4} \le \theta \le \tfrac{1}{2}. \end{cases}$$

The events $A$ and $B$ are independent exactly when the entropy is maximal (equal to $2\ln 2$), that is, when $\varepsilon(\theta) = 0$, and this, in turn, is equivalent to the equality $|\mu - \nu| = \tfrac{\pi}{2}$. We have $\varepsilon(\theta) = -1$ or $\varepsilon(\theta) = 1$ if and only if $|\mu - \nu| = 0$ or $|\mu - \nu| = \pi$, respectively, and in these two cases the entropy is minimal and equal to $\ln 2$. Now let us assume, in addition, that $A$ and $B$ are events in a sample space with equally likely outcomes. If $\varepsilon(\theta) = -1$, then one of $A$ and $B$ is a subset of the complement of the other (maximal negative dependence), and if $\varepsilon(\theta) = 1$, one of them is a subset of the other (maximal positive dependence).
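The correspondence between the angle difference and the endpoints of the degree of dependence can be checked with a small sketch (plain Python; the names `eps` and `theta` are ours, implementing the formulas of this subsection):

```python
import math

LN2 = math.log(2.0)

def E(t):
    plogp = lambda p: 0.0 if p == 0.0 else p * math.log(p)
    return -2.0 * plogp(t) - 2.0 * plogp(0.5 - t)

def eps(t):
    """Degree of dependence on [0, 1/2], as in Subsection 3.2."""
    return (-2.0 + E(t) / LN2) if t <= 0.25 else (2.0 - E(t) / LN2)

# theta_{mu,nu} = (1/2) sin^2((mu - nu)/2)
theta = lambda mu, nu: 0.5 * math.sin((mu - nu) / 2.0) ** 2

mu = 1.2
assert abs(eps(theta(mu, mu)) + 1.0) < 1e-9            # |mu - nu| = 0    -> -1
assert abs(eps(theta(mu, mu - math.pi / 2))) < 1e-6    # |mu - nu| = pi/2 ->  0
assert abs(eps(theta(mu, mu - math.pi)) - 1.0) < 1e-9  # |mu - nu| = pi   -> +1
```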

3.3. The Information Flow

The experiment $J$ from (1) is the joint experiment (see [7, Part I, Section 6]) of two simple binary trials: $A_\mu = A \cup A^c$ and $B_\nu = B \cup B^c$ with $\operatorname{pr}(A) = \operatorname{pr}(B) = \tfrac{1}{2}$. The average quantity of information of one of the experiments $A_\mu$ and $B_\nu$, relative to the other (see [3, §1]), is defined in this particular case by the formula
$$I(A_\mu, B_\nu)(\theta) = \xi_1(\theta)\ln 4\xi_1(\theta) + \xi_2(\theta)\ln 4\xi_2(\theta) + \xi_3(\theta)\ln 4\xi_3(\theta) + \xi_4(\theta)\ln 4\xi_4(\theta).$$
The above notation is correct since an interchange of $A$ and $A^c$, or of $B$ and $B^c$, causes a permutation of the $\xi_k$'s. Thus, we obtain $I(A_\mu, B_\nu)(\theta) = \max_{\tau \in [0, \frac{1}{2}]} E(\tau) - E(\theta)$. Now, the definition of the degree of dependence function $\varepsilon(\theta)$ immediately yields $I(A_\mu, B_\nu)(\theta) = |\varepsilon(\theta)| \ln 2$ for $\theta \in [0, \tfrac{1}{2}]$.

Translating into the language of information theory, we have $\varepsilon(\theta) = -1$ or $\varepsilon(\theta) = 1$ if and only if $I(A_\mu, B_\nu)(\theta) = \max_{0 \le \tau \le \frac{1}{2}} I(A_\mu, B_\nu)(\tau) = \ln 2$. Finally, we have $\varepsilon(\theta) = 0$ if and only if $I(A_\mu, B_\nu)(\theta) = 0$, and under this condition the experiments $A_\mu$ and $B_\nu$ are said to be informationally independent.
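The three expressions for the information flow (the $\xi$-sum, $2\ln 2 - E(\theta)$, and $|\varepsilon(\theta)|\ln 2$) can be compared numerically; a sketch in plain Python, with names of our choosing:

```python
import math

LN2 = math.log(2.0)
plogp = lambda p: 0.0 if p == 0.0 else p * math.log(p)
E = lambda t: -2.0 * plogp(t) - 2.0 * plogp(0.5 - t)
eps = lambda t: (-2.0 + E(t) / LN2) if t <= 0.25 else (2.0 - E(t) / LN2)

def I_flow(t):
    """sum_k xi_k ln(4 xi_k) over (t, 1/2-t, 1/2-t, t), with 0 ln 0 = 0."""
    xis = [t, 0.5 - t, 0.5 - t, t]
    return sum(x * math.log(4.0 * x) for x in xis if x > 0.0)

for t in [0.0, 0.01, 0.1, 0.25, 0.3, 0.49, 0.5]:
    assert abs(I_flow(t) - (2.0 * LN2 - E(t))) < 1e-12  # I = max E - E(theta)
    assert abs(I_flow(t) - abs(eps(t)) * LN2) < 1e-12   # I = |eps| ln 2
```

The first identity is just $\sum_k \xi_k \ln 4\xi_k = \ln 4 - E(\theta)$, since the $\xi_k$ sum to one.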

3.4. The Signed Information Flow

Let us set $I^{(s)}(A_\mu, B_\nu)(\theta) = \varepsilon(\theta)\ln 2$ for $\theta \in [0, \tfrac{1}{2}]$ and call this quantity the average quantity of signed information of one of the experiments $A_\mu$ and $B_\nu$, relative to the other. Then $I(A_\mu, B_\nu) = |I^{(s)}(A_\mu, B_\nu)|$ and, since the function $\varepsilon$ is invertible, we obtain $\theta = \varepsilon^{-1}(I^{(s)}(A_\mu, B_\nu)/\ln 2)$. In particular, the value of the signed information flow $I^{(s)}(A_\mu, B_\nu)$ reproduces the probability distribution (2) predicted by quantum theory.

4. Four Operators and Bell's Map

For any $\mu_1, \mu_2, \nu_1, \nu_2 \in [0, \pi]$ we consider the self-adjoint operators $A_{\mu_i}$, $B_{\nu_j}$, $i, j = 1, 2$; see Subsection 2.1. We extend the notation introduced in Sections 2 and 3 in a natural way: $\theta_{ij} = \tfrac{1}{2}\sin^2\left(\tfrac{\mu_i - \nu_j}{2}\right)$, $\theta_{ij} \in [0, \tfrac{1}{2}]$, $I(A_{\mu_i}, B_{\nu_j}) = |\varepsilon(\theta_{ij})| \ln 2$, $i, j = 1, 2$. The sum $I(A, B) = \sum_{i,j=1}^{2} I(A_{\mu_i}, B_{\nu_j})$ is said to be the average quantity of information of one of the pairs of experiments $A = \{A_{\mu_1}, A_{\mu_2}\}$ and $B = \{B_{\nu_1}, B_{\nu_2}\}$ relative to the other, or the total information flow. The sum $I^{(s)}(A, B) = \sum_{i,j=1}^{2} I^{(s)}(A_{\mu_i}, B_{\nu_j})$ is said to be the average quantity of signed information of one of the pairs of experiments $A$ and $B$ relative to the other, or the total signed information flow.

Thus, we obtain the functions
$$I(A, B)\colon [0, \pi]^4 \to \mathbb{R}, \qquad (\mu_1, \mu_2, \nu_1, \nu_2) \mapsto (\ln 2)\sum_{i,j=1}^{2} |\varepsilon(\theta_{ij})|,$$
and
$$I^{(s)}(A, B)\colon [0, \pi]^4 \to \mathbb{R}, \qquad (\mu_1, \mu_2, \nu_1, \nu_2) \mapsto (\ln 2)\sum_{i,j=1}^{2} \varepsilon(\theta_{ij}),$$
which represent the intensity of the information flow (respectively, the signed information flow) between the pairs of experiments $A$ and $B$. We note that $0 \le I(A, B)(\mu_1, \mu_2, \nu_1, \nu_2) \le 4\ln 2$ and $-4\ln 2 \le I^{(s)}(A, B)(\mu_1, \mu_2, \nu_1, \nu_2) \le 4\ln 2$.

In case $\mu_1 = \mu_2 = \nu_1 = \nu_2$ we have $\theta_{11} = \theta_{12} = \theta_{21} = \theta_{22} = 0$, $\varepsilon(\theta_{11}) = \varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = \varepsilon(\theta_{22}) = -1$, hence $I(A, B) = 4\ln 2$ and $I^{(s)}(A, B) = -4\ln 2$. In case $\mu_1 = \mu_2 = \tfrac{\pi}{2}$, $\nu_1 = \nu_2 = 0$ we have $\theta_{11} = \theta_{12} = \theta_{21} = \theta_{22} = \tfrac{1}{4}$, $\varepsilon(\theta_{11}) = \varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = \varepsilon(\theta_{22}) = 0$, and $I(A, B) = 0$. Finally, in case $\mu_1 = \mu_2 = \pi$, $\nu_1 = \nu_2 = 0$ we have $\theta_{11} = \theta_{12} = \theta_{21} = \theta_{22} = \tfrac{1}{2}$, $\varepsilon(\theta_{11}) = \varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = \varepsilon(\theta_{22}) = 1$, and $I^{(s)}(A, B) = 4\ln 2$.
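The three limit cases just listed can be replayed in a few lines (plain Python; `totals` is our name for the pair $(I(A,B), I^{(s)}(A,B))$):

```python
import math

LN2 = math.log(2.0)
plogp = lambda p: 0.0 if p == 0.0 else p * math.log(p)
E = lambda t: -2.0 * plogp(t) - 2.0 * plogp(0.5 - t)
eps = lambda t: (-2.0 + E(t) / LN2) if t <= 0.25 else (2.0 - E(t) / LN2)
theta = lambda m, n: 0.5 * math.sin((m - n) / 2.0) ** 2

def totals(m1, m2, n1, n2):
    """Total and total signed information flow for the four polarizer angles."""
    es = [eps(theta(m, n)) for m in (m1, m2) for n in (n1, n2)]
    return LN2 * sum(abs(e) for e in es), LN2 * sum(es)

I1, S1 = totals(1.0, 1.0, 1.0, 1.0)                 # all angles equal
assert abs(I1 - 4 * LN2) < 1e-9 and abs(S1 + 4 * LN2) < 1e-9
I2, _ = totals(math.pi / 2, math.pi / 2, 0.0, 0.0)  # all theta_ij = 1/4
assert abs(I2) < 1e-9
_, S3 = totals(math.pi, math.pi, 0.0, 0.0)          # all theta_ij = 1/2
assert abs(S3 - 4 * LN2) < 1e-9
```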

Since the image of a compact and connected set under the continuous function $I(A, B)$ (respectively, the continuous function $I^{(s)}(A, B)$) is a compact and connected subset of $\mathbb{R}$, the range of $I(A, B)$ (respectively, of $I^{(s)}(A, B)$) coincides with the interval $[0, 4\ln 2]$ (respectively, with the interval $[-4\ln 2, 4\ln 2]$).

4.1. Bell's Inequality

The equality $|A_{\mu_1}B_{\nu_1} + A_{\mu_1}B_{\nu_2} + A_{\mu_2}B_{\nu_1} - A_{\mu_2}B_{\nu_2}| = 2$ yields (with an abuse of probability theory) Bell's inequality
$$|E(A_{\mu_1}B_{\nu_1}) + E(A_{\mu_1}B_{\nu_2}) + E(A_{\mu_2}B_{\nu_1}) - E(A_{\mu_2}B_{\nu_2})| \le 2,$$
that is, $|b(\mu_1, \mu_2, \nu_1, \nu_2)| \le 2$, where $b(\mu_1, \mu_2, \nu_1, \nu_2) = \cos(\mu_1 - \nu_1) + \cos(\mu_1 - \nu_2) + \cos(\mu_2 - \nu_1) - \cos(\mu_2 - \nu_2)$.

J. S. Bell proves in [2] that if there exist "...additional variables which restore to the (quantum) theory causality and locality", then the above inequality is satisfied. Since $I(A, B) = 0$ is equivalent to the equalities $|\mu_i - \nu_j| = \tfrac{\pi}{2}$, $i, j = 1, 2$, this yields $b = 0$. Thus, we obtain that if Bell's inequality is violated, then the total information flow $I(A, B)$ is strictly positive, that is, the experiments $A$ and $B$ are informationally dependent.
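A qualitative sketch of this implication in plain Python (`b` as defined above, `I_total` our name for $I(A, B)$): for an orthogonal configuration both $b$ and the flow vanish, while for the standard CHSH angles Bell's inequality is violated and the total flow is strictly positive.

```python
import math

LN2 = math.log(2.0)
plogp = lambda p: 0.0 if p == 0.0 else p * math.log(p)
E = lambda t: -2.0 * plogp(t) - 2.0 * plogp(0.5 - t)
eps = lambda t: (-2.0 + E(t) / LN2) if t <= 0.25 else (2.0 - E(t) / LN2)
theta = lambda m, n: 0.5 * math.sin((m - n) / 2.0) ** 2

def b(m1, m2, n1, n2):
    return (math.cos(m1 - n1) + math.cos(m1 - n2)
            + math.cos(m2 - n1) - math.cos(m2 - n2))

def I_total(m1, m2, n1, n2):
    return LN2 * sum(abs(eps(theta(m, n))) for m in (m1, m2) for n in (n1, n2))

# |mu_i - nu_j| = pi/2 for all i, j: b = 0 and the flow vanishes
assert abs(b(math.pi / 2, math.pi / 2, 0.0, math.pi)) < 1e-12
assert abs(I_total(math.pi / 2, math.pi / 2, 0.0, math.pi)) < 1e-6
# CHSH angles: |b| = 2*sqrt(2) > 2, and the total flow is strictly positive
chsh = (math.pi / 2, 0.0, math.pi / 4, 3 * math.pi / 4)
assert abs(b(*chsh) - 2.0 * math.sqrt(2.0)) < 1e-12
assert I_total(*chsh) > 0.1
```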

Examples 1. Note that the results of all calculations below are rounded to the 7th digit.

1) (Aspect's experiment) $\mu_1 = \tfrac{\pi}{8}$, $\mu_2 = \tfrac{3\pi}{8}$, $\nu_1 = \tfrac{\pi}{4}$, $\nu_2 = 0$. Then we obtain $\cos(\tfrac{\pi}{8}) = 0.9238795$, $\cos(\tfrac{3\pi}{8}) = 0.3826834$, and therefore $b(\tfrac{\pi}{8}, \tfrac{3\pi}{8}, \tfrac{\pi}{4}, 0) = 2.3889551$. On the other hand, $\theta_{11} = \theta_{12} = \theta_{21} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{16}) = 0.0190301$, $\varepsilon(\theta_{11}) = \varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = -0.0415353$, $\theta_{22} = \tfrac{1}{2}\sin^2(\tfrac{3\pi}{16}) = 0.154329$, $\varepsilon(\theta_{22}) = -0.1084492$. Hence we have $I(A, B) = 0.1615415$ and $I^{(s)}(A, B) = -0.2330551$.

2) $\mu_1 = \pi$, $\mu_2 = \tfrac{2\pi}{3}$, $\nu_1 = 0$, $\nu_2 = \tfrac{\pi}{3}$. Then we have $b(\pi, \tfrac{2\pi}{3}, 0, \tfrac{\pi}{3}) = -2.5$. On the other hand, $\theta_{11} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{2}) = 0.5$, $\varepsilon(\theta_{11}) = 1$, $\theta_{12} = \theta_{21} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{3}) = \tfrac{3}{8} = 0.375$, $\varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = 0.1887219$, $\theta_{22} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{6}) = \tfrac{1}{8} = 0.125$, $\varepsilon(\theta_{22}) = -0.1887219$. Hence we obtain $I(A, B) = 1.0855833$ and $I^{(s)}(A, B) = 1.1887219$.

3) $\mu_1 = \tfrac{\pi}{2}$, $\mu_2 = 0$, $\nu_1 = \tfrac{\pi}{4}$, $\nu_2 = \tfrac{3\pi}{4}$. Then $b(\tfrac{\pi}{2}, 0, \tfrac{\pi}{4}, \tfrac{3\pi}{4}) = 2\sqrt{2}$. On the other hand, $\theta_{11} = \theta_{12} = \theta_{21} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{8}) = 0.0732233$, $\varepsilon(\theta_{11}) = \varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = -0.3994425$, $\theta_{22} = \tfrac{1}{2}\sin^2(\tfrac{3\pi}{16}) = 0.154329$, $\varepsilon(\theta_{22}) = -0.1084492$. Hence we have $I(A, B) = 0.9053727$ and $I^{(s)}(A, B) = -0.2282777$.

4) $\mu_1 = \pi$, $\mu_2 = 0$, $\nu_1 = 0$, $\nu_2 = \pi$. Then we have $b(\pi, 0, 0, \pi) = 2$. On the other hand, $\theta_{11} = \theta_{22} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{2}) = 0.5$, $\varepsilon(\theta_{11}) = \varepsilon(\theta_{22}) = 1$, $\theta_{12} = \theta_{21} = \tfrac{1}{2}\sin^2(0) = 0$, $\varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = -1$. Therefore we obtain $I(A, B) = 4\ln 2 = 2.7725887 = \max I(A, B)$ and $I^{(s)}(A, B) = 0$.

5) $\mu_1 = \tfrac{5\pi}{6}$, $\mu_2 = \tfrac{2\pi}{3}$, $\nu_1 = \tfrac{\pi}{3}$, $\nu_2 = \tfrac{\pi}{2}$. In this case we have $b(\tfrac{5\pi}{6}, \tfrac{2\pi}{3}, \tfrac{\pi}{3}, \tfrac{\pi}{2}) = 1 - \tfrac{\sqrt{3}}{2}$. On the other hand, $\theta_{11} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{4}) = \tfrac{1}{4}$, $\varepsilon(\theta_{11}) = 0$, $\theta_{12} = \theta_{21} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{6}) = \tfrac{1}{8}$, $\varepsilon(\theta_{12}) = \varepsilon(\theta_{21}) = -0.1887219$, $\theta_{22} = \tfrac{1}{2}\sin^2(\tfrac{\pi}{12}) = 0.0334936$, $\varepsilon(\theta_{22}) = -0.6453728$. Hence we have $I(A, B) = 0.7089624$ and $I^{(s)}(A, B) = -0.267929$.

6) The link http://www.math.bas.bg/algebra/valentiniliev/ contains an experimental Java implementation "dependencemeasurements2" depending on five parameters: a non-negative integer $n$ and four real numbers $\mu_1, \mu_2, \nu_1, \nu_2$ from the closed interval $[0, \pi]$. One can also find the description of this program at the above link.

Examples 1 and, especially, example 6) show that the relations between $I(A, B)$ and $b$, and between $I^{(s)}(A, B)$ and $b$, seem to be random. Below we present an attempt to explain the uncertainty of this relation by referring to [6, 5.4, 5.5]. We define the events
$$U = \{(\mu_1, \mu_2, \nu_1, \nu_2) \in [0, \pi]^4 \mid |b(\mu_1, \mu_2, \nu_1, \nu_2)| \le 2\},$$
$$V = \{(\mu_1, \mu_2, \nu_1, \nu_2) \in [0, \pi]^4 \mid I(A, B)(\mu_1, \mu_2, \nu_1, \nu_2) \in [0, 2\ln 2]\},$$
$$VS = \{(\mu_1, \mu_2, \nu_1, \nu_2) \in [0, \pi]^4 \mid I^{(s)}(A, B)(\mu_1, \mu_2, \nu_1, \nu_2) \in [0, 4\ln 2]\},$$
with complements $U^c$, $V^c$, and $VS^c$ in $[0, \pi]^4$. We suppose that the probabilities $\alpha = \operatorname{pr}(U)$, $\beta = \operatorname{pr}(V)$, and $\beta^{(s)} = \operatorname{pr}(VS)$ in the sample space $[0, \pi]^4$, furnished with the normalized Borel measure, are known. The probabilities $\tau = \operatorname{pr}(U \cap V)$ and $\tau^{(s)} = \operatorname{pr}(U \cap VS)$ run through the closed intervals $I(\alpha, \beta) = [\max(0, \alpha + \beta - 1), \min(\alpha, \beta)]$ and $I(\alpha, \beta^{(s)})$, respectively. In case $\tau = \min(\alpha, \beta)$ (respectively, $\tau^{(s)} = \min(\alpha, \beta^{(s)})$) there exists a relation of inclusion (up to a set of probability $0$) between $U$ and $V$ (respectively, between $U$ and $VS$). Otherwise, both conditional probabilities $\operatorname{pr}(V^c|U)$ and $\operatorname{pr}(V|U^c)$ (respectively, $\operatorname{pr}(VS^c|U)$ and $\operatorname{pr}(VS|U^c)$) cannot be simultaneously as small as one wants (a kind of uncertainty principle).

Remark 1. The probabilities $\alpha = \operatorname{pr}(U)$, $\beta = \operatorname{pr}(V)$, $\tau = \operatorname{pr}(U \cap V)$, and $\tau^{(s)} = \operatorname{pr}(U \cap VS)$ can be approximated by using Examples 1, 6). We draw a random sample $X$ of size $n$ from the sample space $[0, \pi]^4$ and consider $X$ as a sample space with $n$ equally likely outcomes. Then the probabilities $\alpha(n)$, $\beta(n)$, $\tau(n)$, and $\tau^{(s)}(n)$ of the traces of $U$, $V$, $U \cap V$, and $U \cap VS$ on $X$ (the sample proportions) are unbiased estimators for $\alpha$, $\beta$, $\tau$, and $\tau^{(s)}$ when $n$ is large.

Note that as an output of $n$ iterations of the random process from example 6) we can also find: a) the approximation $[L(n), J(n)]$ of the range of $I(A, B)$ and the approximation $[LS(n), JS(n)]$ of the range of $I^{(s)}(A, B)$ under the condition $|b| \le 2$; b) the above sample proportions; and c) the approximations
$$\operatorname{pr}(V^c|U)(n) = \frac{\alpha(n) - \tau(n)}{\alpha(n)}, \qquad \operatorname{pr}(V|U^c)(n) = \frac{\beta(n) - \tau(n)}{1 - \alpha(n)},$$
$$\operatorname{pr}(VS^c|U)(n) = \frac{\alpha(n) - \tau^{(s)}(n)}{\alpha(n)}, \qquad \operatorname{pr}(VS|U^c)(n) = \frac{\beta^{(s)}(n) - \tau^{(s)}(n)}{1 - \alpha(n)},$$
where $\beta^{(s)}(n)$ is the sample proportion of $VS$.

Below are the results obtained by drawing a random sample of size $n = 1000$:
$$\alpha(1000) = 0.838, \quad \beta(1000) = 0.704, \quad \tau(1000) = 0.614, \quad \tau^{(s)}(1000) = 0.087,$$
$$\operatorname{pr}(V^c|U)(1000) = 0.2673031, \quad \operatorname{pr}(V|U^c)(1000) = 0.5555555,$$
$$\operatorname{pr}(VS^c|U)(1000) = 0.8961814, \quad \operatorname{pr}(VS|U^c)(1000) = 0.0185185.$$
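This estimation procedure can be sketched in plain Python as a stand-in for the Java program (the function name `estimate` is ours; with a fixed seed the proportions are illustrative, not the paper's):

```python
import math
import random

LN2 = math.log(2.0)
plogp = lambda p: 0.0 if p == 0.0 else p * math.log(p)
E = lambda t: -2.0 * plogp(t) - 2.0 * plogp(0.5 - t)
eps = lambda t: (-2.0 + E(t) / LN2) if t <= 0.25 else (2.0 - E(t) / LN2)
theta = lambda m, n: 0.5 * math.sin((m - n) / 2.0) ** 2

def b(m1, m2, n1, n2):
    return (math.cos(m1 - n1) + math.cos(m1 - n2)
            + math.cos(m2 - n1) - math.cos(m2 - n2))

def I_total(m1, m2, n1, n2):
    return LN2 * sum(abs(eps(theta(m, n))) for m in (m1, m2) for n in (n1, n2))

def estimate(n, seed=0):
    """Sample proportions alpha(n), beta(n), tau(n) for U, V, and U ∩ V."""
    rng = random.Random(seed)
    in_U = in_V = in_UV = 0
    for _ in range(n):
        a = [rng.uniform(0.0, math.pi) for _ in range(4)]
        u = abs(b(*a)) <= 2.0
        v = I_total(*a) <= 2.0 * LN2
        in_U += u
        in_V += v
        in_UV += u and v
    return in_U / n, in_V / n, in_UV / n

alpha, beta, tau = estimate(2000)
# tau must lie in the Frechet interval [max(0, alpha+beta-1), min(alpha, beta)]
assert max(0.0, alpha + beta - 1.0) <= tau <= min(alpha, beta)
```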

Acknowledgements

I thank Dimitar Guelev for making an experimental Java implementation of the evaluation of the dependence of simultaneous measurements and of the approximation of various parameters via a random process. The numerical examples thus produced were invaluable for my work. I would like to express my gratitude to the administration of the Institute of Mathematics and Informatics at the Bulgarian Academy of Sciences for creating excellent and safe working conditions.

References

[1] Aspect, A., Dalibard, J., Roger, G. (1982). Experimental Test of Bell's Inequalities Using Time-Varying Analyzers. Physical Review Letters, 49 No 25:1804-1807.

[2] Bell, J. S. (1964). On the Einstein Podolsky Rosen Paradox. Physics, 1:195-200.

[3] Gelfand, I. M., Kolmogorov, A. N., Yaglom, A. M. (1993). Amount of Information and Entropy for Continuous Distributions. In: Selected Works of A. N. Kolmogorov, Vol. III: Information Theory and the Theory of Algorithms, Mathematics and Its Applications, 33-56. Springer Science+Business Media, Dordrecht.

[4] Kolmogorov, A. N. (1956). Foundations of the Theory of Probability. Chelsea Publishing Company, New York.

[5] Einstein, A., Podolsky, B., Rosen, N. (1935). Can Quantum-Mechanical Description of Physical Reality Be Considered Complete? Physical Review, 47:777-780.

[6] Iliev, V. V. (2021). On the Use of Entropy as a Measure of Dependence of Two Events. Reliability: Theory & Applications, 16 No 3:237-248.

[7] Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27 No 3:379-423; 27 No 4:623-656.
