
UDC 530.12:539.12

A Particular Case of a Sequential Growth of an X-Graph

A. L. Krugly

Department of Applied Mathematics and Computer Science

Scientific Research Institute for System Analysis of the Russian Academy of Sciences, 36, k. 1, Nahimovskiy pr., Moscow, Russia, 117218

A particular case of discrete spacetime on a microscopic level is considered. The model is a directed acyclic dyadic graph (an x-graph). The dyadic graph means that each vertex possesses no more than two incident incoming edges and two incident outgoing edges. The sequential growth dynamics of this model is considered. This dynamics is a stochastic sequential addition of new vertices one by one. The probabilities of different variants of addition of a new vertex depend on the structure of the existing x-graph. It is proved that the algorithm to calculate the probabilities of this dynamics is the unique solution that satisfies certain principles of causality, symmetry and normalization. The algorithm of sequential growth can be represented as the following three steps. The first step is the choice of the addition of the new vertex to the future or to the past. By definition, the probability of this choice is 1/2 for both outcomes. The second step is the equiprobable choice of one vertex number V. Then the probability is 1/N, where N is the cardinality of the set of vertices of the x-graph. If we choose the direction to the future, the third step is a random choice of two directed paths from the vertex number V. A new vertex is added to the ends of these paths. If we choose the direction to the past, we must randomly choose two inversely directed paths from the vertex number V. The iterative procedure to calculate the probabilities is considered.

Key words and phrases: causal set, random graph, directed graph.

1. Introduction

By assumption, spacetime is discrete on a microscopic level. In this paper I continue the previous investigation [1] of a particular model of such discrete pregeometry. This is a directed acyclic dyadic graph. The dyadic graph means that each vertex possesses no more than two incident incoming edges and two incident outgoing edges. A vertex with 4 incident edges forms an x-structure. Then such graphs are called x-graphs.

The goal of this model is to describe particles as some repetitive symmetrical self-organized structures of an x-graph. This self-organization must be the consequence of dynamics. In this paper, I introduce an example of dynamics.

Some vertices of a finite x-graph have less than four incident edges. These vertices have free valences instead of the absent edges. These free valences are called external edges, by analogy with external lines in Feynman diagrams. They are depicted as edges that are incident to only one vertex. There are two types of external edges: incoming external edges and outgoing external edges. The number of incoming external edges is equal to the number of outgoing external edges for any x-graph.

Each x-graph is a model of a part of some process. The task is to predict the future stages of this process or to reconstruct the past stages. We can reconstruct the x-graph step by step. The minimal part is a vertex. We start from some given x-graph and add new vertices one by one. This procedure is called 'a classical sequential growth dynamics'.

We can add a new vertex to external edges only. This procedure is called an elementary extension. There are four types of elementary extensions. There are two types of elementary extensions to outgoing external edges. This is a reconstruction of the future of the process. The first type is an elementary extension to two outgoing external edges. The second type is an elementary extension to one outgoing external edge. Similarly, there are two types of elementary extensions to incoming external edges. These elementary extensions reconstruct the past evolution of the process.

Received 26th March, 2015.

I am grateful to Alexander V. Koganov and Vladimir V. Kassandrov for extensive discussions on this subject, and Ivan A. Tserkovnikov for collaboration in a numerical simulation.

The third type is an elementary extension to two incoming external edges. The fourth type is an elementary extension to one incoming external edge. We can prove that we can get every connected x-graph from one vertex by a sequence of elementary extensions of these four types.

By assumption, the dynamics of this model is a stochastic dynamics. If we have an algorithm to calculate the probabilities of elementary extensions for any x-graph, we can calculate the probabilities of any variant of the future or the past for any given x-graph as a classical stochastic sequence of elementary extensions.
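To make this bookkeeping concrete, here is a minimal sketch of one possible representation of a finite x-graph and of the four types of elementary extensions. The class and method names are mine and are not taken from the paper; this is an illustration of the data structure, not the algorithm itself.

```python
# A sketch (not from the paper) of an x-graph.  Every vertex has two incoming
# and two outgoing slots; a slot equal to None is a free valence, i.e. an
# external edge.

class XGraph:
    def __init__(self):
        self.parents = []   # parents[v]  = [slot, slot]; None = incoming external edge
        self.children = []  # children[v] = [slot, slot]; None = outgoing external edge

    def add_to_future(self, vertices):
        """Types 1 and 2: a new maximal vertex occupies one or two outgoing
        external edges of the listed vertices."""
        v = len(self.parents)
        self.parents.append([None, None])
        self.children.append([None, None])
        for k, w in enumerate(vertices):
            self.children[w][self.children[w].index(None)] = v  # occupy a free outgoing slot of w
            self.parents[v][k] = w
        return v

    def add_to_past(self, vertices):
        """Types 3 and 4: a new minimal vertex occupies one or two incoming
        external edges of the listed vertices."""
        v = len(self.parents)
        self.parents.append([None, None])
        self.children.append([None, None])
        for k, w in enumerate(vertices):
            self.parents[w][self.parents[w].index(None)] = v    # occupy a free incoming slot of w
            self.children[v][k] = w
        return v

# The growth starts from the x-graph that consists of one vertex:
g = XGraph()
first = g.add_to_future([])   # a single x-structure, all four edges external
g.add_to_future([first])      # an elementary extension of the second type
g.add_to_past([first])        # an elementary extension of the fourth type
```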

2. An Amplitude of Causal Connection

Define an amplitude of causal connection. All probabilities of elementary extensions are functions of these amplitudes.

Consider a directed path. If we choose a directed path from any incoming external edge number α, we must choose one of two edges in each vertex (Fig. 1). Assume equal probabilities for both outcomes, independently of the structure of the x-graph. Then this probability is equal to 1/2. Consequently, if a directed path includes k vertices, the choice of this path has the probability 2^{-k}. We have the same choice for an oppositely directed path.


Figure 1. A choice of a directed path is a sequence of binary alternatives

Number the outgoing external edges by lowercase Latin indices. Number the incoming external edges by lowercase Greek indices. These Latin and Greek indices range from 1 to n, where n is the number of outgoing or incoming external edges. Introduce an amplitude a_{iα} of causal connection of the outgoing external edge number i and the incoming external edge number α. By definition, put

$$a_{i\alpha} = a_{\alpha i} = \sum_{m=1}^{M} 2^{-k(m)}, \qquad (1)$$

where M is the number of directed paths from the incoming external edge number α to the outgoing external edge number i, and k(m) is the number of vertices in the path number m. This is the probability to reach the edge number i if we start from the edge number α.

We can start from any vertex. Number the vertices by capital Latin indices. These indices range from 1 to N, where N is the number of vertices in the x-graph. Introduce an amplitude a_{iV} of causal connection of the outgoing external edge number i and the vertex number V. By definition, put

$$a_{iV} = a_{Vi} = \sum_{m=1}^{M} 2^{-k(m)}, \qquad (2)$$

where M is the number of directed paths from the vertex number V to the outgoing external edge number i, and k(m) is the number of vertices in the path number m including vertex number V.

Similarly, introduce an amplitude a_{αV} of causal connection of the incoming external edge number α and the vertex number V. By definition, put

$$a_{\alpha V} = a_{V\alpha} = \sum_{m=1}^{M} 2^{-k(m)}, \qquad (3)$$

where M is the number of directed paths from the incoming external edge number α to the vertex number V, and k(m) is the number of vertices in the path number m including the vertex number V.
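As an illustration of definitions (1)–(3), the amplitude can be computed by a recursion over the graph instead of an explicit enumeration of paths: a_{iV} is the probability that a random walk started at the vertex V, which at every visited vertex picks one of the two outgoing slots with probability 1/2, leaves the graph through the outgoing external edge number i. The sketch below is mine (the slot labels "out:i" are an assumed representation, not the paper's notation); a_{αV} is obtained in the same way on the inversely directed graph.

```python
from functools import lru_cache

# A sketch of a_{iV} from eq. (2): children[v] is a pair of slots, where a slot
# is either another vertex index or a label "out:i" marking the outgoing
# external edge number i.  The factor 0.5 per visited vertex gives 2^{-k(m)}
# summed over all directed paths from v to the edge i.

def amplitude_to_out_edge(children, v, i):
    @lru_cache(maxsize=None)
    def a(w):
        total = 0.0
        for slot in children[w]:
            if slot == f"out:{i}":
                total += 1.0        # the walk leaves through the edge i here
            elif isinstance(slot, int):
                total += a(slot)    # the walk continues at a child vertex
            # any other external edge contributes 0
        return 0.5 * total
    return a(v)

# Example: one vertex 0 with outgoing external edges 1 and 2.
children = [["out:1", "out:2"]]
print(amplitude_to_out_edge(children, 0, 1))  # 0.5
```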

3. An Algorithm to Calculate the Probabilities

Consider the following algorithm to calculate the probabilities of elementary extensions [2]. There are three steps.

The first step is the choice of the elementary extension to the future or to the past. By definition, the probability of this choice is 1/2 for both outcomes.

The second step is the equiprobable choice of one vertex number V. Then the probability is 1/N, where N is the cardinality of the set of vertices of the x-graph.

The third step is the choice of the external edges that take part in the elementary extension. If we choose the elementary extension to the future at the first step and the vertex number V at the second step, the probability of the choice of the first edge number i is a_{iV} and the probability of the choice of the second edge number j is a_{jV}. If we choose the elementary extension to the past at the first step, the probability of the choice of the first edge number α is a_{αV} and the probability of the choice of the second edge number β is a_{βV}.

We get for the probability of the elementary extension of the first type
$$p_{ij} = \frac{1}{2N} \sum_{V=1}^{N} \left(a_{iV}\, a_{jV} + a_{jV}\, a_{iV}\right) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}\, a_{jV}. \qquad (4)$$

We have two summands because, if we choose the edge number i first and the edge number j second, or we choose these edges in the reverse order, we get the same elementary extension.

If i = j, we get for the probability of the elementary extension of the second type
$$p_{ii} = \frac{1}{2N} \sum_{V=1}^{N} (a_{iV})^2. \qquad (5)$$

Similarly, we get for the probabilities of the elementary extensions of the third and fourth types

$$p_{\alpha\beta} = \frac{1}{2N} \sum_{V=1}^{N} \left(a_{\alpha V}\, a_{\beta V} + a_{\beta V}\, a_{\alpha V}\right) = \frac{1}{N} \sum_{V=1}^{N} a_{\alpha V}\, a_{\beta V}, \qquad (6)$$

$$p_{\alpha\alpha} = \frac{1}{2N} \sum_{V=1}^{N} (a_{\alpha V})^2. \qquad (7)$$
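As a check of (4)–(7) (this worked example is mine, not from the paper), consider the x-graph G_1 that consists of one vertex. Here N = 1, n = 2 and all amplitudes are equal to 1/2, so
$$p_{12} = \frac{1}{N}\sum_{V=1}^{1} a_{1V}\, a_{2V} = \frac{1}{4}, \qquad p_{11} = p_{22} = \frac{1}{2N}\sum_{V=1}^{1} (a_{1V})^2 = \frac{1}{8},$$
and the same values hold for the extensions to the past. The probabilities of all elementary extensions sum to
$$\left(\tfrac{1}{4} + \tfrac{1}{8} + \tfrac{1}{8}\right) + \left(\tfrac{1}{4} + \tfrac{1}{8} + \tfrac{1}{8}\right) = 1.$$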

Consider the following form of the third step. Suppose we choose the direction to the future and the vertex number V at the first and second steps. In this case, the third step is a random choice of two directed paths from the vertex number V (Fig. 2). The ends of these paths are the outgoing external edges that take part in the elementary extension. If we choose the direction to the past at the first step, we must randomly choose two inversely directed paths from the vertex number V.
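This form of the algorithm can be sampled directly. The sketch below is my illustration (it reuses the slot representation assumed in the earlier sketches, with "out:i" and "in:α" labels for external edges); it returns the two external edges to which the new vertex should be attached.

```python
import random

# One step of the sequential growth in the form of Section 3: choose future or
# past with probability 1/2, choose a vertex uniformly (probability 1/N), and
# follow two random directed (or inversely directed) paths from it.

def random_walk(slots_of, v):
    """Follow slots from vertex v, picking one of the two slots with
    probability 1/2, until an external-edge label (a string) is reached."""
    current = v
    while True:
        slot = random.choice(slots_of[current])
        if isinstance(slot, str):
            return slot          # an external edge: the path ends here
        current = slot           # continue at the next vertex

def growth_step(children, parents):
    v = random.randrange(len(children))        # probability 1/N
    if random.random() < 0.5:                  # to the future
        return random_walk(children, v), random_walk(children, v)
    else:                                      # to the past
        return random_walk(parents, v), random_walk(parents, v)

# Example: the one-vertex x-graph.
children = [["out:1", "out:2"]]
parents = [["in:1", "in:2"]]
print(growth_step(children, parents))  # e.g. ('out:2', 'out:1')
```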

4. Physical Foundations of the Algorithm to Calculate the Probabilities

Consider the following form of the algorithm to calculate the probabilities of elementary extensions and prove that this algorithm is a consequence of the causality principle.

The first step is the same. This is the equiprobable choice of the elementary extension to the future or to the past.

A new vertex is added to one or two external edges. The second step is the choice of one external edge that takes part in the elementary extension. This is an outgoing external edge if we choose the future evolution at the first step; otherwise it is an incoming external edge. In the previous paper [1] the trivial case was considered: the equiprobable choice, with probability 1/n for each outcome. In this paper we consider the case that satisfies the causality principle. Denote by p_i the probability to choose the outgoing external edge number i. Denote by p_α the probability to choose the incoming external edge number α.

In this model, causality is defined as the order of vertices and edges. But causality has a real physical meaning only if the dynamics agrees with it. The probability to choose an outgoing external edge can only depend on the x-subgraph that precedes this edge. Similarly, the probability to choose an incoming external edge can only depend on the x-subgraph that follows this edge.

According to the causality principle, the normalization constant cannot depend on the structure of the x-graph. By definition, it is equal to N^{-1}, where N is the number of vertices in the x-graph. This is the number of steps of the sequential growth if we start from the empty x-graph. In the previous case [1] the normalization constant 1/n depends on the structure of the x-graph. This contradicts the causality principle.

Consider the x-graph G. By definition, put P(V) = {W ∈ G | W ≺ V}. The set P(V) is called the past set of the vertex number V. By definition, put F(V) = {W ∈ G | V ≺ W}. The set F(V) is called the future set of the vertex number V. If the outgoing external edge number i is incident to the vertex number V, by definition, put P(i) = P(V). If the incoming external edge number α is incident to the vertex number V, by definition, put F(α) = F(V).

Figure 2. Two directed paths

Theorem 1. Consider the x-graph G_N that consists of N vertices. Consider the probability p_i(G_N) to choose the outgoing external edge number i, normalized to 1. If p_i(G_N) is a function of P(i) and the normalization constant is N^{-1}, then
$$p_i(G_N) = N^{-1} \sum_{V=1}^{N} a_{iV}(G_N).$$

Proof. The proof is by induction on N. If N = 1, p_1(G_1) = p_2(G_1) = 1/2 by symmetry.

By the inductive assumption, the theorem is true for any x-graph G_{N-1} that consists of N - 1 vertices. Consider any x-graph G_N that consists of N vertices. We can get this x-graph by the addition of a new vertex number N to some G_{N-1}. Let the vertex number N be a maximal vertex. If it is not a maximal vertex, choose some maximal vertex number W in G_N and remove it. Exchange the numbers W and N. The new vertex gets the number W, and the removed vertex gets the number N. We get G_{N-1}. It can be unconnected. The theorem is true for G_{N-1} by assumption. Add the vertex number N to G_{N-1}. There are two cases. In the first case, the vertex number N is added to two outgoing external edges numbers i and j as for an elementary extension of the first type. In the second case, the vertex number N is added to one outgoing external edge number i as for an elementary extension of the second type. Denote by n the number of outgoing external edges in G_{N-1}.

Consider the first case. We have the normalization condition for G_{N-1}:
$$\sum_{\substack{s=1 \\ s\neq i,\ s\neq j}}^{n} \sum_{V=1}^{N-1} a_{sV}(G_{N-1}) + \sum_{V=1}^{N-1} \left(a_{iV}(G_{N-1}) + a_{jV}(G_{N-1})\right) = N - 1. \qquad (8)$$

Two outgoing external edges numbers i and j become internal edges. We get two free numbers of outgoing external edges: i and j. Two new outgoing external edges appear. Number these new outgoing external edges by i and j. If s ≠ i and s ≠ j, P(s) is not changed. We have

$$p_s(G_N) = \frac{1}{N} \sum_{V=1}^{N-1} a_{sV}(G_{N-1}) = \frac{1}{N} \sum_{V=1}^{N} a_{sV}(G_N). \qquad (9)$$

We have the normalization condition for G_N:
$$\sum_{\substack{s=1 \\ s\neq i,\ s\neq j}}^{n} \sum_{V=1}^{N} a_{sV}(G_N) + N p_i(G_N) + N p_j(G_N) = N. \qquad (10)$$

In G_N, P(i) = P(j) and p_i(G_N) = p_j(G_N). Using (8) and (10), we get

$$p_i(G_N) = p_j(G_N) = \frac{1}{2N} \left(\sum_{V=1}^{N-1} a_{iV}(G_{N-1}) + \sum_{V=1}^{N-1} a_{jV}(G_{N-1}) + 1\right). \qquad (11)$$

The last equation is the rule to calculate the amplitudes. The new outgoing external edges are included in the same paths in which the old outgoing external edges numbers i and j are included. These paths pass through one new vertex. Then we must multiply by 1/2. Also we must add the amplitude of the new vertex that is equal to 1/2. We get
$$p_i(G_N) = p_j(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{jV}(G_N). \qquad (12)$$

Consider the second case. We have the same normalization condition (8) for G_{N-1}. One outgoing external edge number i becomes an internal edge. We get i as a free number of an outgoing external edge. Two new outgoing external edges appear. Number these new outgoing external edges by i and n + 1. If s ≠ i, P(s) is not changed. We have the equation (9). We have the normalization condition for G_N:

$$\sum_{\substack{s=1 \\ s\neq i}}^{n} \sum_{V=1}^{N} a_{sV}(G_N) + N p_i(G_N) + N p_{n+1}(G_N) = N. \qquad (13)$$


In G_N, P(i) = P(n + 1) and p_i(G_N) = p_{n+1}(G_N). Using (8) and (13), we get
$$p_i(G_N) = p_{n+1}(G_N) = \frac{1}{2N} \left(\sum_{V=1}^{N-1} a_{iV}(G_{N-1}) + 1\right). \qquad (14)$$

The last equation is the rule to calculate the amplitudes. New outgoing external edges are included in the same paths in which the old outgoing external edge number i is included. These paths pass through one new vertex. Then we must multiply by 1/2. Also we must add the amplitude of the new vertex that is equal to 1/2. We get

$$p_i(G_N) = p_{n+1}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{(n+1)V}(G_N). \qquad (15)$$

We have proved the theorem for all cases. □

Corollary 1. Consider the x-graph G_N that consists of N vertices. Consider the probability p_α to choose the incoming external edge number α, normalized to 1. If p_α is a function of F(α) and the normalization constant is N^{-1}, then
$$p_\alpha = N^{-1} \sum_{V=1}^{N} a_{\alpha V}.$$

The proof is the same.

The third step is the choice of the second external edge. Denote by p_{ij|i} the probability to choose the outgoing external edge number j if we choose the outgoing external edge number i at the second step. According to the causality principle, p_{ij|i} can only depend on the x-subgraph that precedes the edges numbers i and j. Denote by p_{αβ|α} the probability to choose the incoming external edge number β if we choose the incoming external edge number α at the second step. Similarly, p_{αβ|α} can only depend on the x-subgraph that follows the edges numbers α and β.

Consider the condition of symmetry. If p_{ij|i} = p_{ji|j}, we get the algorithm for p_{ij|i} that is considered in [1]. But in this case, in general, p_i p_{ij|i} ≠ p_j p_{ji|j}. The physical condition is p_i p_{ij|i} = p_j p_{ji|j}.

Let us prove that (4)-(7) is the unique solution that satisfies the principles of causality, symmetry, and normalization.

Theorem 2. Consider the x-graph G_N that consists of N vertices and has n outgoing external edges. Consider the conditional probability p_{ij|i}(G_N) to add a new maximal vertex number N + 1 to the outgoing external edges numbers i and j if we choose the outgoing external edge number i. The edges i and j can coincide. If

- p_{ij|i}(G_N) is a function of P(N + 1) (the causality principle),

- Σ_{j=1}^{n} p_{ij|i}(G_N) = 1 (the normalization), and

- p_i(G_N) p_{ij|i}(G_N) = p_j(G_N) p_{ji|j}(G_N) (the symmetry), then
$$p_i(G_N)\, p_{ij|i}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N)\, a_{jV}(G_N).$$

Proof. The proof is by induction on N. If N = 1,
$$p_1(G_1)\, p_{11|1}(G_1) = p_1(G_1)\, p_{12|1}(G_1) = p_2(G_1)\, p_{21|2}(G_1) = p_2(G_1)\, p_{22|2}(G_1) = 1/4,$$
by symmetry.

By the inductive assumption, the theorem is true for any x-graph G_{N-1} that consists of N - 1 vertices. Consider any x-graph G_N that consists of N vertices. We can get this x-graph by the addition of a new vertex number N to some G_{N-1}. Let the vertex number N be a maximal vertex. If it is not a maximal vertex, choose some maximal vertex number W in G_N and remove it. Exchange the numbers W and N. The new vertex gets the number W, and the removed vertex gets the number N. We get G_{N-1}. It can be unconnected. The theorem is true for G_{N-1} by assumption. Add the vertex number N to G_{N-1}. There are two cases. In the first case, the vertex number N is added to two outgoing external edges numbers i and j as for an elementary extension of the first type. In the second case, the vertex number N is added to one outgoing external edge number i as for an elementary extension of the second type. Denote by n the number of outgoing external edges in G_{N-1}.

Consider the first case. We have n normalization conditions for G_{N-1}:
$$\frac{1}{N-1} \sum_{\substack{s=1 \\ s\neq i,\ s\neq j}}^{n} \sum_{V=1}^{N-1} a_{sV}(G_{N-1})\, a_{mV}(G_{N-1}) + \frac{1}{N-1} \sum_{V=1}^{N-1} a_{mV}(G_{N-1}) \left(a_{iV}(G_{N-1}) + a_{jV}(G_{N-1})\right) = p_m(G_{N-1}) = \frac{1}{N-1} \sum_{V=1}^{N-1} a_{mV}(G_{N-1}), \qquad (16)$$

where m ranges from 1 to n. Two outgoing external edges numbers i and j become internal edges. We get two free numbers of outgoing external edges: i and j. Two new outgoing external edges appear. Number these new outgoing external edges by i and j. If s ≠ i, s ≠ j, m ≠ i, and m ≠ j, P(s) and P(m) are not changed. We have

$$p_m(G_N)\, p_{ms|m}(G_N) = \frac{N-1}{N}\, p_m(G_{N-1})\, p_{ms|m}(G_{N-1}) = \frac{1}{N} \sum_{V=1}^{N-1} a_{sV}(G_{N-1})\, a_{mV}(G_{N-1}) = \frac{1}{N} \sum_{V=1}^{N} a_{sV}(G_N)\, a_{mV}(G_N). \qquad (17)$$

We have n - 2 normalization conditions for G_N:
$$\frac{1}{N} \sum_{\substack{s=1 \\ s\neq i,\ s\neq j}}^{n} \sum_{V=1}^{N} a_{sV}(G_N)\, a_{mV}(G_N) + p_m(G_N)\, p_{mi|m}(G_N) + p_m(G_N)\, p_{mj|m}(G_N) = p_m(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{mV}(G_N) = \frac{N-1}{N}\, p_m(G_{N-1}), \qquad (18)$$

where m ranges from 1 to n, m ≠ i, and m ≠ j. Using the symmetry of the edges numbers i and j, (16), and (18), we get

$$p_m(G_N)\, p_{mi|m}(G_N) = p_m(G_N)\, p_{mj|m}(G_N) = \frac{1}{2N} \sum_{V=1}^{N-1} a_{mV}(G_{N-1}) \left(a_{iV}(G_{N-1}) + a_{jV}(G_{N-1})\right). \qquad (19)$$

The last equation is the rule to calculate the amplitudes. New outgoing external edges are included in the same paths in which the old outgoing external edges numbers i and j are included. These paths pass through one new vertex. Then we must multiply by 1/2. We get

$$a_{iV}(G_N) = a_{jV}(G_N) = \frac{1}{2} \left(a_{iV}(G_{N-1}) + a_{jV}(G_{N-1})\right).$$

The outgoing external edge number m is not changed, and a_{mV}(G_{N-1}) = a_{mV}(G_N). We have

$$p_m(G_N)\, p_{mi|m}(G_N) = p_m(G_N)\, p_{mj|m}(G_N) = \frac{1}{N} \sum_{V=1}^{N-1} a_{mV}(G_N)\, a_{iV}(G_N) = \frac{1}{N} \sum_{V=1}^{N-1} a_{mV}(G_N)\, a_{jV}(G_N). \qquad (20)$$

By symmetry, we have
$$p_i(G_N)\, p_{ii|i}(G_N) = p_i(G_N)\, p_{ij|i}(G_N) = p_j(G_N)\, p_{ji|j}(G_N) = p_j(G_N)\, p_{jj|j}(G_N).$$
We have the two last normalization conditions for G_N, which are equal:

$$p_i(G_N)\, p_{ii|i}(G_N) + p_i(G_N)\, p_{ij|i}(G_N) + \sum_{\substack{m=1 \\ m\neq i,\ m\neq j}}^{n} p_i(G_N)\, p_{im|i}(G_N) = p_i(G_N) = \frac{N-1}{2N} \left(p_i(G_{N-1}) + p_j(G_{N-1})\right) + \frac{1}{2N}. \qquad (21)$$

There is an analogous normalization condition for the permuted i and j indices. By the assumption, p_i(G_N) p_{im|i}(G_N) = p_m(G_N) p_{mi|m}(G_N). Using (16) for m = i and m = j, and (20), we get

$$p_i(G_N)\, p_{ii|i}(G_N) = p_i(G_N)\, p_{ij|i}(G_N) = \frac{1}{4N} \left(\sum_{V=1}^{N-1} \left(a_{iV}(G_{N-1}) + a_{jV}(G_{N-1})\right)^2 + 1\right). \qquad (22)$$

The last equation is the rule to calculate the amplitudes. We get

$$p_j(G_N)\, p_{ji|j}(G_N) = p_j(G_N)\, p_{jj|j}(G_N) = p_i(G_N)\, p_{ii|i}(G_N) = p_i(G_N)\, p_{ij|i}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N)\, a_{jV}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N)\, a_{iV}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{jV}(G_N)\, a_{jV}(G_N). \qquad (23)$$

Consider the second case. We have n normalization conditions for G_{N-1}:
$$\frac{1}{N-1} \sum_{\substack{s=1 \\ s\neq i}}^{n} \sum_{V=1}^{N-1} a_{sV}(G_{N-1})\, a_{mV}(G_{N-1}) + \frac{1}{N-1} \sum_{V=1}^{N-1} a_{iV}(G_{N-1})\, a_{mV}(G_{N-1}) = p_m(G_{N-1}), \qquad (24)$$

where m ranges from 1 to n. One outgoing external edge number i becomes an internal edge. We get i as a free number of an outgoing external edge. Two new outgoing external edges appear. Number these new outgoing external edges by i and n + 1. If s ≠ i and m ≠ i, P(s) and P(m) are not changed. We have the equation (17). We have n - 1 normalization conditions for G_N:

$$\frac{1}{N} \sum_{\substack{s=1 \\ s\neq i}}^{n} \sum_{V=1}^{N} a_{sV}(G_N)\, a_{mV}(G_N) + p_m(G_N)\, p_{mi|m}(G_N) + p_m(G_N)\, p_{m(n+1)|m}(G_N) = p_m(G_N) = \frac{N-1}{N}\, p_m(G_{N-1}), \qquad (25)$$

where m ranges from 1 to n and m ≠ i. By symmetry, we have p_m(G_N) p_{mi|m}(G_N) = p_m(G_N) p_{m(n+1)|m}(G_N). Using (24) and (25), we get


$$p_m(G_N)\, p_{mi|m}(G_N) = p_m(G_N)\, p_{m(n+1)|m}(G_N) = \frac{1}{2N} \sum_{V=1}^{N-1} a_{mV}(G_{N-1})\, a_{iV}(G_{N-1}). \qquad (26)$$

The last equation is the rule to calculate the amplitudes. The new outgoing external edges are included in the same paths in which the old outgoing external edge number i is included. These paths pass through one new vertex. Then we must multiply by 1/2. We get
$$a_{iV}(G_N) = a_{(n+1)V}(G_N) = \frac{1}{2}\, a_{iV}(G_{N-1}).$$

The outgoing external edge number m is not changed, and a_{mV}(G_{N-1}) = a_{mV}(G_N). We have

$$p_m(G_N)\, p_{mi|m}(G_N) = p_m(G_N)\, p_{m(n+1)|m}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{mV}(G_N)\, a_{iV}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{mV}(G_N)\, a_{(n+1)V}(G_N). \qquad (27)$$

By symmetry, we have

$$p_i(G_N)\, p_{ii|i}(G_N) = p_i(G_N)\, p_{i(n+1)|i}(G_N) = p_{n+1}(G_N)\, p_{(n+1)i|n+1}(G_N) = p_{n+1}(G_N)\, p_{(n+1)(n+1)|n+1}(G_N). \qquad (28)$$

We have the two last normalization conditions for G_N, which are equal:

$$p_i(G_N)\, p_{ii|i}(G_N) + p_i(G_N)\, p_{i(n+1)|i}(G_N) + \sum_{\substack{m=1 \\ m\neq i,\ m\neq n+1}}^{n+1} p_i(G_N)\, p_{im|i}(G_N) = p_i(G_N) = \frac{N-1}{2N}\, p_i(G_{N-1}) + \frac{1}{2N}. \qquad (29)$$

There is an analogous normalization condition for the permuted i and n + 1 indices. By the assumption, p_i(G_N) p_{im|i}(G_N) = p_m(G_N) p_{mi|m}(G_N). Using (24) for m = i, and (27), we get

$$p_i(G_N)\, p_{ii|i}(G_N) = p_i(G_N)\, p_{i(n+1)|i}(G_N) = \frac{1}{4N} \left(\sum_{V=1}^{N-1} a_{iV}(G_{N-1})\, a_{iV}(G_{N-1}) + 1\right). \qquad (30)$$

The last equation is the rule to calculate the amplitudes. We get

$$p_{n+1}(G_N)\, p_{(n+1)i|n+1}(G_N) = p_{n+1}(G_N)\, p_{(n+1)(n+1)|n+1}(G_N) = p_i(G_N)\, p_{ii|i}(G_N) = p_i(G_N)\, p_{i(n+1)|i}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N)\, a_{iV}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{iV}(G_N)\, a_{(n+1)V}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{(n+1)V}(G_N)\, a_{(n+1)V}(G_N). \qquad (31)$$

We have proved the theorem for all cases. □

Corollary 2. Consider the x-graph G_N that consists of N vertices and has n incoming external edges. Consider the conditional probability p_{αβ|α}(G_N) to add a new minimal vertex number N + 1 to the incoming external edges numbers α and β if we choose the incoming external edge number α. The edges α and β can coincide. If

- p_{αβ|α}(G_N) is a function of F(N + 1) (the causality principle),

- Σ_{β=1}^{n} p_{αβ|α}(G_N) = 1 (the normalization), and

- p_α(G_N) p_{αβ|α}(G_N) = p_β(G_N) p_{βα|β}(G_N) (the symmetry), then
$$p_\alpha(G_N)\, p_{\alpha\beta|\alpha}(G_N) = \frac{1}{N} \sum_{V=1}^{N} a_{\alpha V}(G_N)\, a_{\beta V}(G_N).$$

The proof is the same.

The introduced algorithm to calculate the probabilities of elementary extensions is a combinatorial rule that is a consequence of causality, symmetry, and normalization.

5. An Iterative Procedure to Calculate the Probabilities of Elementary Extensions

Consider the x-graph G_N that consists of N vertices and has n outgoing external edges. We must calculate 2Nn amplitudes a_{iV}(G_N) and a_{αV}(G_N) for the direct computation of all probabilities of elementary extensions. But if we consider the sequential growth of G_N, we can calculate all probabilities of elementary extensions using the probabilities of elementary extensions for G_{N-1}. Consider this iterative procedure.

Denote N p_i(G_N) p_{ij|i}(G_N) and N p_α(G_N) p_{αβ|α}(G_N) by p_{ij}(G_N) and p_{αβ}(G_N) respectively for simplicity. Consider these probabilities of elementary extensions as elements of matrices. Introduce a matrix p^f(G_N) of probabilities of elementary extensions to the future. All matrices are denoted by bold Latin letters. The element number ij of p^f(G_N) is equal to p_{ij}(G_N). Introduce a matrix p^p(G_N) of probabilities of elementary extensions to the past. The element number αβ of p^p(G_N) is equal to p_{αβ}(G_N). The sum of the elements of each matrix is equal to N. We will need a_{iV}(G_N), where the vertex number V possesses the incident incoming external edge number α, and a_{αV}(G_N), where the vertex number V possesses the incident outgoing external edge number i. Such amplitudes are a_{iα}(G_N). Introduce a matrix a(G_N) of amplitudes. The element number iα of a(G_N) is equal to a_{iα}(G_N). The iterative procedure to calculate a(G_N) is considered in [1]. We need to calculate no more than n amplitudes for each elementary extension. These three matrices are square matrices of size n; p^f(G_N) and p^p(G_N) are symmetric matrices.

This procedure starts from the x-graph G_1 that consists of 1 vertex:
$$\mathbf{p}^f(G_1) = \mathbf{p}^p(G_1) = \begin{pmatrix} 1/4 & 1/4 \\ 1/4 & 1/4 \end{pmatrix}. \qquad (32)$$

We considered the iterative calculation of the probabilities in the proof of Theorem 2. Rewrite the equations in a more useful form for p^f and add the equations for p^p.

The first type is an elementary extension to the future. Two outgoing external edges numbers i and j become internal edges. We get two free numbers of outgoing external edges: i and j. Two new outgoing external edges appear. Number these new outgoing external edges by i and j. We have

$$p_{ii}(G_N) = p_{ij}(G_N) = p_{jj}(G_N) = \frac{1}{4} \left(p_{ii}(G_{N-1}) + p_{ji}(G_{N-1}) + p_{ij}(G_{N-1}) + p_{jj}(G_{N-1}) + 1\right), \qquad (33)$$

$$p_{is}(G_N) = p_{js}(G_N) = \frac{1}{2} \left(p_{is}(G_{N-1}) + p_{js}(G_{N-1})\right), \qquad (34)$$

$$p_{ms}(G_N) = p_{ms}(G_{N-1}), \qquad (35)$$

where s and m range from 1 to n, s ≠ i, s ≠ j, m ≠ i, and m ≠ j. We have the addition of a new summand to the elements of p^p that describes the addition of the new vertex:

$$p_{\alpha\beta}(G_N) = p_{\alpha\beta}(G_{N-1}) + a_{i\alpha}(G_N)\, a_{i\beta}(G_N). \qquad (36)$$

The second type is an elementary extension to the future too. One outgoing external edge number i becomes an internal edge. We get i as a free number of an outgoing external edge. Two new outgoing external edges and one new incoming external edge appear. Number these new outgoing external edges by i and n + 1, and the new incoming external edge by n + 1. We have a new column number n + 1 and a new row number n + 1 in p^f and p^p. We have

$$p_{ii}(G_N) = p_{i(n+1)}(G_N) = p_{(n+1)(n+1)}(G_N) = \frac{1}{4} \left(p_{ii}(G_{N-1}) + 1\right), \qquad (37)$$

$$p_{is}(G_N) = p_{(n+1)s}(G_N) = \frac{1}{2}\, p_{is}(G_{N-1}), \qquad (38)$$

$$p_{ms}(G_N) = p_{ms}(G_{N-1}), \qquad (39)$$

where s and m range from 1 to n, s ≠ i, and m ≠ i. We have

$$p_{\alpha\beta}(G_N) = p_{\alpha\beta}(G_{N-1}) + a_{i\alpha}(G_N)\, a_{i\beta}(G_N). \qquad (40)$$

If β = n + 1,

$$p_{\alpha(n+1)}(G_N) = \frac{1}{2}\, a_{i\alpha}(G_N), \qquad (41)$$

and if α = β = n + 1,

$$p_{(n+1)(n+1)}(G_N) = \frac{1}{4}. \qquad (42)$$

If we interchange the Latin and Greek indices in (33)-(42), we get the equations for the elementary extensions of the third and fourth types.

We must calculate no more than n² elements of p^p, n elements of p^f, and n elements of a for each elementary extension. If n grows as N^{1/2}, we have a linear growth of the number of calculations.
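As an illustration of this procedure, here is a sketch of the update for an elementary extension of the first type, eqs. (33)–(36); the code and the use of numpy are mine, not the paper's, and the indices are 0-based. The update of the amplitude matrix a follows the rule used in the proof of Theorem 1; the full procedure for a is given in [1]. The second-type update (37)–(42) additionally involves the new row and column number n + 1.

```python
import numpy as np

def extend_first_type(p_f, p_p, a, i, j):
    """Update the n-by-n matrices p_f, p_p and a for an elementary extension
    of the first type to the outgoing external edges i and j (i != j)."""
    p_f, p_p, a = p_f.copy(), p_p.copy(), a.copy()
    n = p_f.shape[0]
    others = [s for s in range(n) if s not in (i, j)]

    mixed = 0.5 * (p_f[i, others] + p_f[j, others])                           # eq. (34)
    new_block = 0.25 * (p_f[i, i] + p_f[i, j] + p_f[j, i] + p_f[j, j] + 1.0)  # eq. (33)

    p_f[i, others] = p_f[j, others] = mixed
    p_f[others, i] = p_f[others, j] = mixed        # keep the matrix symmetric
    p_f[np.ix_([i, j], [i, j])] = new_block
    # eq. (35): all other elements of p_f stay unchanged

    a[i, :] = a[j, :] = 0.5 * (a[i, :] + a[j, :])  # the two new edges i and j

    p_p += np.outer(a[i, :], a[i, :])              # eq. (36): the new vertex
    return p_f, p_p, a

# Starting from eq. (32) for the one-vertex x-graph G_1:
p_f = np.full((2, 2), 0.25)
p_p = np.full((2, 2), 0.25)
a = np.full((2, 2), 0.5)        # all amplitudes of G_1 are equal to 1/2
p_f, p_p, a = extend_first_type(p_f, p_p, a, 0, 1)
print(p_f.sum(), p_p.sum())     # both sums equal N = 2
```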

6. Properties of the Sequential Growth

Theorem 3. The maximal values of p_{ij} and p_{αβ} are equal to 1/4 if i ≠ j and α ≠ β, respectively. The maximal values of p_{ii} and p_{αα} are equal to 1/8.

Proof. There are N summands in Σ_{V=1}^{N} a_{iV} a_{jV}, and the maximal value of the amplitude is 1/2; then Σ_{V=1}^{N} a_{iV} a_{jV} ≤ N/4. Using (4), we get p_{ij} ≤ 1/4. Using (5), we get p_{ii} ≤ 1/8. Similarly, p_{αβ} ≤ 1/4 and p_{αα} ≤ 1/8. □

We have the maximal values of the probabilities for G_1.

Theorem 4. The probability to add a set V of vertices to G_N does not depend on the order of additions of these vertices if these vertices are causally independent.

Proof. Number the vertices of V. The probability P_V to add the vertex number V is a product of the normalization constant and a function of the past set of this vertex. Denote this function by p(V). By assumption, p(V) does not depend on the addition of the other vertices of V. Denote by K the cardinality of V. Add the vertices of V to G_N in some order. We have for the probability of this addition
$$P_{ABC\ldots X} = P_A P_B P_C \cdots P_X = \frac{p(A)}{N+1}\, \frac{p(B)}{N+2}\, \frac{p(C)}{N+3} \cdots \frac{p(X)}{N+K}. \qquad (43)$$
If we change the order of addition of these vertices, we rearrange the functions p(A), ..., p(X) in this product. We get the same probability. □

7. Conclusion

The considered dynamics is a consequence of causality, symmetry and normalization. We do not postulate any properties of self-organization. But this dynamics can be a model of quantum gravity if it can describe stable objects (particles). In this case, the algorithm must generate stable repetitive self-organized structures. This is a task for further investigation.

This model is useful for numerical simulation, like the previous algorithm [3]. It is necessary to develop methods to detect and analyze structures during the numerical simulation of the sequential growth. For example, such a method is considered in [4]. Another approach to the numerical simulation of self-organization of causal sets is considered in [5].

References

1. A. L. Krugly, A Sequential Growth Dynamics for a Directed Acyclic Dyadic Graph, Bulletin of Peoples' Friendship University of Russia, Series "Mathematics. Information Sciences. Physics" (1) (2014) 124-138, (arXiv: 1112.1064 [gr-qc]).

2. A. V. Koganov, A. L. Krugly, Algorithm of X-Graph Growth and Principles of Physics, Software Products and Systems (3) (2012) 95-102.

3. A. L. Krugly, I. V. Stepanian, An Example of the Stochastic Dynamics of a Causal Set, in: M. D'Ariano, S.-M. Fei, E. Haven, B. Hiesmayr, G. Jaeger, A. Khrennikov, J. Ake Larsson (Eds.), Foundations of Probability and Physics-6 (FPP6), 12-15 June 2011, the Linnaeus University, Vaxjo, Sweden, Vol. 1424 of AIP Conference Proceedings, 2012, pp. 206-210, (arXiv: 1111.5474 [gr-qc]).

4. S. Pissanetzky, The Matrix Model of Computation, in: Proc. 12th World MultiConference on Systemics, Cybernetics, and Informatics (WMSCI), June 29 - July 2, 2008, Orlando, Florida, USA, Vol. IV, 2008, pp. 184-189.

5. T. Bolognesi, Causal Sets from Simple Models of Computation, International Journal of Unconventional Computing 6 (6) (2010) 489-524, (1004.3128 [physics.comp-ph]).

