
Dinamicheskie Sistemy (Dynamical Systems), 2019, Vol. 9(37), No. 2, 116-132

MSC 2010: 37A05, 37B10

On the entropy of symbolic image of a dynamical system¹

G. S. Osipenko*, N. B. Ampilova**

* Sebastopol Branch of Lomonosov Moscow State University, Sebastopol 299001 E-mail: [email protected]

**Saint-Petersburg State University, Saint-Petersburg 199034. E-mail: [email protected]

Abstract. We consider the estimation of the entropy of a discrete dynamical system by using a symbolic image, that is, a directed graph constructed by means of a finite covering of the phase space. Trajectories of the system are coded by paths on the graph. Flows on a symbolic image approximate invariant measures of the system. The maximal metric entropy of a symbolic image is estimated by the logarithm of the maximal eigenvalue of the symbolic image adjacency matrix. There exists a flow on which this entropy is attained. The invariant measure of maximal entropy is estimated by using this flow.

Keywords: symbolic image, dynamical systems, invariant measure, entropy.

1. Symbolic image of a dynamical system

Let f: M → M be a homeomorphism of a compact manifold M generating a discrete dynamical system

x_{n+1} = f(x_n),   (1.1)

and ρ(x, y) be a distance on M. In what follows we use the concept of the symbolic image of a dynamical system [16], which brings together symbolic dynamics [1, 13] and numerical methods [9]. Let C = {M(1), ..., M(n)} be a finite closed covering of the manifold M. The set M(i) is called the cell with index i.

Definition 1. [15] The symbolic image of the dynamical system (1.1) for a covering C is a directed graph G with vertices {i} corresponding to the cells {M(i)}. The vertices i and j are connected by the edge i → j iff

f(M(i)) ∩ M(j) ≠ ∅.

The symbolic image is a tool for space discretization and a graphic representation of the dynamics of the system under study, which allows obtaining useful information about the global structure of the system dynamics. The symbolic image depends on the covering C. The existence of an edge i → j guarantees the existence of a point x ∈ M(i) such that

¹This research was carried out with the financial support of the Russian Foundation for Basic Research (grant no. 19-01-00388).

© G. S. OSIPENKO, N. B. AMPILOVA

f(x) ∈ M(j). In other words, an edge i → j is the trace of the mapping x → f(x), where x ∈ M(i), f(x) ∈ M(j). If there is no edge i → j in G, then there are no points x ∈ M(i) such that f(x) ∈ M(j).

We do not place special restrictions on the covering C, but based on the theorem about the triangulation of a compact manifold [18] we may assume without loss of generality that the cells M(i) are polyhedrons intersecting on their boundary disks. In practice M is a compact set in R^d, and the M(i) are cubes or parallelepipeds. Let C be a covering of M by polyhedrons intersecting on their boundary disks. In what follows we also use a measurable partition C* such that each boundary disk belongs to only one of the adjoining cells. We assume that the cells (polyhedrons) are the closures of their interiors.
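In practice the edges of a symbolic image are found numerically, cell by cell. The sketch below is our illustration, not part of the paper: it approximates the symbolic image of a one-dimensional map on an interval [a, b] covered by n equal cells, and the function name and sampling scheme are ours. Sampling finitely many points can only miss edges, so a rigorous construction would use interval enclosures of f(M(i)) instead.

```python
import numpy as np

def symbolic_image(f, a, b, n_cells, samples=200):
    """Approximate the symbolic image of f on [a, b] for a uniform
    covering by n_cells intervals: an edge i -> j is recorded whenever
    a sample point of cell M(i) is mapped by f into cell M(j)."""
    edges = set()
    h = (b - a) / n_cells
    for i in range(n_cells):
        # sample points of the cell M(i) = [a + i*h, a + (i+1)*h]
        xs = np.linspace(a + i * h, a + (i + 1) * h, samples)
        ys = f(xs)
        # indices of the cells containing the images, clipped to [0, n-1]
        js = np.clip(((ys - a) / h).astype(int), 0, n_cells - 1)
        for j in np.unique(js):
            edges.add((i, int(j)))
    return edges

# the logistic map on [0, 1] with 10 cells
edges = symbolic_image(lambda x: 4 * x * (1 - x), 0.0, 1.0, 10)
```

For the logistic map the cell [0.5, 0.6] is mapped into [0.96, 1.0], so the only edge leaving vertex 5 goes to vertex 9.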

Definition 2. A vertex of a symbolic image G is said to be recurrent if there is a periodic path passing through it. The set of recurrent vertices is denoted by RV. The recurrent vertices i and j are called equivalent if there exists a periodic path passing through i and j.

Thus, the set of recurrent vertices RV is split into equivalence classes {H_k}. In graph theory such classes are called strongly connected components.
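These classes can be computed directly from the edge list. A small sketch (ours, with hypothetical names): a vertex is recurrent iff it can reach itself, two recurrent vertices are equivalent iff each reaches the other, and reachability is obtained by Warshall's transitive closure. For large graphs Tarjan's linear-time algorithm would be preferable to this O(n³) illustration.

```python
from itertools import product

def recurrent_classes(n, edges):
    """Recurrent vertices (those lying on a periodic path) and their
    equivalence classes, i.e. the nontrivial strongly connected components."""
    reach = [[False] * n for _ in range(n)]
    for i, j in edges:
        reach[i][j] = True
    # Warshall's transitive closure: reach[i][j] iff a path i -> j exists
    # (product iterates with k outermost, as the algorithm requires)
    for k, i, j in product(range(n), repeat=3):
        if reach[i][k] and reach[k][j]:
            reach[i][j] = True
    recurrent = [i for i in range(n) if reach[i][i]]
    classes = []
    for i in recurrent:
        cls = frozenset(j for j in recurrent if reach[i][j] and reach[j][i])
        if cls not in classes:
            classes.append(cls)
    return recurrent, classes

# example: two classes {0, 1} and {2, 3}; vertex 4 is nonrecurrent
recurrent, classes = recurrent_classes(
    5, {(0, 1), (1, 0), (1, 2), (2, 3), (3, 2), (2, 4)})
```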

Let

diam M(i) = max{ρ(x, y) : x, y ∈ M(i)}

be the diameter of a cell M(i) and d = diam(C) be the maximum of the diameters of the cells. The number d is called the diameter of the covering C.

A directed graph G is uniquely defined by its adjacency matrix (matrix of admissible transitions) Π. The matrix Π = (π_ij) has size n × n, where n is the number of vertices of G, and π_ij = 1 iff there exists the edge i → j; otherwise π_ij = 0. Hence the i-th row of Π corresponds to the vertex i (cell M(i)), and in place j of this row there is 1 or 0 depending on the existence (or nonexistence) of a nonempty intersection of f(M(i)) and M(j). The matrix of admissible transitions depends on the numbering of the vertices (cells of the covering), so a change of numeration leads to a change of the matrix Π. Note that there exists a numeration transforming the matrix of admissible transitions to a canonical form.

Proposition 1. [1] The vertices of a symbolic image G may be numbered so that the adjacency matrix takes the block upper triangular form

Π = ( Π_1  *   ⋯  *
       0   Π_2 ⋯  *
       ⋯
       0   0   ⋯  Π_s ),

where every diagonal block Π_k corresponds either to an equivalence class H_k of recurrent vertices or to a nonrecurrent vertex, in which case it consists of one zero. Below the diagonal blocks there are only zeroes (a block upper triangular matrix).

2. Entropy

In 1865 R. Clausius [4] introduced entropy, the most important concept in thermodynamics. To explain the irreversibility of macroscopic states, L. Boltzmann in 1872 [3] first introduced the statistical approach in thermodynamics: he proposed to describe a state of a system by using its microstates. The Boltzmann entropy S is the statistical entropy for the equiprobable distribution of a system over P states; it is defined as

S = k log(P),

where k is the Boltzmann constant.

In 1948 C. Shannon [19, 20] introduced the notion of capacity C of an information channel as follows:

C = lim_{T→∞} (log₂ N(T)) / T,

where N(T) is the number of admissible signals for the time T. He also defined the information entropy as

h = −Σ_i p_i log₂ p_i,

where p_i is the probability of the i-th signal (message), i ∈ {1, ..., n}, and n is the number of signals. A. N. Kolmogorov in 1958 [11] introduced entropy in the theory of dynamical systems. Entropy is a fine invariant of a dynamical system; it may be interpreted as a measure of the system's chaoticity. Comprehensive information on entropy in dynamical systems is given in [6, 10]. It turns out that entropy characteristics may be obtained both for a system described analytically and for its phase portraits. The application of such characteristics to digital image analysis is given in [2].
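As a minimal numerical illustration (ours, not part of the paper), Shannon's formula with the usual convention 0·log 0 = 0:

```python
from math import log2

def shannon_entropy(p):
    """Information entropy h = -sum_i p_i log2(p_i) of a probability
    distribution p, with the convention 0 * log 0 = 0."""
    assert abs(sum(p) - 1.0) < 1e-9
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# equiprobable distribution over 8 signals gives the maximal entropy
# log2(8) = 3 bits; a deterministic signal has entropy 0
h_max = shannon_entropy([1 / 8] * 8)
h_min = shannon_entropy([1.0, 0.0])
```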

Motivation. Consider a discrete dynamical system x_{n+1} = f(x_n) on a compact manifold M, where f: M → M is a homeomorphism. Let C = {M(1), ..., M(n)} be a finite covering of M and the sequence {x_k = f^k(x), k = 0, ..., N−1} be the N-length part of the trajectory of a point x. The covering C generates a coding of this part via a finite sequence ξ(x) = {i_k, k = 0, ..., N−1}, where x_k ∈ M(i_k). In other words, i_k is the index of the cell from C which contains the point x_k = f^k(x). Generally speaking, the mapping x_k → i_k is multivalued. The sequence ξ = {i_k} is said to be an (admissible) encoding of the trajectory {x_k = f^k(x)} with respect to the covering C. Assume that we know all admissible N-length encodings, and there is a need to predict subsequent p-length continuations, i.e. to find admissible encodings of length N + p.

Let the number of admissible encodings K(N) grow exponentially with N. We estimate the rate of growth of the encodings by the number

h = lim_{N→∞} (log_b K(N)) / N,   (2.1)

where b may be any real number greater than 1. The bases b = 2 (following Shannon) and b = e are in common use. The existence of the limit in (2.1) follows from the Polya lemma [1, 13].

Lemma 1 ([13], p. 103). If a sequence of non-negative numbers a_n satisfies the inequality

a_{n+m} ≤ a_n + a_m,

then there exists lim_{n→∞} a_n / n.

For the number of admissible encodings we have

K(n + m) ≤ K(n) K(m),

hence for the sequence a_n = log_b K(n) there exists the limit (2.1). Thus, for the number of encodings K(N) we obtain the estimate

K(N) ≈ B b^{hN},

where B is a constant. If h ≠ 0 then

K(N + p) / K(N) ≈ b^{hp}.

This relation means that for any N the uncertainty of future encodings grows with the exponent hp regardless of the knowledge of previous encodings.

If the growth of the number of different encodings is not exponential (i.e. h = 0), for example

K(N) ≈ B N^A,

where A is a positive number (possibly large), then

K(N + p) / K(N) ≈ (1 + p/N)^A → 1

when N → ∞. In other words, the uncertainty of the future decreases as the length N of the known encodings increases.

Thus, if the growth of the number of different encodings is exponential, the uncertainty does not depend on N; otherwise it decreases as N increases. The value h may be interpreted as a characteristic of the uncertainty (chaoticity) of the dynamics of the system considered.

Topological entropy. Let f be a continuous mapping defined on a manifold M and C = {M(1), ..., M(n)} be an open covering of M. For a positive integer N consider a sequence

ω = ω₁ω₂···ω_N,

where each ω_k is a number from 1 to n. Construct the intersection of the form

M(ω) = M(ω₁) ∩ f⁻¹(M(ω₂)) ∩ ··· ∩ f^{−N+1}(M(ω_N)),   (2.2)

which is an open set. An admissible encoding ω corresponds to a nonempty intersection M(ω), i.e. there exists x ∈ M(ω₁) such that f^k(x) ∈ M(ω_{k+1}). The sequence ω codes the segment of the trajectory {f^k(x), k = 1, 2, ..., N}. Consider all the admissible N-length encodings {ω} and the collection of sets C^N = {M(ω)}, which is an open covering. Choose in C^N a finite subcovering C̃^N minimal by the number of elements (denoted further by |C̃^N|).

Then according to the Polya lemma there exists the limit

H(C) = lim_{N→∞} (ln |C̃^N|) / N.   (2.3)

Definition 3. The number

h(f) = sup_C H(C),

where the supremum is taken over all open coverings C, is called the topological entropy of the mapping f: M → M.

It is easy to see that there is little point in using this definition for practical calculation of the entropy, so we consider some other methods.

A covering C₂ is said to be refined in a covering C₁ if any A ∈ C₂ lies in a set B ∈ C₁. A sequence of open coverings C_n is called exhaustive if for any open covering C there exists a number n* such that the covering C_n is refined in C for n > n*.

Proposition 2 ([1] p. 122). (1) If C_n is a sequence of open coverings with diameters

d_n = max_{A ∈ C_n} diam A

tending to zero, then C_n is an exhaustive sequence. (2) The entropy of the mapping f is calculated as follows:

h(f) = lim_{n→∞} H(C_n).

Consider coverings C₁ and C₂, and construct for each of them the nonempty intersections of the form (2.2). Denote the obtained collections of sets by C₁^N and C₂^N respectively. In each collection choose the minimal (by the number of elements) subcovering, and denote them C̃_{N1} and C̃_{N2}.

Proposition 3. If C₂ is refined in C₁, then

|C̃_{N1}| ≤ |C̃_{N2}|,

where |C̃_{Ni}| denotes the number of elements in the set considered.

Proof. If A₁ ⊂ B₁ and A₂ ⊂ B₂, then A₁ ∩ f⁻¹(A₂) lies in B₁ ∩ f⁻¹(B₂). In the same way one can prove that the elements of C₂^N lie in corresponding elements of C₁^N. Consider C̃_{N1} and C̃_{N2}. Take from C₁^N all the elements which contain corresponding elements of C̃_{N2}; they form a covering Ĉ_{N1}.

Then |C̃_{N1}| ≤ |Ĉ_{N1}| ≤ |C̃_{N2}|, because C̃_{N1} is a minimal subcovering of C₁^N. The proposition is proved.

Corollary 1. Assume that C₂ is refined in C₁, and the numbers H(C₁), H(C₂) are calculated by (2.3). Then

H(C₁) ≤ H(C₂).

3. Entropy of a symbolic image

Let G be a graph with the adjacency matrix Π. Denote by b_n the number of admissible n-length paths on G.

Definition 4. The number

h(G) = lim_{n→∞} (ln b_n) / n

is called the entropy of the graph G.

Recall that if Π^n is the n-th power of Π, then the element (Π^n)_ij equals the number of admissible n-length paths from i to j, and

b_n = Σ_ij (Π^n)_ij.
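This identity gives a direct way to count paths. A small sketch (ours) for the two-vertex graph with edges 1→1, 1→2, 2→1 (the "golden mean" graph), where b_n turns out to be a Fibonacci number:

```python
import numpy as np

# adjacency matrix of the graph with edges 1->1, 1->2 and 2->1
Pi = np.array([[1, 1],
               [1, 0]], dtype=np.int64)

def b(n):
    """b_n = sum_ij (Pi^n)_ij, the number of admissible n-length paths."""
    return int(np.linalg.matrix_power(Pi, n).sum())
```

Here b(1) = 3 (the three edges), b(2) = 5, b(3) = 8, and in general b_n satisfies the Fibonacci recurrence b_n = b_{n−1} + b_{n−2}.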

Subdivision process. We will apply a process of adaptive subdivision to coverings and construct a sequence of symbolic images. First let us consider the main step of the process, the subdivision of a covering. Let C = {M(i)} be a covering of M and G be the symbolic image with respect to C. Suppose a new covering NC is a subdivision of C. It is convenient to designate the cells of the new covering as m(i, k). This means that each cell M(i) is subdivided into cells m(i, k), k = 1, 2, ..., which form a subdivision of the cell M(i), i.e.,

∪_k m(i, k) = M(i).

Denote by NG the symbolic image for the new covering NC. The vertices of the new symbolic image are denoted by (i, k). It is possible that some cells will not be subdivided, i.e. m(i, 1) = M(i), and the vertex i in G is the vertex (i, 1) in NG. The subdivision just described generates a natural mapping s from NG onto G which takes the vertices (i, k) onto the vertex i. From f(m(i, k)) ∩ m(j, l) ≠ ∅ it follows that f(M(i)) ∩ M(j) ≠ ∅, so the edge (i, k) → (j, l) is mapped onto the edge i → j. Hence, the mapping s takes the directed graph NG onto the directed graph G.

Theorem 1. Let C_n be a sequence of subdivisions of a closed covering, such that the cells are polyhedrons intersecting on boundary disks, and the diameters d_n of C_n tend to zero. Denote by G_n the symbolic images constructed for a mapping f: M → M in accordance with the sequence C_n. Then for the entropy of f the following inequality holds:

h(f) ≤ lim_{n→∞} h(G_n).

Proof. Let C_n = {M(i)} be a closed covering from the sequence described above, and G_n be the symbolic image of the mapping f constructed according to C_n. Consider the set P_N of encodings of N-length segments of trajectories. It is obvious that |P_N| is not greater than the number of admissible N-length paths (denoted by b_N) on G_n.

Hence the number |C^N| of nonempty intersections of the form

M(ω) = M(ω₁) ∩ f⁻¹(M(ω₂)) ∩ ··· ∩ f^{−N+1}(M(ω_N))

is not greater than b_N. Thus,

H(C_n) ≤ h(G_n).

If C is an open covering, then there exists a number n* such that the covering C_n is refined in C for n > n*. Then in accordance with Proposition 3 we have

H(C) ≤ H(C_n) ≤ h(G_n).

Now consider an exhaustive sequence of open coverings {C_m}. Let n(m) be the number n* constructed for the covering C_m; then

H(C_m) ≤ H(C_n) ≤ h(G_n), where n > n(m). If m → ∞, then according to Proposition 2 we have

h(f) = lim_{m→∞} H(C_m) ≤ lim_{n→∞} h(G_n).

Recall that a matrix A (n × n) is called decomposable if it admits an invariant coordinate subspace of dimension less than n, and a matrix A is called nonnegative (positive) if it has nonnegative (positive) elements. If for a nonnegative matrix A there is an integer s > 0 such that all the elements of A^s are positive, then A is called primitive. In particular, the matrix of admissible transitions Π is nonnegative. It is nondecomposable if the symbolic image consists of one class of equivalent recurrent vertices.

Theorem 2. (Perron-Frobenius) [7, 13]

• If A is a nondecomposable nonnegative matrix, then it has an eigenvector e with positive coordinates and the corresponding eigenvalue λ of multiplicity 1, and every other eigenvalue μ satisfies the inequality |μ| ≤ λ.

• If A is a nondecomposable nonnegative matrix and |μ| < λ for every other eigenvalue μ, then A is primitive.

• If A is a nondecomposable nonnegative matrix and it has h > 1 eigenvalues which are equal in modulus to λ, then A is not primitive, and by an agreed renumeration of rows and columns it may be transformed to the form

( 0       A₁₂  0    ⋯  0
  0       0    A₂₃  ⋯  0
  ⋯
  0       0    0    ⋯  A_{h−1,h}
  A_{h1}  0    0    ⋯  0 ),

where the A_ij are square blocks, and A^h consists of h primitive diagonal blocks.

Theorem 3. The entropy of the graph G is equal to the logarithm of the maximal eigenvalue λ of the adjacency matrix:

h(G) = ln λ.
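Theorem 3 is easy to check numerically. In the sketch below (ours) we take the golden-mean graph again; its maximal eigenvalue is the golden ratio (1 + √5)/2, and the quantity (ln b_n)/n from Definition 4 approaches ln λ as n grows:

```python
import numpy as np

Pi = np.array([[1.0, 1.0],
               [1.0, 0.0]])           # edges 1->1, 1->2, 2->1

lam = max(abs(np.linalg.eigvals(Pi)))  # Perron root, here the golden ratio
h_G = np.log(lam)                      # entropy of the graph, h(G) = ln lambda

n = 40
b_n = np.linalg.matrix_power(Pi, n).sum()   # number of n-length paths
h_approx = np.log(b_n) / n                  # the limit in Definition 4
```

The convergence is slow, of order (ln B)/n for the constant B in the estimate (c/d) λ^n ≤ b_n ≤ (dr/c) λ^n, so at n = 40 the two values agree only to a couple of decimal places.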

Proof. 1. Consider the case when G consists of one class of equivalent recurrent vertices. Let e be a positive eigenvector for the maximal eigenvalue λ, i.e.

Πe = λe.

In coordinate form we have

Σ_j (Π^n)_ij e_j = λ^n e_i.   (3.1)

Let c = min_i{e_i} and d = max_i{e_i}. In accordance with the Perron-Frobenius theorem c > 0. Then the following inequalities hold:

c Σ_j (Π^n)_ij ≤ Σ_j (Π^n)_ij e_j = λ^n e_i ≤ d λ^n.

It follows that

Σ_j (Π^n)_ij ≤ (d/c) λ^n

for any i. Summing over i we obtain

b_n = Σ_ij (Π^n)_ij ≤ (dr/c) λ^n,

where r is the number of rows of the matrix Π. It follows from (3.1) that

c λ^n ≤ λ^n e_i = Σ_j (Π^n)_ij e_j ≤ d Σ_j (Π^n)_ij ≤ d Σ_ij (Π^n)_ij.

Hence we have the estimate

(c/d) λ^n ≤ Σ_ij (Π^n)_ij = b_n

and the inequalities

(c/d) λ^n ≤ b_n ≤ (dr/c) λ^n.

Taking logarithms, dividing by n and passing to the limit yields the required equality in this case.

2. Consider the case when there are several classes of equivalent recurrent vertices, i.e. the matrix Π is decomposable. According to Proposition 1, by a renumbering of vertices this matrix may be transformed to the block upper triangular form

Π = ( Π_1  *   ⋯  *
       0   Π_2 ⋯  *
       ⋯
       0   0   ⋯  Π_s ),

where each diagonal block Π_k corresponds either to one of the classes of equivalent recurrent vertices H_k or to some non-recurrent vertex, in which case it consists of one zero. Below the diagonal blocks there are only zeroes. Each class H_k has the entropy

h(H_k) = ln λ_k,

where λ_k is the maximal eigenvalue of Π_k. By definition the entropy of a symbolic image G equals

h(G) = lim_{n→∞} (ln b_n) / n.

Consider an admissible n-length path ω. Assume that on G there are s equivalence classes H_k. The path ω passes both through vertices from the classes H_k and through non-recurrent vertices not belonging to these classes. Denote by ω_k the parts of ω which lie in the class H_k. If we delete from ω all the ω_k, there remain only paths θ_l passing through non-recurrent vertices. Thus, ω is composed of the paths ω_k and θ_l. Combine all the paths θ_l into a sequence θ, which is not necessarily an admissible path. Let K be the number of non-recurrent vertices in G. The sequence θ contains non-recurrent vertices without repetition. Hence the number of the sequences θ is not greater than the number of permutations of K elements, i.e. K!.

Denote by n(k) the length of the path ω_k from H_k. Then n(1) + n(2) + ··· + n(s) ≤ n. According to item 1, for every class H_k there is a number d such that the number b_k(n(k)) of different n(k)-length paths ω_k is estimated as follows:

b_k(n(k)) ≤ d λ_k^{n(k)} ≤ d λ^{n(k)},

where λ = max_k λ_k. Then for the number of different collections of paths ω_k lying in ω we have the estimate

Π_k b_k(n(k)) ≤ d^s λ^{n(1)+n(2)+···+n(s)} ≤ d^s λ^n.

Summing the above results we have

b_n ≤ K! d^s λ^n.

Thus, we obtain an upper estimate for the entropy of G:

h(G) ≤ lim_{n→∞} (1/n) ln(K! d^s λ^n) = ln λ.

Let us prove the opposite inequality. Note that the number of admissible paths on G is not less than the number of admissible paths in a class H_k. Then h(G) ≥ h(H_k) = ln λ_k for any k, which gives the lower estimate

h(G) ≥ ln λ.

Hence we have

h(G) = ln λ,

and the proof is completed.

4. Flows on a symbolic image

Let f: M → M be a homeomorphism of a compact manifold M. A measure μ defined on M is said to be f-invariant if for any measurable set A ⊂ M the equality

μ(f⁻¹(A)) = μ(A) = μ(f(A))

holds. In what follows we assume that all measures considered are Borel ones. The Krylov-Bogoliubov theorem [12, 10] guarantees the existence of an invariant measure μ which is normed on M: μ(M) = 1. Denote by M(f) the set of all f-invariant normed measures. This set is a convex closed compact in the weak topology (see [14], p. 511). The convergence μ_n → μ in this topology means that

∫_M φ dμ_n → ∫_M φ dμ

for any continuous function φ: M → R.

To understand how a distribution of a measure may appear on a symbolic image, consider the following observation. Assume that there exists an f-invariant normed measure μ on M, and the cells of a covering C are polyhedrons intersecting by boundary disks. Construct a measurable partition C* = {M*(i)} such that each boundary disk belongs to one of the adjoining cells. Then to every edge i → j of a symbolic image G we can assign the measure

m_ij = μ(M*(i) ∩ f⁻¹(M*(j))) = μ(f(M*(i)) ∩ M*(j)),   (4.1)

where the last equality follows from the invariance of μ. Besides that, the invariance of μ leads to the equalities

Σ_k m_ki = Σ_k μ(f(M*(k)) ∩ M*(i)) = μ(M*(i)) = Σ_j μ(M*(i) ∩ f⁻¹(M*(j))) = Σ_j m_ij.

The value Σ_k m_ki is called the flow incoming into the vertex i, and Σ_j m_ij is called the flow outcoming from i. The equality

Σ_k m_ki = Σ_j m_ij   (4.2)

may be interpreted as Kirchhoff's law: for any vertex the incoming flow equals the outcoming one. Furthermore, we have

Σ_ij m_ij = μ(M) = 1.   (4.3)

It means that the distribution m_ij is normed (probabilistic). Thus, an f-invariant measure μ generates on a symbolic image a distribution m_ij which satisfies the conditions (4.2) and (4.3). The above reasoning leads to the following definition.

Definition 5. Let G be a directed graph. A distribution {m_ij} on the edges {i → j} such that

• m_ij ≥ 0;

• Σ_ij m_ij = 1;

• for any vertex i

Σ_k m_ki = Σ_j m_ij

is called a flow on G.

The last property may be called the invariance of a flow. The norming condition may be written as m(G) = 1, where the measure of G is the sum of the measures of all edges. Sometimes in graph theory the term "closed flow" is used for such a distribution.
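Definition 5 is straightforward to verify mechanically. A sketch (ours, with hypothetical names), representing a distribution on edges as a dictionary {(i, j): m_ij}:

```python
def is_flow(m, tol=1e-12):
    """Check Definition 5 for an edge distribution m given as a dict
    {(i, j): m_ij}: nonnegativity, total mass 1, and Kirchhoff's law."""
    if any(v < 0 for v in m.values()):
        return False
    if abs(sum(m.values()) - 1.0) > tol:
        return False
    vertices = {i for i, _ in m} | {j for _, j in m}
    for v in vertices:
        outgoing = sum(w for (i, j), w in m.items() if i == v)
        incoming = sum(w for (i, j), w in m.items() if j == v)
        if abs(outgoing - incoming) > tol:
            return False
    return True

# uniform flow on the cycle 0 -> 1 -> 2 -> 0
assert is_flow({(0, 1): 1 / 3, (1, 2): 1 / 3, (2, 0): 1 / 3})
```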

For the flow {m_ij} on G we may define the measure of a vertex i as

m_i = Σ_k m_ki = Σ_j m_ij.

Then Σ_i m_i = m(G) = 1.

Thus, an f-invariant measure generates a flow on a symbolic image. Now we consider the inverse construction. Let a flow m = {m_ij} be given on a symbolic image G; then we can construct a measure μ on M as follows:

μ(A) = Σ_i m_i ν(A ∩ M(i)) / ν(M(i)),   (4.4)

where A is a measurable set. Here ν is the Lebesgue measure normed on M, and it is assumed that ν(M(i)) ≠ 0. In this case the measure of a cell M(i) coincides with the measure of the vertex i: μ(M(i)) = m_i. As ν is the Lebesgue measure, the measure of the boundary disks is equal to zero and the measure of a cell does not depend on the measure of its boundary. In general, the constructed measure μ is not f-invariant. But it is an approximation to an invariant measure in the sense that μ converges in the weak topology to an invariant measure as the diameter of the covering tends to zero.
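Formula (4.4) can be sketched in one dimension (our illustration, with hypothetical names): the constructed measure is proportional to the Lebesgue measure inside each cell, weighted by the vertex measures m_i.

```python
def measure_from_flow(m_vertex, cells, interval):
    """Formula (4.4): mu(A) = sum_i m_i * nu(A ∩ M(i)) / nu(M(i)) for an
    interval A = (a, b), cells M(i) = (l_i, r_i) covering [0, 1] on the
    line, and nu the length (already normed on M = [0, 1])."""
    a, b = interval
    mu = 0.0
    for m_i, (l, r) in zip(m_vertex, cells):
        overlap = max(0.0, min(b, r) - max(a, l))  # nu(A ∩ M(i))
        mu += m_i * overlap / (r - l)
    return mu

cells = [(0.0, 0.5), (0.5, 1.0)]
# vertex measures m = (0.25, 0.75) coming from some flow:
# the whole space gets measure 1, the cell M(0) gets measure m_0
total = measure_from_flow([0.25, 0.75], cells, (0.0, 1.0))
```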

Theorem 4. [17] Let a sequence of flows {m_t} be defined on a sequence of symbolic images {G_t} of a homeomorphism f, and let the maximal diameter d_t of the coverings tend to zero as t → ∞. Then

• there exists a subsequence of measures μ_{t_k} (constructed according to (4.4)) which converges in the weak topology to an f-invariant measure μ;

• if a subsequence of measures μ_{t_l} converges in the weak topology to a measure μ*, then μ* is f-invariant.

Theorem 5. [17] For any neighborhood (in the weak topology) U of the set M(f) there is a positive number d₀ such that for any covering C with diameter d < d₀ and any flow m on a symbolic image G with respect to C, the measure μ constructed by m according to (4.4) lies in U.

5. Metric entropy

Let μ be a normed invariant measure of a homeomorphism f: M → M and C = {M₁, M₂, ..., M_m} a measurable partition of the manifold M.

Definition 6. The entropy of the partition C is defined as

H(C) = −Σ_i μ(M_i) ln μ(M_i).

Construct a covering C^N which consists of the nonempty intersections of the form

M_{i₁} ∩ f⁻¹(M_{i₂}) ∩ ··· ∩ f^{−N+1}(M_{i_N}).

If such an intersection is nonempty, then the sequence i₁, i₂, ..., i_N is an admissible coding with respect to the covering C.

The metric entropy of f for the covering C is defined as

H(f, C) = lim_{N→∞} H(C^N) / N.

The existence of the limit follows from the Polya lemma.

Definition 7. The entropy of f for an invariant measure μ is defined as

h(f, μ) = sup_C H(f, C),

where the sup is taken over all measurable finite partitions.

The connection between topological and metric entropy is given by the following theorem.

Theorem 6. [5, 8] The topological entropy of a homeomorphism f is the least upper bound of the metric entropies:

h(f) = sup_μ h(f, μ).

6. Stochastic Markov chains

A stochastic Markov chain [7, 13] is defined by a set of states of a system {i = 1, 2, ..., n} and a matrix of transition probabilities P_ij from a state i to a state j. Such a matrix is called stochastic if it satisfies the conditions P_ij ≥ 0 and Σ_j P_ij = 1 for every i. A probability distribution p = (p₁, p₂, ..., p_n), Σ_i p_i = 1, is said to be stationary if

(p₁, p₂, ..., p_n) ( P₁₁   P₁₂   ⋯  P_{1n}
                     P₂₁   P₂₂   ⋯  P_{2n}
                     ⋯
                     P_{n1} P_{n2} ⋯  P_{nn} ) = (p₁, p₂, ..., p_n),

i.e. p is a left eigenvector of P.

We show that there is a one-to-one correspondence between a Markov chain and a flow on a graph in which the vertices correspond to the states with positive measure. Let m = {m_ij} be a flow on a graph G. The measure of a vertex i equals m_i = Σ_j m_ij = Σ_k m_ki. If m_i ≠ 0 then the vertex i is necessarily recurrent. It is easy to verify that any flow m = {m_ij} on G generates a stochastic Markov chain in which the states are the vertices with nonzero measures, and the transition probabilities from i to j are calculated as

P_ij = m_ij / m_i.

In this case the stochastic matrix P = (m_ij/m_i) has a stationary distribution coinciding with the distribution of the measure m over the vertices, (m₁, m₂, ..., m_n). This follows from the equality

(m₁, m₂, ..., m_n) ( m₁₁/m₁    m₁₂/m₁    ⋯  m_{1n}/m₁
                     m₂₁/m₂    m₂₂/m₂    ⋯  m_{2n}/m₂
                     ⋯
                     m_{n1}/m_n m_{n2}/m_n ⋯  m_{nn}/m_n ) = (m₁, m₂, ..., m_n).

Thus, any flow m = {m_ij} on a graph G generates a stochastic Markov chain for which the distribution of the measure (m_i) on the vertices is stationary.

Now we prove the inverse fact: for any stochastic matrix P = (P_ij) and its stationary distribution p = (p_i) there exists a flow m = {m_ij} on a graph G for which the distribution of the measure on the vertices coincides with the stationary distribution, i.e. m_i = p_i.

Actually, let P be a stochastic matrix and pP = p. Consider a graph G which has n vertices {i}, where the edge i → j exists iff P_ij > 0. Construct the distribution on the edges m_ij = P_ij p_i and show that this distribution is a flow on G. As P is stochastic, Σ_j P_ij = 1 for any i. Hence

Σ_j m_ij = Σ_j P_ij p_i = p_i Σ_j P_ij = p_i.

As pP = p, we have Σ_k p_k P_ki = p_i, so

Σ_k m_ki = Σ_k p_k P_ki = p_i = Σ_j m_ij,

i.e. for the distribution m_ij the Kirchhoff law holds. Moreover,

Σ_ij m_ij = Σ_i p_i = 1.

From the above it follows that the construction of a flow on a graph results in obtaining a stochastic Markov chain, and vice versa.
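The correspondence is easy to check numerically. In the sketch below (ours) we take a 2-state stochastic matrix and its stationary vector, and verify that m_ij = P_ij p_i satisfies Kirchhoff's law and is normed:

```python
import numpy as np

# a stochastic matrix P and its stationary distribution p (pP = p)
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])
p = np.array([2 / 3, 1 / 3])

m = p[:, None] * P        # flow on edges: m_ij = P_ij * p_i

outgoing = m.sum(axis=1)  # sum_j m_ij, outcoming flow of each vertex
incoming = m.sum(axis=0)  # sum_k m_ki, incoming flow of each vertex
```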

7. Flow entropy

The developed technique may be applied to estimate the metric entropy. Let a symbolic image G and a flow m = {m_ij} be constructed for a mapping f and a covering C. Any flow m may be considered as an approximation to an invariant measure μ if the diameter of C is small enough. The flow m on G generates a Markov chain in which the states coincide with the vertices of G, and the transition probabilities are defined as

p_ij = m_ij / m_i.

hm = - Y mi Y pij ln pij ■

ij

Substituting pij = mij/mi we obtain

hm = - V mi V — ln( = - V mij ( — ) =

mi mi mi

i j ij

— m^ ln m^ + ^^ mj ln mi = — ^^ mj ln mj + ^^ mi ln m^

ij ij ij i

By this means the entropy can be calculated by the flow m_ij as

h_m = −Σ_ij m_ij ln m_ij + Σ_i m_i ln m_i.   (7.1)

The last equality allows estimating the entropy of f for the invariant measure μ which the flow m approximates.
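Formula (7.1) is immediate to implement. In this sketch (ours) the edge measures form a matrix, the vertex measures m_i are recovered as row sums, and the convention 0 ln 0 = 0 is applied:

```python
import numpy as np

def flow_entropy(m):
    """Formula (7.1): h_m = -sum_ij m_ij ln m_ij + sum_i m_i ln m_i,
    with the convention 0 ln 0 = 0; m is the matrix of edge measures."""
    m = np.asarray(m, dtype=float)
    m_i = m.sum(axis=1)                        # vertex measures
    t1 = -np.sum(m[m > 0] * np.log(m[m > 0]))
    t2 = np.sum(m_i[m_i > 0] * np.log(m_i[m_i > 0]))
    return t1 + t2

# flow with m_11 = m_12 = m_21 = 1/3, m_22 = 0; its Markov chain has
# transition rows (1/2, 1/2) and (1, 0) and stationary vector (2/3, 1/3),
# so h_m equals (2/3) ln 2
h = flow_entropy([[1 / 3, 1 / 3], [1 / 3, 0.0]])
```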

8. Flow with maximal entropy

Let Π be the matrix of admissible transitions of a graph G. Our objective is to construct the flow which has maximal entropy among all the flows on G. As any flow is concentrated on a component of recurrent vertices, we may assume that G consists of one such component.

Theorem 7. There is a flow m on G such that

h_m = h(G) = ln λ.

Proof. 1. The eigenvalues of any real matrix A = (a_ij) coincide with the eigenvalues of the transposed (conjugate) matrix A*. Really, as det A = det A*, we have

det(A − λE) = det((A − λE)*) = det(A* − λE).

Hence to an eigenvalue λ of A corresponds the conjugate eigenvalue λ̄ of A*. The roots of a real characteristic polynomial are either real or complex-conjugate, hence the eigenvalues of the matrices A and A* coincide.

2. Let A be the matrix of admissible transitions of a graph G and λ be the maximal eigenvalue from the Perron-Frobenius theorem. Then for A there exists a left eigenvector e with nonnegative coordinates e_i, Σ_i e_i = 1, such that

eA = λe,  A*e = λe.

Hence for every i we have

Σ_j a_ji e_j = λ e_i,   (8.1)

which leads to the equality

Σ_j a_ji e_j / (λ e_i) = 1

for every i. Hence the matrix of the form

P = (p_ij),  p_ij = a_ji e_j / (λ e_i),

is a stochastic matrix for which the vector e is a stationary distribution:

eP = e.

The distribution on the edges i → j defined by

m_ij = p_ij e_i = a_ji e_j / λ

is a flow m on the graph G such that the measure m_i of the vertex i equals e_i. The entropy of m is calculated by the formula

h_m = −Σ_ij m_ij ln m_ij + Σ_i m_i ln m_i.

Hence

h_m = −Σ_ij (a_ji e_j / λ) ln(a_ji e_j / λ) + Σ_i e_i ln e_i.

Here we assume that 0 ln 0 = 0, which means that the sums are taken over those i, j for which a_ji = 1. Thus we obtain

h_m = −Σ_ij m_ij (ln a_ji + ln e_j − ln λ) + Σ_i e_i ln e_i =
    = ln λ Σ_ij m_ij − Σ_j (Σ_i m_ij) ln e_j + Σ_i e_i ln e_i =
    = ln λ − Σ_j e_j ln e_j + Σ_i e_i ln e_i = ln λ,

where we used that m is a flow: Σ_ij m_ij = 1 and the flow incoming into a vertex j equals Σ_i m_ij = m_j = e_j.
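The construction of the proof can be replayed numerically. In the sketch below (ours) the adjacency matrix of the golden-mean graph happens to be symmetric, so its left and right Perron eigenvectors coincide; under this assumption the stationary vector of the transition matrix p_ij = a_ij e_j/(λ e_i) is proportional to e_i², and the resulting flow attains h_m = ln λ:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])          # golden-mean graph; symmetric here

w, V = np.linalg.eig(A)
k = int(np.argmax(w.real))
lam = w[k].real                     # maximal (Perron) eigenvalue
e = np.abs(V[:, k].real)            # eigenvector with positive coordinates

P = A * e[None, :] / (lam * e[:, None])   # p_ij = a_ij e_j / (lam e_i)
pi = e ** 2 / np.sum(e ** 2)              # stationary vector of P (A symmetric)
m = pi[:, None] * P                       # maximal-entropy flow m_ij

# formula (7.1) with the convention 0 ln 0 = 0
h_m = -np.sum(m[m > 0] * np.log(m[m > 0])) + np.sum(pi * np.log(pi))
```

For a nonsymmetric adjacency matrix the general recipe (the Shannon-Parry measure) takes m_ij = u_i a_ij v_j / λ with u, v the left and right Perron eigenvectors normalized by Σ_i u_i v_i = 1.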

References

1. Alekseev, V. Symbolic dynamics, 11th Mathematical School, Institut matematiki AN SSSR, Kiev, 1976. (in Russian)

2. Ampilova, N., Soloviev, I. Entropies in investigation of dynamical systems and their application to digital image analysis. Journal of Measurements in Engineering, 6, No. 2, 107-118 (2018). https://doi.org/10.21595/jme.2018.19891

3. Boltzmann, L. Sitzungsber. Kaiserl. Akad. Wiss. 66 (2) 275 (1872).

4. Clausius, R. Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Annalen der Physik und Chemie 125, No.7, 353-400 (1865).

5. Dinaburg, E. I. On the relations among various entropy characteristics of dynamical systems. Math. USSR, Izv. 5(1971), 337-378 (1972).

6. Downarowicz, T. Entropy in dynamical systems. Cambridge: CUP, 2011.

7. Gantmacher, F. R. The theory of matrices. Vol. 1, 2. Transl. from the Russian by K. A. Hirsch. Reprint of the 1959 translation. Providence, RI: AMS Chelsea Publishing, 1998.

8. Goodman, T. Relating topological entropy and measure theoretic entropy. Bulletin of London Math. Soc. 3, No. 2, 176-180 (1971).

9. Hsu, C. S. Cell-to-Cell Mapping, N.Y.: Springer-Verlag, 1987.

10. Katok, A., Hasselblatt, B. Introduction to the Modern Theory of Dynamical Systems, Cambridge: Cambridge University Press, 1995.

11. Kolmogorov, A. New metric invariants of transitive dynamical systems and automorphisms of the Lebesgue spaces, Doklady AN SSSR, 119, No. 5, 861-864 (1958).

12. Kryloff, N., Bogoliouboff, N. La théorie générale de la mesure dans son application à l'étude des systèmes dynamiques de la mécanique non linéaire, Ann. Math., 38, No. 1, 65-113 (1937).

13. Lind, D., Marcus, B. An introduction to symbolic dynamics and coding. Cambridge: Cambridge University Press, 1995.

14. Nemytskii, V. V. and Stepanov, V. V. Qualitative Theory of Dynamical Systems, Princeton University Press, 1960.

15. Osipenko, G. On symbolic image of a dynamical system. Kraevie zadachi (Sb. trudov), Perm (pp. 101-105) 1983. (in Russian)

16. Osipenko, G. Dynamical systems, Graphs, and Algorithms. Lectures Notes in Mathematics, 1889, Berlin: Springer, 2007.

17. Osipenko, G. Symbolic images and invariant measures of dynamical systems. Ergodic Theory and Dynamical Systems 30, 1217-1237 (2010).

18. Prasolov, V. V. Elements of Combinatorial and Differential Topology. Graduate Studies in Mathematics, Vol. 74, 2006.

19. Shannon, C. A Mathematical Theory of Communication. Bell System Technical Journal 27, No.3, 379-423 (July 1948). doi:10.1002/j.1538-7305.1948.tb01338.x. hdl:11858/00-001M-0000-002C-4314-2.

20. Shannon, C. A Mathematical Theory of Communication. Bell System Technical Journal. 27, No.4, 623-666 (October 1948). doi:10.1002/j.1538-7305.1948.tb00917.x. hdl:11858/00-001M-0000-002C-4314-2.

Received 02.06.2019
