
CONTROLLED SYSTEMS AND OPTIMIZATION METHODS

UDC 519.853

DOI: 10.18101/2304-5728-2018-4-72-83

D.C. PROGRAMMING APPROACH TO MALFATTI'S PROBLEM

© Enkhbat Rentsen

Ph.D., Prof.,

Business School of National University of Mongolia

46A/523, Ikh Surguuliin gudamj-1, Ulaanbaatar 210646, Mongolia

E-mail: enkhbat46@yahoo.com

© Mariya V. Barkova

Researcher,

Matrosov Institute for System Dynamics and Control Theory SB RAS
134 Lermontova St., Irkutsk 664033, Russia
E-mail: mariabarkovamail@gmail.com

© Batbileg Sukhee

Researcher,

School of Applied Science and Engineering of National University of Mongolia

46A/523, Ikh Surguuliin gudamj-1, Ulaanbaatar 210646, Mongolia
E-mail: batbileg@seas.num.edu.mn

In previous works R. Enkhbat showed that Malfatti's problem can be treated as a convex maximization problem and provided an algorithm based on the global optimality conditions of A. S. Strekalovsky. In this article we reformulate Malfatti's problem as a D.C. programming problem with a nonconvex constraint. The reduced problem, as an optimization problem with D.C. constraints, belongs to the class of global optimization problems. We apply the local and global optimality conditions developed by A. S. Strekalovsky for D.C. programming. Based on local search methods for D.C. programming, we have developed an algorithm for the numerical solution of Malfatti's problem. In numerical experiments, the initial points of the proposed algorithm are chosen randomly. Global solutions have been found in all cases.

Keywords: D.C. programming; global optimality conditions; Malfatti's problem; convex maximization; local search algorithm; D.C. constraint; global optimization; Malfatti circles; linearized problem; D.C. minimization.

Introduction

In 1803, Gian Francesco Malfatti (1737-1807) of the University of Ferrara posed the problem of determining three circular columns of marble of possibly different sizes which, when carved out of a right triangular prism, would have the largest possible total cross section [16]. This is equivalent to finding the maximum total area of three circles which can be packed inside a triangle of any shape without overlapping. Malfatti gave the solution as three circles (the Malfatti circles) tangent to each other and to two sides of the triangle.

In [14], it was shown that the Malfatti circles are not optimal. The most common methods used for finding the best solutions to Malfatti's problem were algebraic and geometric approaches [1, 13, 11]. In 1994, Zalgaller and Los [27, 15] proved that the greedy arrangement solves Malfatti's problem. Melissen conjectured in [17] that the greedy arrangement has the largest total area among $n$ ($n \ge 4$) non-overlapping circles in a triangle.

In papers [8] and [9], Malfatti's problem has been examined from the viewpoint of global optimization theory and algorithms. In particular, the problem was treated as a convex maximization problem, and an algorithm based on the global optimality conditions given in [23] was applied to solving it. In this paper we deal with Malfatti's problem as first formulated in [16], reducing it to D.C. programming.

1. Preliminaries

We introduce the following sets. A triangle set is given by

D = {x g R2 | {a1,x) < bf, a' g R2, 6,. e R, i = U}, and denoted by Bi circle with a center c' g R2 and a radius r g R Bi=B(ci,ri) = {xGR21 ||x-c' ||< rf},i = U. Theorem 1. [8] Bi cz I) if and only if

(a',c') + rt || ||< i = 1^3. (1)

Proof. Necessity. Let y g B(cl ,rt) and y gD. The point y g B(c\r) can be easily presented as y = c' +/;/?, h gR" , p|| < 1. It follows from the condition v e I) that (a',y)<bj,i = 1,3, or, equivalently, (a',c') + rt(a',h) < bt, / = 1,3 V/? g R3. Hence, we have

(d, c') + r max (a', h) < b,, i = 1,3.

II*II<1

Sufficiency. Let the condition (1) be satisfied, and on the contrary, assume that there exists y g II such that y <t I). Clearly, there exists h e R" such

that y = c +rh, h < I . Since y <t 1), there exists j g {1,2,3} for which

<aJ, y) > bj or <aj, c + rfh) = (aJ\c) + r <aj, h) > bj.
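As a quick numerical illustration of condition (1), the following sketch (Python/NumPy, used here purely for illustration; the paper's own experiments use Matlab and Gurobi) encodes the triangle with vertices A(0,0), B(3,4), C(8,6) from the test problems below by its rows $a^i$ and right-hand sides $b_i$, and evaluates $\langle a^j, c\rangle + r\|a^j\| \le b_j$ for a candidate circle. The sample center and radii are illustrative values only.

```python
import numpy as np

# Half-plane description of the triangle A(0,0), B(3,4), C(8,6) used in the
# test problems below: D = {x in R^2 : <a^i, x> <= b_i, i = 1, 2, 3}.
A = np.array([[-4.0, 3.0], [6.0, -8.0], [-2.0, 5.0]])   # rows a^1, a^2, a^3
b = np.array([0.0, 0.0, 14.0])

def circle_in_triangle(c, r, A, b, tol=1e-12):
    """Condition (1): <a^j, c> + r * ||a^j|| <= b_j for every side j."""
    return bool(np.all(A @ c + r * np.linalg.norm(A, axis=1) <= b + tol))

# Illustrative check: a small circle near the incenter fits, a larger one does not.
print(circle_in_triangle(np.array([3.4, 3.4]), 0.6, A, b))   # True
print(circle_in_triangle(np.array([3.4, 3.4]), 1.0, A, b))   # False
```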

2. Malfatti's problem and optimization approach

Denote by $c^1 = (c^1_1, c^1_2)$ the coordinates of the center of the first circle, by $c^2 = (c^2_1, c^2_2)$ the coordinates of the center of the second circle, and by $c^3 = (c^3_1, c^3_2)$ the coordinates of the center of the third circle; let $r_1, r_2, r_3$ be their corresponding radii, and $x = (c^1, c^2, c^3, r_1, r_2, r_3)$. Notice that the non-overlapping condition for the circles, i.e.
$$\operatorname{int} B(c^i, r_i) \cap \operatorname{int} B(c^j, r_j) = \emptyset,\quad i \ne j,\ i, j = \overline{1,3},$$
can be formulated by the following inequalities:
$$(r_i + r_j)^2 \le \|c^i - c^j\|^2,\quad i \ne j,\ i, j = \overline{1,3}. \qquad (2)$$

Then Malfatti's problem can be reformulated as the following optimization problem:
$$f(x) = \pi \sum_{j=1}^{3} r_j^2 \to \max, \qquad (3)$$
$$\langle a^i, c^j\rangle + r_j \|a^i\| \le b_i,\quad i, j = \overline{1,3}, \qquad (4)$$
$$(r_i + r_j)^2 \le \|c^i - c^j\|^2,\quad i \ne j,\ i, j = \overline{1,3}, \qquad (5)$$
$$r_i \ge 0,\quad i = \overline{1,3}. \qquad (6)$$

Conditions (4) state that the circles belong to the triangle set, conditions (5) ensure that the circles do not overlap, and inequalities (6) require the radii to be non-negative.
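To make the formulation (3)-(6) concrete, the sketch below (a hedged illustration in Python/NumPy, not the authors' code; the function names are ours) packs the variables as $x = (c^1, \dots, c^n, r_1, \dots, r_n)$, evaluates the objective and checks the three groups of constraints.

```python
import numpy as np

def unpack(x, n):
    """Split x = (c^1, ..., c^n, r_1, ..., r_n) into centers (n x 2) and radii (n)."""
    x = np.asarray(x, dtype=float)
    return x[:2 * n].reshape(n, 2), x[2 * n:]

def total_area(x, n):
    """Objective (3)/(7): pi * sum_j r_j^2 (to be maximized)."""
    _, r = unpack(x, n)
    return np.pi * np.sum(r ** 2)

def is_feasible(x, n, A, b, tol=1e-9):
    """Check containment (4)/(8), non-overlap (5)/(9) and non-negativity (6)/(10)."""
    c, r = unpack(x, n)
    norms = np.linalg.norm(A, axis=1)
    for j in range(n):
        if np.any(A @ c[j] + r[j] * norms > b + tol):                  # (4)
            return False
    for i in range(n):
        for j in range(i + 1, n):
            if (r[i] + r[j]) ** 2 > np.sum((c[i] - c[j]) ** 2) + tol:  # (5)
                return False
    return bool(np.all(r >= -tol))                                     # (6)
```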

3. Malfatti's n-circle problem

Let $c^i = (c^i_1, c^i_2)$, $i = \overline{1,n}$, denote the coordinates of the centers of the $n$ circles, $r_i$, $i = \overline{1,n}$, their corresponding radii, and $x = (c^1, \dots, c^n, r_1, \dots, r_n) \in R^{3n}$. Then the $n$-circle problem is
$$Q(x) = \pi \sum_{j=1}^{n} r_j^2 \to \max, \qquad (7)$$
$$\langle a^i, c^j\rangle + r_j \|a^i\| \le b_i,\quad i = \overline{1,3},\ j = \overline{1,n}, \qquad (8)$$
$$(r_i + r_j)^2 \le \|c^i - c^j\|^2,\quad i \ne j,\ i, j = \overline{1,n}, \qquad (9)$$
$$r_j \ge 0,\quad j = \overline{1,n}. \qquad (10)$$

Now, denote by $f_0(x)$ the following function:
$$f_0(x) = -Q(x) = -\pi \sum_{j=1}^{n} r_j^2,$$
and by $f_i(x)$ the nonconvex constraints
$$f_i(x) = (r_i + r_j)^2 - \|c^i - c^j\|^2 \le 0,\quad i \ne j,\ i, j = \overline{1,n}.$$
Further, let us put the convex constraints (8) and (10) into the set $S$:
$$S = \{x \in R^{3n} \mid \langle a^i, c^j\rangle + r_j \|a^i\| \le b_i,\ r_j \ge 0,\ i = \overline{1,3},\ j = \overline{1,n}\}.$$
Then Malfatti's problem (7)-(10) has the following formulation:
$$f_0(x) = -\pi \sum_{j=1}^{n} r_j^2 \to \min,\quad x \in S,\qquad f_i(x) = (r_i + r_j)^2 - \|c^i - c^j\|^2 \le 0,\quad i \ne j,\ i, j = \overline{1,n}. \qquad (11)$$

It is a general D.C. minimization problem with inequality constraints:
$$f_0(x) = g_0(x) - h_0(x) \to \min,\quad x \in S,\qquad f_i(x) = g_i(x) - h_i(x) \le 0,\quad i \in I = \{1, \dots, n\},$$
where
$$g_0(x) = 0,\quad h_0(x) = \pi \sum_{j} r_j^2,\quad g_i(x) = (r_i + r_j)^2,\quad h_i(x) = \|c^i - c^j\|^2,\quad i \ne j,\ i \in I,$$
and the convex set
$$S = \{x \in R^{3n} \mid \langle a^i, c^j\rangle + r_j \|a^i\| \le b_i,\ r_j \ge 0,\ i = \overline{1,3},\ j = \overline{1,n}\}.$$
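The D.C. data of this decomposition are simple enough to write down explicitly. The sketch below (illustrative Python with our own naming; all functions involved are smooth here, so the subgradients reduce to ordinary gradients) evaluates $h_0$, the pairwise functions $g_i$, $h_i$ and the gradients of the concave parts, which is exactly what the linearized problems of the next section require.

```python
import numpy as np

def h0(x, n):
    """Concave part of the objective: h_0(x) = pi * sum_j r_j^2."""
    return np.pi * np.sum(np.asarray(x[2 * n:]) ** 2)

def grad_h0(x, n):
    """h_0 is smooth, so its subdifferential is the single gradient."""
    g = np.zeros(3 * n)
    g[2 * n:] = 2.0 * np.pi * np.asarray(x[2 * n:])
    return g

def g_pair(x, n, i, j):
    """Convex part of a non-overlap constraint: (r_i + r_j)^2."""
    r = np.asarray(x[2 * n:])
    return (r[i] + r[j]) ** 2

def h_pair(x, n, i, j):
    """Concave part: ||c^i - c^j||^2."""
    c = np.asarray(x[:2 * n]).reshape(n, 2)
    return np.sum((c[i] - c[j]) ** 2)

def grad_h_pair(x, n, i, j):
    """Gradient of h_pair with respect to the full variable x."""
    c = np.asarray(x[:2 * n]).reshape(n, 2)
    g = np.zeros(3 * n)
    g[2 * i:2 * i + 2] = 2.0 * (c[i] - c[j])
    g[2 * j:2 * j + 2] = -2.0 * (c[i] - c[j])
    return g
```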

For this type of problem there is a special local search method provided by A.S. Strekalovsky [24].

4. A special local search method for the general D.C. minimization problem

Consider the following problems:
$$(\mathcal{P}1)\qquad f(x) = g(x) - h(x) \to \min,\quad x \in D, \qquad (12)$$
or
$$(\mathcal{P}2)\qquad \begin{cases} f_0(x) \to \min,\quad x \in D,\\[2pt] F(x) = g(x) - h(x) \le 0, \end{cases} \qquad (13)$$
where the set $D \subset R^n$ and the functions $g, h : R^n \to R \cup \{+\infty\}$ are convex.

Local search algorithms for these D.C. problems were studied and constructed in [24]. Now, on the basis of these results, we consider the general D.C. optimization problem of the following type:
$$(\mathcal{P})\qquad \begin{cases} f_0(x) = g_0(x) - h_0(x) \to \min,\quad x \in S,\\[2pt] f_i(x) = g_i(x) - h_i(x) \le 0,\quad i \in I = \{1, 2, 3, \dots, n\}, \end{cases} \qquad (14)$$
where the functions $g_i$ and $h_i$, $i \in I \cup \{0\}$, are convex, as well as the set $S \subset R^n$. Further, let us suppose that the feasible set $D$ of Problem $(\mathcal{P})$ is nonempty:
$$D = \{x \in S \mid f_i(x) \le 0,\ i \in I\} \ne \emptyset, \qquad (15)$$
and that the optimal value of Problem $(\mathcal{P})$ is finite:
$$\mathcal{V}(\mathcal{P}) = \inf\{f_0(x) \mid x \in D\} > -\infty. \qquad (16)$$

Furthermore, assume that a feasible starting point $x^0 \in D$ is given and, in addition, after several iterations a current iterate $x^k \in D$, $k \in \mathbb{Z}_+ = \{0, 1, 2, \dots\}$, has been produced. Then consider the linearized problem as follows:
$$(\mathcal{PL}_k)\qquad \begin{cases} \Phi_{0k}(x) = g_0(x) - \langle h_0'(x^k), x\rangle \to \min,\quad x \in S,\\[2pt] \Phi_{ik}(x) = g_i(x) - \langle h_i'(x^k), x - x^k\rangle - h_i(x^k) \le 0,\quad i \in I, \end{cases} \qquad (17)$$
where $h_i'(x^k)$ is a subgradient of the function $h_i(\cdot)$ at the point $x^k$, $h_i'(x^k) \in \partial h_i(x^k)$, $i \in I \cup \{0\}$.

It can be readily seen that the linearized problem $(\mathcal{PL}_k)$ is convex, since its goal function is convex as well as its feasible set
$$D_k = \{x \in S \mid g_i(x) - \langle h_i'(x^k), x - x^k\rangle - h_i(x^k) \le 0,\ i \in I\}. \qquad (18)$$
Hence, Problem $(\mathcal{PL}_k)$ can be solved by suitable convex optimization methods [5, 6] to any given accuracy.
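For Malfatti's problem, $(\mathcal{PL}_k)$ has a linear objective (since $g_0 \equiv 0$) and convex quadratic constraints, so any convex solver applies. The sketch below is one possible implementation under stated assumptions: it reuses the helper functions from the sketch in Section 3, and it uses SciPy's SLSQP routine as the convex solver instead of the Matlab/Gurobi setup reported by the authors.

```python
import numpy as np
from scipy.optimize import minimize

def solve_linearized(xk, n, A, b, pairs, tol=1e-8):
    """Approximately solve one linearized problem (PL_k) around the iterate xk."""
    xk = np.asarray(xk, dtype=float)
    norms = np.linalg.norm(A, axis=1)
    c0 = -grad_h0(xk, n)                      # Phi_0k(x) = <-h_0'(x^k), x>, since g_0 = 0
    cons = []
    for j in range(n):                        # the set S: containment and r_j >= 0
        for i in range(A.shape[0]):
            cons.append({'type': 'ineq',
                         'fun': lambda x, i=i, j=j:
                             b[i] - A[i] @ x[2 * j:2 * j + 2] - x[2 * n + j] * norms[i]})
        cons.append({'type': 'ineq', 'fun': lambda x, j=j: x[2 * n + j]})
    for (i, j) in pairs:                      # convexified non-overlap constraints Phi_ik <= 0
        gh, hk = grad_h_pair(xk, n, i, j), h_pair(xk, n, i, j)
        cons.append({'type': 'ineq',
                     'fun': lambda x, i=i, j=j, gh=gh, hk=hk:
                         hk + gh @ (x - xk) - g_pair(x, n, i, j)})
    res = minimize(lambda x: c0 @ x, xk, method='SLSQP',
                   constraints=cons, options={'ftol': tol, 'maxiter': 500})
    return res.x
```

Note that $x^k$ itself is feasible for $(\mathcal{PL}_k)$, so it is a natural starting point for the convex solver.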

Therefore, let us compute a new iterate $x^{k+1}$ as an approximate solution to the linearized problem $(\mathcal{PL}_k)$, so that $x^{k+1} \in D_k$ and the following inequality holds:
$$\Phi_{0k}(x^{k+1}) = g_0(x^{k+1}) - \langle h_0'(x^k), x^{k+1}\rangle \le \mathcal{V}(\mathcal{PL}_k) + \delta_k, \qquad (19)$$
where $\mathcal{V}(\mathcal{PL}_k)$ is the optimal value of Problem $(\mathcal{PL}_k)$:
$$\mathcal{V}_k := \mathcal{V}(\mathcal{PL}_k) = \inf_x\{\Phi_{0k}(x) \mid x \in S,\ \Phi_{ik}(x) \le 0,\ i \in I\}, \qquad (20)$$
and the given sequence $\{\delta_k\}$ satisfies the condition
$$\sum_{k=0}^{\infty} \delta_k < +\infty. \qquad (21)$$


It is easy to see that $D_k \subset D$, and therefore $x^{k+1}$ is feasible not only in the linearized problem $(\mathcal{PL}_k)$ but also in the original problem $(\mathcal{P})$, because due to the convexity of $h_i(\cdot)$ one has
$$0 \ge g_i(x^{k+1}) - \langle h_i'(x^k), x^{k+1} - x^k\rangle - h_i(x^k) = \Phi_{ik}(x^{k+1}) \ge g_i(x^{k+1}) - h_i(x^{k+1}) = f_i(x^{k+1}).$$
Hence, the natural idea arises to construct a sequence $\{x^k\}$, $x^k \in D$, $k = 0, 1, 2, \dots$, starting at the point $x^0$, by consecutively solving the linearized problems $(\mathcal{PL}_k)$. The first properties of such a sequence are similar to those established in [24].

Theorem 2. [24] The sequence $\{x^k\}$ produced by the rule (19) fulfils the following conditions:

(i) $\{x^k\} \subset D$, $x^k \in D_k$, $k = 0, 1, 2, \dots$;

(ii) the number sequences $f_{0k} = f_0(x^k)$ and $\Delta\Phi_{0k}$, where $\Delta\Phi_{0k} = \Phi_{0k}(x^k) - \Phi_{0k}(x^{k+1})$, converge, so that

a) $\lim\limits_{k \to \infty} f_{0k} = f_{0*} \ge \mathcal{V}(\mathcal{P})$;

b) $\lim\limits_{k \to \infty} \Delta\Phi_{0k} = 0$; $\qquad (23)$

c) $\lim\limits_{k \to \infty} \left[\mathcal{V}(\mathcal{PL}_k) - \Phi_{0k}(x^{k+1})\right] = 0$.

Proof. The proof can be performed in the same way as that of Theorem 1 in [24]. It suffices to replace in the inequalities (13)-(15) of [24] the functions $f$, $g$, $h$ and $\Phi_k$ by $f_0$, $g_0$, $h_0$ and $\Phi_{0k}$, respectively. $\square$

In the same manner as Lemma 1 in [24], it is easy to show that the following result takes place.

Lemma 1. [24] Suppose that the sequences $\{x^k\}$ and $\{y_0^k\}$, where $y_0^k = h_0'(x^k) \in \partial h_0(x^k)$, $k = 0, 1, 2, \dots$, produced by the method (19) converge in the following sense:
$$(\mathcal{H})\qquad \lim_{k \to \infty} x^k = x_*,\qquad \lim_{k \to \infty} y_0^k = y_0 \in \partial h_0(x_*). \qquad (24)$$
Then the number sequence $\{\mathcal{V}_k = \mathcal{V}(\mathcal{PL}_k)\}$ converges, so that
$$\lim_{k \to \infty} \mathcal{V}_k = \mathcal{V}_*, \qquad (25)$$
as well as the sequence $\{\Phi_{0k}(x^{k+1})\}$:
$$\lim_{k \to \infty} \Phi_{0k}(x^{k+1}) = \Phi_{0*}. \qquad (26)$$

From (23)(c) it follows that $\mathcal{V}_* = \Phi_{0*}$, which can be expressed otherwise and more precisely by means of an analogue of Theorem 2 of [24] for Problem $(\mathcal{P})$.

Theorem 3. [24] Assume that, in addition to $(\mathcal{H})$, the supplementary assumption holds:
$$(\mathcal{H}1)\qquad \lim_{k \to \infty} y_i^k = y_i \in \partial h_i(x_*),\quad i \in I, \qquad (27)$$
where $y_i^k = h_i'(x^k) \in \partial h_i(x^k)$, $k = 0, 1, 2, \dots$, $i \in I$. Then the cluster point $x_*$ of the sequence $\{x^k\}$ turns out to be a solution to the following linearized problem:
$$(\mathcal{PL}_*)\qquad \begin{cases} \Phi_{0*}(x) = g_0(x) - \langle y_0, x\rangle \to \min,\quad x \in S,\\[2pt] \Phi_{i*}(x) = g_i(x) - \langle y_i, x - x_*\rangle - h_i(x_*) \le 0,\quad i \in I, \end{cases} \qquad (28)$$
where $y_i = h_i'(x_*) \in \partial h_i(x_*)$, $i \in I \cup \{0\}$.

Proof. From (19), (25) and (26) it follows that
$$\mathcal{V}_* = \Phi_{0*} = \lim_{k \to \infty} \Phi_{0k}(x^{k+1}) = g_0(x_*) - \langle y_0, x_*\rangle = \Phi_{0*}(x_*), \qquad (29)$$
due to the continuity of $g_0(\cdot)$ and of the inner product. On the other hand, on account of the inequalities
$$\mathcal{V}_k \le \Phi_{0k}(x) = g_0(x) - \langle y_0^k, x\rangle \quad \forall x \in S:\ \Phi_{ik}(x) = g_i(x) - \langle y_i^k, x - x^k\rangle - h_i(x^k) \le 0,\ i \in I$$
(i.e., $\forall x \in D_k$), $k = 0, 1, 2, \dots$, letting $k \to \infty$ we obtain the following relations:
$$\mathcal{V}_* \le \Phi_{0*}(x) = g_0(x) - \langle y_0, x\rangle \quad \forall x \in D_* = \{x \in S \mid \Phi_{i*}(x) \le 0,\ i \in I\}, \qquad (30)$$
due to the continuity of the inner product and of the functions $h_i(\cdot)$, $i \in I$.

The latter system of inequalities (30), along with (29), proves the theorem. $\square$

Corollary 1. The cluster point $x_* \in D$ of the sequence $\{x^k\}$ is a stationary (critical) point of Problem $(\mathcal{P})$, so that it fulfils the necessary optimality conditions
$$\text{(a)}\quad \sum_{i=0}^{n} \lambda_i\,[g_i'(x_*) - y_i] \in -N(x_* \mid S),\qquad \text{(b)}\quad \sum_{i=0}^{n} \lambda_i\,\Phi_{i*}(x_*) = 0, \qquad (31)$$
with some Lagrange multipliers
$$\lambda_i \ge 0,\quad i = 0, 1, 2, \dots, n,\qquad \lambda = (\lambda_0, \lambda_1, \lambda_2, \dots, \lambda_n) \ne 0_{n+1}.$$

Remark (stopping criterion). As has been shown in [24], it can be readily seen that the inequality
$$f_0(x^k) - f_0(x^{k+1}) \le \varepsilon \qquad (32)$$
or
$$\Phi_{0k}(x^k) - \Phi_{0k}(x^{k+1}) = g_0(x^k) - g_0(x^{k+1}) - \langle h_0'(x^k), x^k - x^{k+1}\rangle \le \varepsilon \qquad (33)$$
can be taken as a stopping criterion for the method (19).

On the other hand, it is easy to show a result similar to Proposition 1 in [24], or, what is the same, convergence with respect to the variable $x$:
$$\lim_{k \to \infty} \|x^k - x^{k+1}\| = 0 \qquad (34)$$
under the assumption that the function $h_0(\cdot)$ is strongly convex, i.e.
$$h_0(x) \ge h_0(y) + \langle h_0'(y), x - y\rangle + \frac{\rho}{2}\,\|x - y\|^2\quad \forall x, y \in R^n,$$
as was done in the proof of Proposition 1 in [24].
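Putting the pieces together, the rule (19) with a stopping test in the spirit of (32)-(34) gives the following local search loop. This is a simplified sketch built on the helper functions from the previous sections (our own illustration, not the authors' implementation); it stops when consecutive iterates essentially coincide.

```python
import numpy as np

def dc_local_search(x0, n, A, b, eps=1e-6, max_iter=50):
    """Successive solution of linearized problems (PL_k), cf. rule (19)."""
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    xk = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        xk1 = solve_linearized(xk, n, A, b, pairs)
        done = np.linalg.norm(xk1 - xk) <= eps   # stopping test, cf. (34)
        xk = xk1
        if done:
            break
    return xk, total_area(xk, n)                 # critical point and pi * sum_j r_j^2
```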

5. Test problems

In order to obtain a numerical solution of Malfatti's problem, we use the special local search method proposed by A. S. Strekalovsky [24], whose main idea consists in the successive solution of partially linearized problems.

Let us introduce test problem 1. This is the original Malfatti problem with three circles, which should be placed in a triangle with vertices A(0,0), B(3,4), C(8,6). Its global optimization formulation is [9]:

$$\max f = \pi(x_3^2 + x_6^2 + x_9^2),$$
$$-4x_1 + 3x_2 + 5x_3 \le 0,\quad 6x_1 - 8x_2 + 10x_3 \le 0,\quad -2x_1 + 5x_2 + \sqrt{29}\,x_3 \le 14,$$
$$-4x_4 + 3x_5 + 5x_6 \le 0,\quad 6x_4 - 8x_5 + 10x_6 \le 0,\quad -2x_4 + 5x_5 + \sqrt{29}\,x_6 \le 14,$$
$$-4x_7 + 3x_8 + 5x_9 \le 0,\quad 6x_7 - 8x_8 + 10x_9 \le 0,\quad -2x_7 + 5x_8 + \sqrt{29}\,x_9 \le 14,$$
$$(x_4 - x_1)^2 + (x_5 - x_2)^2 - (x_3 + x_6)^2 \ge 0,$$
$$(x_7 - x_1)^2 + (x_8 - x_2)^2 - (x_3 + x_9)^2 \ge 0,$$
$$(x_7 - x_4)^2 + (x_8 - x_5)^2 - (x_6 + x_9)^2 \ge 0,$$
$$x_3 \ge 0,\quad x_6 \ge 0,\quad x_9 \ge 0,$$
where $x_3 = r_1$, $x_6 = r_2$, $x_9 = r_3$. Then the D.C. formulation is
$$f_0(x) = -\pi(x_3^2 + x_6^2 + x_9^2) \to \min,\quad x \in S, \qquad (35)$$
$$f_1(x) = (x_3 + x_6)^2 - (x_4 - x_1)^2 - (x_5 - x_2)^2 \le 0, \qquad (36)$$
$$f_2(x) = (x_3 + x_9)^2 - (x_7 - x_1)^2 - (x_8 - x_2)^2 \le 0, \qquad (37)$$
$$f_3(x) = (x_6 + x_9)^2 - (x_7 - x_4)^2 - (x_8 - x_5)^2 \le 0, \qquad (38)$$
where
$$S = \left\{x \in R^9 \;\middle|\; \begin{array}{l} -4x_1 + 3x_2 + 5x_3 \le 0,\ \ 6x_1 - 8x_2 + 10x_3 \le 0,\ \ -2x_1 + 5x_2 + \sqrt{29}\,x_3 \le 14,\\ -4x_4 + 3x_5 + 5x_6 \le 0,\ \ 6x_4 - 8x_5 + 10x_6 \le 0,\ \ -2x_4 + 5x_5 + \sqrt{29}\,x_6 \le 14,\\ -4x_7 + 3x_8 + 5x_9 \le 0,\ \ 6x_7 - 8x_8 + 10x_9 \le 0,\ \ -2x_7 + 5x_8 + \sqrt{29}\,x_9 \le 14,\\ x_3 \ge 0,\ \ x_6 \ge 0,\ \ x_9 \ge 0 \end{array}\right\}.$$
The local search for this problem was run from the following 8 starting points:

$x_0^1 = (2.5, 2.5, 3.5, 3.5, 4.7, 4.1, 0.5, 0.6, 0.3)$;
$x_0^2 = (1.0, 1.0, 2.0, 2.0, 7.4, 5.4, 0.2, 0.2, 0.2)$;
$x_0^3 = (3.0, 3.0, 4.0, 4.0, 2.0, 2.0, 0.1, 0.1, 0.1)$;
$x_0^4 = (2.0, 2.0, 3.5, 3.5, 4.5, 4.0, 0.2, 0.4, 0.1)$;
$x_0^5 = (3.5, 3.5, 2.3, 2.3, 6.0, 4.8, 0.6, 0.4, 0.08)$;
$x_0^6 = (1.5, 1.5, 3.5, 3.5, 5.5, 4.5, 0.2, 0.4, 0.2)$;
$x_0^7 = (3.5, 3.0, 4.5, 4.0, 6.0, 5.0, 0.1, 0.3, 0.1)$;
$x_0^8 = (2.4, 2.4, 3.5, 3.5, 4.7, 4.1, 0.4, 0.6, 0.4)$.

The following table reports the results of the numerical experiment, where $x_0$ is the number of the starting point, $f_0(x_0)$ is the value of the cost function at the starting point, $f_0(z)$ is the value of the cost function at the critical point $z$, and PL is the number of solved linearized problems.

Note that the solution of test problem 1 obtained by the greedy algorithm equals 3.194.

x_0   f_0(x_0)   f_0(z)   PL   Time (sec.)
1     2.1991     3.1944   4    0.67
2     0.3770     3.1944   6    0.94
3     0.0942     3.1944   4    0.84
4     0.6597     3.1944   5    0.88
5     1.6067     3.1944   5    1.05
6     0.7540     3.1944   5    0.78
7     0.3456     3.1944   4    0.64
8     2.2101     3.1944   4    0.60

Software: Matlab R2011b, Gurobi Optimizer 7.5.1. Computer: Intel Core i5-5200U CPU, 2.20 GHz, 6 GB RAM.
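For completeness, this is how the sketch from the previous sections could be applied to test problem 1 (again an illustration only: the table above was produced with the authors' Matlab/Gurobi implementation, and this simplified SciPy-based sketch is not guaranteed to reproduce its timings or iteration counts).

```python
import numpy as np

# Triangle A(0,0), B(3,4), C(8,6) in half-plane form, as in condition (4).
A = np.array([[-4.0, 3.0], [6.0, -8.0], [-2.0, 5.0]])
b = np.array([0.0, 0.0, 14.0])

# Starting point no. 1 above, in the layout (c^1, c^2, c^3, r_1, r_2, r_3).
x0 = np.array([2.5, 2.5, 3.5, 3.5, 4.7, 4.1, 0.5, 0.6, 0.3])

z, area = dc_local_search(x0, 3, A, b)
print(area)   # the paper reports 3.1944 at the critical point for every starting point
```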

Test problem 2 is Malfatti's problem with 4 circles [9], which should be placed in the same triangle with vertices A(0,0), B(3,4), C(8,6). For this problem, the local search was run from the following 10 starting points:

$x_0^1 = (2.5, 2.5, 3.5, 3.5, 4.7, 4.1, 0.7, 0.7, 0.5, 0.6, 0.3, 0.04)$;
$x_0^2 = (2.0, 3.0, 3.0, 3.0, 4.0, 4.0, 5.5, 4.5, 0.1, 0.4, 0.2, 0.1)$;
$x_0^3 = (1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 7.4, 5.4, 0.2, 0.2, 0.3, 0.2)$;
$x_0^4 = (3.5, 3.5, 2.3, 2.3, 6.0, 4.8, 0.7, 0.7, 0.6, 0.3, 0.08, 0.04)$;
$x_0^5 = (4.5, 4.0, 1.9, 1.9, 2.5, 2.5, 3.4, 3.4, 0.5, 0.3, 0.5, 0.6)$;
$x_0^6 = (1.5, 1.5, 2.0, 2.0, 3.5, 4.0, 5.5, 7.0, 0.1, 0.1, 0.1, 0.1)$;
$x_0^7 = (2.0, 3.0, 3.0, 3.0, 4.0, 4.0, 5.5, 4.5, 0.1, 0.4, 0.2, 0.1)$;
$x_0^8 = (0.4, 0.4, 1.0, 1.0, 2.1, 2.1, 3.8, 3.6, 0.02, 0.05, 0.2, 0.5)$;
$x_0^9 = (1.3, 1.2, 3.5, 3.4, 1.7, 1.7, 0.3, 0.3, 0.1, 0.6, 0.2, 0.02)$;
$x_0^{10} = (1.0, 1.0, 3.5, 3.5, 5.0, 6.1, 5.5, 7.0, 0.1, 0.3, 0.1, 0.1)$.

Let us consider the computational results of the local search for test problem 2. The solution obtained by the greedy algorithm is 3.7103.


x_0   f_0(x_0)   f_0(z)   PL   Time (sec.)
1     2.2041     3.6687   5    0.76
2     0.6911     3.7103   5    0.75
3     0.6597     3.6687   6    1.10
4     1.4388     3.6687   5    0.96
5     2.9844     3.6687   3    0.53
6     0.1257     3.6687   5    1.19
7     1.6022     3.6687   4    0.92
8     0.9161     3.7103   5    0.8
9     0.9201     3.6687   5    0.9
10    0.3770     3.7103   6    1.11

Conclusion

In this paper Malfatti's problem, which is a nonconvex optimization problem, has been considered. We reformulated this problem as a D.C. programming problem with D.C. constraints. Based on a local search method, an attempt has been made to find global solutions of this problem. In the proposed algorithm, initial starting points are chosen arbitrarily.

For comparison purposes, we have considered some test examples given in [8]. The numerical results are provided, and in all cases global solutions of these problems have been found.

References

1. Andreatta M., Bezdek A. and Boronski Jan P. The Problem of Malfatti: Two Centuries of Debate. The Mathematical Intelligencer. 2011. No. 33. Pp. 72-76.

2. Andrei N. Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization. JOTA. 2009. No. 141. Pp. 249-264.

3. Anikin A., Gornov A., and Andrianov A. Computational Technologies for Morse Potential Optimization. Optimization and Applications (OPTIMA-2013): Abstracts of IV International Conference. 2013. Pp. 22-23.

4. Wales D. J. and Doye J. P. K. Global Optimization by Basin-Hopping and the Lowest Energy Structures of Lennard-Jones Clusters Containing up to 110 Atoms. The Journal of Physical Chemistry A. 1997. V. 28. No. 101. Pp. 5111-5116.

5. Nocedal J., Wright St. J. Numerical Optimization. New York: Springer, 2006. 685 p.

6. Vasiliev F. P. Metody optimizatsii [Optimization Methods]. Moscow: Factorial Press, 2002. 824 p.

7. Enkhbat R. An Algorithm for Maximizing a Convex Function over a Simple Set. Journal of Global Optimization. 1996. No. 8. Pp. 379-391.

8. Enkhbat R. Global Optimization Approach to Malfatti's Problem. Journal of Global Optimization. 2016. No. 65. Pp. 33-39.

9. Enkhbat R., Barkova M. V. and Strekalovsky M. V. Solving Malfatti's High Dimensional Problem by Global Optimization. Numerical Algebra, Control and Optimization. 2016. V. 6. No. 2. Pp. 153-160.

10. Fedorenko R. P. Priblizhennoye resheniye nekotorykh zadach optimal'nogo upravleniya [Approximate Solution of Some Optimal Control Problems]. Zhurnal vychislitel'noi matematiki i matematicheskoi fiziki — USSR Computational Mathematics and Mathematical Physics. 1964. V. 4. No. 6. Pp. 153-160.

11. Gabai H. and Liban E. On Goldberg's Inequality Associated with the Malfatti Problem. 1967. V. 41. No. 5. Pp. 251-252.

12. Gernet N. Ob osnovnoi prosteishei zadache variatsionnogo ischisleniya [The Fundamental Problem of the Calculus of Variations]. St. Petersburg: Erlich, 1913.

13. Goldberg M. On the Original Malfatti Problem. Math. Mag. 1967. V. 40. No. 5. Pp. 241-247.

14. Lob H. and Richmond H. W. On the Solutions of the Malfatti Problem for a Triangle. London Math. Soc. 1930. V. 2. No. 30. Pp. 287-301.

15. Los G. A. Malfatti's Optimization Problem. Dep. Ukr. NIINTI. 1998. [in Rus.]

16. Malfatti G. Memoria sopra un problema stereotomico. Memoria di Matematica e di Fisica della Societa italiana della Scienze. 1803. V. 10. No. 1. Pp. 235-244.

17. Melissen H. Packing and Covering with Circles. Ph. D. Thesis. Univ. of Utrecht, 1997.

18. Polyak B. T. Metod sopryazhennykh gradiyentov v ekstremalnykh zadachakh [The Conjugate Gradient Method in Extremal Problems]. Zhurnal vychislitel'noi matematiki i matematicheskoi fiziki — USSR Computational Mathematics and Mathematical Physics. 1969. V. 9. No. 4. Pp. 94-112.

19. Pervin M., Roy S. K. and Weber G. W. A Two-Echelon Inventory Model with Stock-Dependent Demand and Variable Holding Cost for Deteriorating Items. Numerical Algebra, Control and Optimization. 2016. V. 7. No. 1. Pp. 21-50.

20. Pervin M., Roy S. K., and Weber G. W. Analysis of Inventory Control Model with Shortage under Time-Dependent Demand and Time-Varying Holding Cost Including Stochastic Deterioration. Annals of Operations Research. 2016. DOI: 10.1007/s10479-016-2355-5.

21. Roy S. K., Maity G., Weber G. W., and Alparslan Gok S. Z. Conic Scalarization Approach to Solve Multi-Choice Multi-Objective Transportation Problem with Interval Goal. Annals of Operations Research. 2016. DOI: 10.1007/s10479-016-2283-4.

22. Roy S. K., Maity G., and Weber G. W. Multi-Objective Two-Stage Grey Transportation Problem Using Utility Function with Goals. Central European Journal of Operations Research. 2016. V. 7. No. 1. Pp. 21-50.

23. Strekalovsky A. S. K probleme globalnogo ekstremuma [On the Global Extrema Problem]. Doklady Akademii nauk SSSR — Proc. of the USSR Academy of Sciences. 1987. V. 295. No. 5. Pp. 1062-1066.

24. Strekalovsky A. S. On Local Search in D.C. Optimization Problem. Applied Mathematics and Computation. 2015. V. 255. P. 73-83.

25. Teo K. L., Goh C. J., and Wong K. H. A Unified Computational Approach to Optimal Control Problems. Pitman Monographs and Surveys in Pure and Applied Mathematics. New York, Longman Scientific & Technical, 1991.

26. Valentine F. A. The Problem of Lagrange with Differential Inequalities as Added Side Conditions. Dissertation Univ. of Chicago, 1937.

27. Zalgaller V. A. and Los G. A. The Solution of Malfatti's Problem. Journal of Mathematical Sciences. 1994. V. 72. No. 4. Pp. 3163-3177.
