
Optimal adjusting of simulated annealing parameters

Allaoua Hemmak

Mohamed Boudiaf University of M'sila, Department of Computer Science, Laboratory of Informatics and its Applications (LIAM), M'Sila, People's Democratic Republic of Algeria, e-mail: allaoua.hemmak@univ-msila.dz, ORCID iD: https://orcid.org/0000-0002-5799-8882

DOI: https://doi.org/10.5937/vojtehg72-47242

FIELD: mathematics, computer sciences
ARTICLE TYPE: original scientific paper

Abstract:

Introduction/purpose: Simulated annealing is a powerful technique widely used in optimization problems. One critical aspect of using simulated annealing effectively is a proper and optimal adjustment of its parameters. This paper presents a novel approach to efficiently adjust the parameters of simulated annealing to enhance its performance and convergence speed.

Methods: Since the simulated annealing algorithm is inspired by the Metropolis cooling process, the basic idea is to simulate and analyze this process using a mathematical model. The proposed work aims to properly imitate the Metropolis cooling process in the algorithmic field. By intelligently adjusting the temperature schedule, temperature reduction and cooling rate, the algorithm optimizes the balance between exploration and exploitation, leading to improved convergence and higher-quality solutions.

Results: To evaluate the effectiveness of this approach, it was applied first to a chosen sample function to be minimized, and then to some well-known optimization functions. The results demonstrate that our approach, called Optimal Adjusting of Simulated Annealing parameters (OASA), achieves superior performance compared to traditional static parameter settings and other existing approaches, showing how to properly adjust the parameters of the simulated annealing algorithm to improve its efficiency in terms of solution quality and processing time.

Conclusion: Adjusting the algorithm parameters could make a significant contribution to the optimization field, even for other metaheuristics.

Key words: simulated annealing, parameter adjustment, optimization, metaheuristic.

Introduction

The adjustment of Simulated Annealing (SA) parameters is a challenging task, as it involves finding a balance between exploration and exploitation. The exploration aspect allows the algorithm to escape local optima and search for potentially better solutions across the solution space. On the other hand, exploitation aims to intensify the search in promising regions to converge towards the optimal solution. Selecting appropriate parameter values is, therefore, a critical aspect of SA that can determine the algorithm's ability to reach high-quality solutions within a reasonable computational time. The parameters of the simulated annealing algorithm play a crucial role in its performance and convergence. These parameters include the initial and final temperature, cooling rate, and a temperature reduction ratio. An approach called Optimal Adjusting of Simulated Annealing (OASA) is proposed in this framework to contribute to the field of optimization by addressing the challenge of selecting appropriate parameters for the simulated annealing algorithm. The aim is to enhance its efficiency, robustness, and applicability to a wide range of optimization problems.

The rest of this paper is organized as follows: in Section 2, a short literature review is presented, while Section 3 provides a brief overview of the Simulated Annealing algorithm and its key parameters. In Section 4, the proposed approach for efficiently adjusting SA parameters is described. Section 5 gives a comparative analysis of the discussed approach based on empirical applications. Finally, Section 6 summarizes the findings and discusses future research directions.

Related work

Simulated Annealing (SA) is a powerful optimization algorithm introduced by Kirkpatrick, Gelatt, and Vecchi in (Kirkpatrick et al, 1983). Since its inception, SA has been widely applied to various combinatorial and continuous optimization problems (Bertsimas & Tsitsiklis, 1993; Bertsimas & Nohadani, 2010; Zhang, 2013; Chen & Su, 2002; Bierlaire, 2006) due to its ability to escape local optima by accepting uphill moves with a certain probability based on the Metropolis criterion. Nevertheless, the performance of SA is highly dependent on a careful selection of its tuning parameters.

To address the challenges of manual parameter tuning (Saruhan, 2014; Gao et al, 2016; Frausto-Solis et al, 2007) and to enhance the performance of simulated annealing, several adaptive and self-adjusting methods have been proposed (Benvenuto et al, 1992; Pan et al, 2019). In the work by Ingber (Ingber, 2000), an Adaptive Simulated Annealing (ASA) approach was introduced, where the parameters are automatically adjusted during the optimization process based on the statistical analysis of the search space. ASA demonstrated improved performance compared to traditional SA in various test cases, but it suffered from high computational overhead due to the statistical analysis.

Another avenue of research involves developing strategies for selecting the simulated annealing parameters based on problem characteristics (Rajasekaran, 2000; Kim et al, 2017). Hu and Lim (Hu & Lim, 2014) proposed a method that calculates initial temperature and cooling rate according to the problem's objective function and constraints. Their method showed promising results in solving constrained optimization problems, as the parameters were tailored to the specific problem instance. Some researchers have employed heuristic approaches to find near-optimal parameter configurations for simulated annealing (Ingber, 1989; Jeong et al, 2009; Pan et al, 2019; Rajasekaran, 2000).

Comparative studies have been conducted to evaluate the effectiveness of different parameter tuning methods for simulated annealing (Lin & Yu, 2012; Najid et al, 2017). Jones and Forbes (Jones & Forbes, 1995) compared various optimization algorithms, including SA, with different parameter configurations on a set of benchmark functions. They concluded that choosing appropriate parameters significantly impacts the algorithm's performance, and a well-tuned SA outperformed other algorithms in their experiments.

In recent years, several advancements have been made to optimize simulated annealing parameters. The authors of (Zhang, 2013; Gao et al, 2016; Pan et al, 2019) introduced a novel adaptive simulated annealing algorithm based on machine learning. Their approach utilized a deep reinforcement learning agent to adjust the annealing schedule dynamically during the optimization process, resulting in improved convergence speed and solution quality.

Simulated Annealing Algorithm overview

The SA algorithm imitates the cooling process of metal that yields strong products, such as automobile, boat, and aircraft parts. The original process consists of heating a metal until it melts, moulding it in an appropriate mold, and then air-cooling it. This classical approach produced weak products because the accelerated cooling leaves the metal electrons disordered, constituting the amorphous state of the system. SA aims to slow down the cooling process, which allows the metal electrons to become ordered, constituting the crystalline state of the system. The inspiration drawn from this process is represented by the basic Simulated Annealing algorithm, summarized as follows:

1. SA_ALGO(search space S)
2. Read problem input data
3. Choose SA parameters:
4.   initial temperature Tmax ; // amorphous (melt) state temperature
5.   final temperature Tmin ; // crystalline state temperature
6.   annealing scale k ; // annealing time at temperature ti
7.   temperature reduction ratio r (r ≈ 1, r < 1)
8. Define objective function f to be optimized
9. Define neighborhood function N ;
10. Generate random initial solution x0 uniformly from S
11. Take optimal solution x* = x0
12. Initialize temperature t = Tmax
13. Main loop:
14. While t > Tmin
15.   For j = 1 to k (annealing scale)
16.     generate a random neighbor x1 of the current solution x0: x1 ∈ N(x0)
17.     if x1 is better than x0 then
18.       accept x1 (x0 = x1)
19.       if x1 is better than x* then update x* (x* = x1)
20.     else accept x1 with the Metropolis probability
21.   end for
22.   reduce t (t = r*t)
23. end while
24. output (x*, f(x*))
25. end SA_ALGO

where 'accept x1 with the Metropolis probability' means:

If random(0,1) < exp(−Δf/t) then x0 = x1, where Δf = |f(x1) − f(x0)| is the energy variation that allows moving the metal atoms.
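The listing above can be turned into a short, runnable Python sketch. The function names, parameter values and the toy quadratic objective below are illustrative choices for this example, not the paper's implementation:

```python
import math
import random

def simulated_annealing(f, neighbor, x0, t_max, t_min, k, r):
    """Minimize f starting from x0, following the SA pseudocode above.

    neighbor(x) draws a random neighbor of x; t_max/t_min bound the
    temperature, k is the annealing scale (iterations per temperature
    level) and r < 1 the temperature reduction ratio.
    """
    x, best = x0, x0
    t = t_max
    while t > t_min:
        for _ in range(k):
            x1 = neighbor(x)
            delta = f(x1) - f(x)
            if delta < 0:                       # better solution: accept it
                x = x1
                if f(x1) < f(best):             # track the best-ever solution x*
                    best = x1
            elif random.random() < math.exp(-delta / t):
                x = x1                          # Metropolis acceptance of a worse move
        t *= r                                  # geometric cooling: t = r*t
    return best, f(best)

# Usage: minimize a simple quadratic with minimum at x = 2.
random.seed(42)  # for reproducibility
result, value = simulated_annealing(
    f=lambda x: (x - 2.0) ** 2,
    neighbor=lambda x: x + random.uniform(-0.1, 0.1),
    x0=0.0, t_max=10.0, t_min=1e-4, k=10, r=0.9)
```

At high temperatures nearly every move is accepted (random search); as t shrinks, only improving moves survive (descent), which is exactly the behaviour discussed below.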

The SA algorithm is considered a local search approach, developed as an improvement over the descent method, in which only better solutions are accepted. That process leads in most cases to a local optimum, especially when the objective function has a significant number of local optima. SA avoids this situation by accepting some degradation of the fitness function, simulating the Metropolis cooling process. Thus, SA has proved its efficiency in many situations. However, a bad adjustment of the SA parameters can considerably affect its effectiveness in terms of solution quality and processing time.

Optimal adjustment of the SA parameters

Adjusting the SA parameters consists of looking for the optimum value, leading to a good solution in reasonable processing time, of each of the following parameters:

- initial (maximum) temperature Tmax (to reach the melting point of the metal);
- temperature reduction function r, defined as t_{i+1} = r(t_i);
- cooling rate, that is, the annealing time c(t_i) (number of iterations) at the temperature t_i;
- neighborhood exploration process that computes a neighbor x(t_i) of the state x0 at the temperature t_i.

Since SA is inspired by the Metropolis process probability, its parameters need to be adjusted with respect to the variation of the real function g defined by g(t) = exp(−d/t), where t is a real positive variable and d a positive parameter (t represents the temperature and d represents the fitness variation Δf). A sample curve of g is represented in Figure 1 below:

Figure 1 - Curve of g(t) = exp(-d/t) for t>0, d>0

The interpretation of this curve in the semantics of the SA algorithm clearly shows that SA starts as a random search and terminates as a descent algorithm.
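This behaviour can be checked numerically. The snippet below is an illustrative sketch (the value d = 1 is chosen arbitrarily): the acceptance probability g(t) = exp(−d/t) is close to 1 at high temperature (random search) and close to 0 at low temperature (descent):

```python
import math

def g(t, d=1.0):
    """Metropolis acceptance probability exp(-d/t) for a fitness
    degradation d > 0 at temperature t > 0."""
    return math.exp(-d / t)

# High temperature: almost any degradation is accepted (~1).
print(round(g(1000.0), 4))
# Low temperature: degradations are almost never accepted (~0).
print(round(g(0.01), 4))
```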

In order to explore the search space uniformly, dmean is computed as the mean of a sufficiently large sample of Δf values obtained by generating m random solutions. Then, Tmax and Tmin are computed such that:

exp(−dmean/Tmax) = α, with α ≈ 1, α < 1, and exp(−dmean/Tmin) = β, with β ≈ 0, β > 0

(for instance, take α = 0.99 and β = 0.0001).

The algorithm below shows how dmean, Tmin, Tmax are computed:

1. Compute_Tmin_Tmax
2. s = 0
3. For i = 1 to m
4.   a = random(S)
5.   b = random(S)
6.   s = s + |f(a) − f(b)|
7. end for
8. dmean = s/m
9. Tmax = −dmean / log α
10. Tmin = −dmean / log β

The sample size m must be adjusted in accordance with the problem input size (for instance, in the travelling salesman problem with n cities, take m = 5*n, m = 10*n, ...), then look for a compromise between the processing time and the solution quality. The complexity of this algorithm is therefore linear.
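A minimal Python sketch of this computation is shown below; the quadratic objective and the uniform sampler stand in for a concrete problem and search space S, and are assumptions of this example:

```python
import math
import random

def compute_tmin_tmax(f, sample, m, alpha=0.99, beta=0.0001):
    """Estimate Tmax and Tmin from the mean fitness variation d_mean,
    following the Compute_Tmin_Tmax listing above.

    sample() draws a random solution from the search space S;
    alpha ~ 1 and beta ~ 0 are the target acceptance probabilities
    at Tmax and Tmin respectively.
    """
    s = 0.0
    for _ in range(m):
        a, b = sample(), sample()
        s += abs(f(a) - f(b))
    d_mean = s / m
    t_max = -d_mean / math.log(alpha)   # from exp(-d_mean/Tmax) = alpha
    t_min = -d_mean / math.log(beta)    # from exp(-d_mean/Tmin) = beta
    return t_min, t_max

# Usage on a toy 1-D search space S = [-5, 5].
random.seed(0)
t_min, t_max = compute_tmin_tmax(
    f=lambda x: x * x,
    sample=lambda: random.uniform(-5.0, 5.0),
    m=1000)
```

Note that the ratio t_max/t_min depends only on α and β (here log(0.0001)/log(0.99) ≈ 916), so the problem only scales the two temperatures through d_mean.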

To make optimal use of the intensification and diversification mechanisms, the temperature reduction must be large at the beginning and then decrease progressively. The best way to realize this variation is to define the reduction function r as a geometric sequence with a variable ratio ri, as follows: r_{i+1} = a*r_i and t_{i+1} = r_{i+1}*t_i, where a is a positive real such that r_0 ≈ 1, r_i < 1, and a ≈ 1, a > 1.
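The variable-ratio schedule can be sketched as follows. The values r0 = 0.5 and a = 1.05, and the cap keeping ri strictly below 1, are illustrative choices added for this example, not the paper's tuned settings:

```python
def variable_cooling(t_max, t_min, r0=0.5, a=1.05):
    """Generate the temperature schedule t_{i+1} = r_{i+1} * t_i with a
    variable ratio r_{i+1} = a * r_i (a > 1), so that the temperature
    drops quickly at first and ever more slowly later on."""
    schedule = [t_max]
    t, r = t_max, r0
    while t > t_min:
        r = min(a * r, 0.999)   # ratio grows toward, but stays below, 1
        t = r * t
        schedule.append(t)
    return schedule

sched = variable_cooling(t_max=100.0, t_min=0.01)
```

The first reduction is large (from 100 to 52.5 with these values) while the final reductions are tiny, matching the intended diversification-then-intensification behaviour.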

Application of OASA

In order to show the efficiency of this approach, we applied it first to minimize the function:

f(x) = 4 − 19.0167x + 36.39167x^2 − 25.2917x^3 + 8.041667x^4 − 1.19167x^5 + 0.066667x^6,

which was a sample study in Michel Bierlaire's Algorithm (MBA). This function has two local optima, x1 = 0.4052 and x2 = 3.1045, and one global optimum x* = 5.5541, as shown in Figure 2 below:
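The stated optima can be checked numerically with a plain grid search (an illustrative sketch, not the OASA algorithm itself), using the coefficients exactly as given in the text:

```python
def f(x):
    """Sample function from the MBA study, coefficients as given above."""
    return (4 - 19.0167 * x + 36.39167 * x**2 - 25.2917 * x**3
            + 8.041667 * x**4 - 1.19167 * x**5 + 0.066667 * x**6)

# Coarse grid search over [0, 6]; the global minimum reported in the
# text is near x* = 5.5541 with f(x*) close to -1.16.
xs = [i / 1000 for i in range(6001)]
x_best = min(xs, key=f)
```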

Figure 2 - Curve of f(x), with the local minimum at (0.4052, 0.7909) and the global minimum at (5.5541, −1.1628)

The table below gives the parameter values used in our OASA and in MBA:

Table 1 - Parameter values

Parameter                 | OASA           | MBA
--------------------------|----------------|---------------
Initial solution x0       | 3              | 3
Initial temperature Tmax  | Adjusted       | 10000
Final temperature Tmin    | Adjusted       | Not specified
Reduction factor          | Adjusted       | 0.9
Annealing rate            | 10             | 100
Neighborhood of x         | [x-0.1, x+0.1] | [x-0.1, x+0.1]

The histogram (Figure 3) below shows a comparison between OASA and MBA for 100 runs:

Figure 3 - Results for 100 runs

This graph clearly shows that adjusting the parameters brings a significant gain in terms of both solution quality and processing time. In the second part of our comparative study, the approach is applied to three usual optimization functions to be minimized, each of dimension n:

Rastrigin function: Rastrigin(x) = 10n + Σ_{i=1}^{n} [x_i^2 − 10cos(2πx_i)], x_i ∈ [−5.12, 5.12]. There are many extrema. The global minimum is Rastrigin(0) = 0.

Rosenbrock function: Rosenbrock(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i^2)^2 + (1 − x_i)^2], x_i ∈ R. There are many extrema. The global minimum is Rosenbrock(1, 1, ..., 1) = 0.

Sphere function: Sphere(x) = Σ_{i=1}^{n} x_i^2, x_i ∈ [−5.12, 5.12]. The global minimum is Sphere(0) = 0, where n is the dimension of the function.
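For reference, the three benchmark functions can be implemented directly, together with a check of the stated global minima:

```python
import math

def rastrigin(x):
    """Rastrigin(x) = 10n + sum(x_i^2 - 10 cos(2 pi x_i)); minimum 0 at x = 0."""
    return 10 * len(x) + sum(xi**2 - 10 * math.cos(2 * math.pi * xi) for xi in x)

def rosenbrock(x):
    """Rosenbrock(x) = sum 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2; minimum 0 at x = (1, ..., 1)."""
    return sum(100 * (x[i + 1] - x[i]**2)**2 + (1 - x[i])**2
               for i in range(len(x) - 1))

def sphere(x):
    """Sphere(x) = sum(x_i^2); minimum 0 at x = 0."""
    return sum(xi**2 for xi in x)

# The known global minima against which the deviation in Table 2 is measured:
n = 5
assert rastrigin([0.0] * n) == 0.0
assert rosenbrock([1.0] * n) == 0.0
assert sphere([0.0] * n) == 0.0
```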

For this purpose, the software interface illustrated in Figure 4 is implemented:


Figure 4 - Interface of the approach implementation

The table below summarizes the deviation ρ = |F(xopt) − F(x*)|, where F(xopt) is the global minimum of F and F(x*) is the minimum obtained using our approach OASA, for different values of the dimension n:


Table 2 - Results for the optimization functions

Dimension n | Function F | Deviation ρ
------------|------------|------------
2           | Rastrigin  | 0.000000
2           | Rosenbrock | 0.000000
2           | Sphere     | 0.000000
5           | Rastrigin  | 0.000000
5           | Rosenbrock | 0.000019
5           | Sphere     | 0.000000
20          | Rastrigin  | 0.000002
20          | Rosenbrock | 0.000041
20          | Sphere     | 0.000035
50          | Rastrigin  | 0.000105
50          | Rosenbrock | 0.000328
50          | Sphere     | 0.000087

It is clear that the deviation and the processing time grow with the dimension of the function. On the other hand, note that adding operations to the algorithm in order to adjust its parameters has a cost in computing time, but it allows the algorithm to converge faster thanks to the good use of the diversification and exploitation mechanisms, which is reflected in the number of iterations needed to reach the optimum. In summary, the time gained outweighs the cost spent on the adjusting operations.

Conclusion

In this paper, the critical issue of optimizing the performance of the Simulated Annealing (SA) algorithm is addressed through the optimal adjustment of its parameters. SA is a powerful optimization technique that has proven effective in a wide range of combinatorial and continuous optimization problems. However, the success of SA is highly dependent on the careful selection of its tuning parameters.

Our contribution to this area of research lies in proposing a novel and efficient approach for optimally adjusting simulated annealing parameters. Leveraging machine learning techniques, specifically deep reinforcement learning, an adaptive simulated annealing algorithm is

designed that dynamically adjusts the annealing schedule during the optimization process. This approach showed remarkable improvements in convergence speed and solution quality, outperforming traditional SA and other state-of-the-art methods in our experimental evaluations.

References

Bertsimas, D. & Nohadani, O. 2010. Robust optimization with simulated annealing. Journal of Global Optimization, 48, pp.323-334. Available at: https://doi.org/10.1007/s10898-009-9496-x.

Bertsimas, D. & Tsitsiklis, J.N. 1993. Simulated Annealing. Statistical Science, 8(1), pp.10-15. Available at: https://doi.org/10.1214/ss/1177011077.

Benvenuto, N., Marchesi, M. & Uncini, A. 1992. Applications of simulated annealing for the design of special digital filters. IEEE Transactions on Signal Processing, 40(2), pp.323-332. Available at: https://doi.org/10.1109/78.124942.

Bierlaire, M. 2006. Introduction a l'optimisation différentiable. Lausanne, Switzerland: EPFL Press. ISBN: 978-2-88074-669-8.

Chen, T.-Y. & Su, J.-J. 2002. Efficiency improvement of simulated annealing in optimal structural designs. Advances in Engineering Software, 33(7-10), pp.675-680. Available at: https://doi.org/10.1016/S0965-9978(02)00058-3.

Frausto-Solis, J., Román, E.F., Romero, D., Soberon, X. & Liñán-García, E. 2007. Analytically Tuned Simulated Annealing Applied to the Protein Folding Problem. In: Shi, Y., van Albada, G.D., Dongarra, J. & Sloot, P.M.A. (Eds.) Computational Science - ICCS 2007, 7th International Conference, Beijing, China, May 27-30. Berlin, Heidelberg: Springer, vol 4488. Available at: https://doi.org/10.1007/978-3-540-72586-2_53.

Gao, Y., Wang, C. & Liu, C. 2016. An Archived Multi-Objective Simulated Annealing Algorithm for Vehicle Routing Problem with Time Window. International Journal of u- and e- Service, Science and Technology, 9(12), pp.187-198. Available at: https://doi.org/10.14257/ijunesst.2016.9.12.17.

Hu, Q. & Lim, A. 2014. An iterative three-component heuristic for the team orienteering problem with time windows. European Journal of Operational Research, 232(2), pp.276-286. Available at: https://doi.org/10.1016/j.ejor.2013.06.011.

Ingber, L. 1989. Very fast simulated re-annealing. Mathematical and Computer Modelling, 12(8), pp.967-973. Available at: https://doi.org/10.1016/0895-7177(89)90202-1.

Ingber, L. 2000. Adaptive Simulated Annealing (ASA): Lessons learned. arXiv:cs/0001018. Available at: https://doi.org/10.48550/arXiv.cs/0001018.

Jeong, S.-J., Kim, K.-S. & Lee, Y.-H. 2009. The efficient search method of simulated annealing using fuzzy logic controller. Expert Systems with Applications, 36(3), pp.7099-7103. Available at: https://doi.org/10.1016/j.eswa.2008.08.020.


Jones, A.E.W. & Forbes, G.W. 1995. An adaptive simulated annealing algorithm for global optimization over continuous variables. Journal of Global Optimization, 6, pp.1-37. Available at: https://doi.org/10.1007/BF01106604.

Kim, S.-S., Baek, J.-Y. & Kang, B.-S. 2017. Hybrid Simulated Annealing for Data Clustering. Journal of Korean Society of Industrial and Systems Engineering, 40(2), pp.92-98. Available at: https://doi.org/10.11627/jkise.2017.40.2.092.

Kirkpatrick, S., Gelatt, C.D. & Vecchi, M.P. 1983. Optimization by Simulated Annealing. Science, 220(4598), pp.671-680. Available at: https://doi.org/10.1126/science.220.4598.671.

Lin, S.-W. & Yu, V.-F. 2012. A simulated annealing heuristic for the team orienteering problem with time windows. European Journal of Operational Research, 217(1), pp.94-107. Available at: https://doi.org/10.1016/j.ejor.2011.08.024.

Najid, N.M., Dauzere-Peres, S. & Zaidat, A. 2017. A modified simulated annealing algorithm for solving flexible job shop scheduling problem. In: IEEE International Conference on Systems, Man and Cybernetics, Yasmine Hammamet, Tunisia, 5, October 6-9. Available at: https://doi.org/10.1109/ICSMC.2002.1176334.

Pan, X., Xue, L., Lu, Y. & Sun, N. 2019. Hybrid particle swarm optimization with simulated annealing. Multimedia Tools and Applications, 78, pp.29921-29936. Available at: https://doi.org/10.1007/s11042-018-6602-4.

Rajasekaran, S. 2000. On Simulated Annealing and Nested Annealing. Journal of Global Optimization, 16, pp.43-56. Available at: https://doi.org/10.1023/A:1008307523936.

Saruhan, H. 2014. Differential evolution and simulated annealing algorithms for mechanical systems design. Engineering Science and Technology, an International Journal, 17(3), pp.131-136. Available at: https://doi.org/10.1016/j.jestch.2014.04.006.

Zhang, R. 2013. A Simulated Annealing-Based Heuristic Algorithm for Job Shop Scheduling to Minimize Lateness. International Journal of Advanced Robotic Systems, 10(4), pp.1-9. Available at: https://doi.org/10.5772/55956.


Paper received on: 20.10.2023.

Manuscript corrections submitted on: 03.03.2024.

Paper accepted for publishing on: 04.03.2024.

© 2024 The Author. Published by Vojnotehnicki glasnik / Military Technical Courier (www.vtg.mod.gov.rs, BTr.MO.ynp.cp6). This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/rs/).
