Complexification methods of interval forecast estimates in the problems on short-term prediction

Keywords: short-term prediction, complexification of forecast estimates, decision support, interval analysis



Complexification methods of interval forecast estimates in the problems on short-term prediction

We solved the problem of improvement of methodological base for a decision support system in the process of short-term prediction of indicators of organizational-technical systems by developing new, and adapting existing, methods of complexification that are capable of taking into consideration the interval uncertainty of expert forecast estimates. The relevance of this problem stems from the need to take into consideration the uncertainty of primary information, predetermined by the manifestation of NON-factors. Analysis of the prerequisites and characteristics of formalization of uncertainty of primary data in the interval form was performed, the merits of interval analysis for solving the problems of complexification of interval forecast estimates were identified. Brief information about the basic mathematical apparatus was given: interval arithmetic and interval analysis. The methods of complexification of forecast estimates were improved through the synthesis of interval extensions, obtained in accordance with the paradigm of an interval analysis. We found in the course of the study that the introduction of the analytical preference function made it possible to synthesize the model of complexification in a general way, by aggregating the classes of hybrid and selective models in a single form for the generation of consolidated predictions based on interval forecast estimates. This allows obtaining complexification predictions based on the interval forecast estimates, thereby ensuring accuracy of the consolidated short-term prediction. Critical analysis of the proposed methods was performed and recommendations on their practical application were developed. Recommendations for parametric setting of the analytic function of preferences were stated. Using the example, the adaptive properties of the interval model of complexification were shown.




UDC 658.5:004.94

DOI: 10.15587/1729-4061.2018.131939

COMPLEXIFICATION METHODS OF INTERVAL FORECAST ESTIMATES IN THE PROBLEMS ON SHORT-TERM PREDICTION

Yu. Romanenkov
Doctor of Technical Sciences, Associate Professor
Department of Management*
E-mail: KhAI.management@ukr.net

M. Danova
PhD
Department of Software Engineering*

V. Kashcheyeva
PhD, Associate Professor
Department of Finance*

O. Bugaienko
PhD
Department of Chemistry, Ecology and Expert Technologies*

M. Volk
PhD, Associate Professor
Department of Electronic Computers
Kharkiv National University of Radio Electronics
Nauky ave., 14, Kharkiv, Ukraine, 61166

M. Karminska-Bielobrova
PhD
Department of Production Organization and Personnel Management**

O. Lobach
PhD, Associate Professor
Department of Strategic Management**

*N. E. Zhukovsky National Aerospace University "Kharkiv Aviation Institute"
Chkalova str., 17, Kharkiv, Ukraine, 61070

**National Technical University "Kharkiv Polytechnic Institute"
Kyrpychova str., 2, Kharkiv, Ukraine, 61002

1. Introduction

Given the development of modern information and communication technologies, the arrays of heterogeneous data on the monitoring of organizational and technical systems (OTS), accumulated in specialized databases, mainly as temporal series, are continuously increasing. These data characterize the dynamics of multifactor processes that are difficult to formalize, and systems that have both subjective and objective uncertainty.

The desire to use the accumulated information to solve the problems of control over complex OTS leads to the necessity of its cleaning and transformation in order to obtain forecast estimates of the indicators, which are essential for making management decisions.

Realization of the problems of short-term prediction under modern conditions is impossible without using applied information technologies, in particular decision support systems (DSS). Modern DSS are multifunctional information technologies, possessing a vast base of models, methods and means for solving specific problems of control and decision making. The main functions of such technologies are correct transformation of primary management information to the form that is convenient for a decision maker (DM), as well as subsequent processing in order to assess and substantiate an integral management decision.


It is obvious that the range and complexity of the methods of specialized DSS directly depends upon the scale and technological complexity of OTS. A manufacturing company [1], an organization [2], a branch [3, 4], a state [5, 6] or even a global [7] economic system can be the object.

An equally important factor, determining the features and the nomenclature of methodical provision of DSS, is the nature of primary information, first of all, objective uncertainty. An important contradiction that exists in the process of decision making support under uncertainty should be taken into consideration. A DM seeks to reduce or completely remove uncertainty during problem-solving process. In turn, a DSS developer (analyst), by contrast, seeks to transform correctly and thereby preserve uncertainty of input data until reaching a decision-making point.

For this reason, there are a fairly large number of classic and derived decision-making criteria that take into consideration the attitude of DM to risk [8].

One of the directions of improvement of the methodic base of DSS [2] is to create new and adapt existing methods for processing information for taking into consideration the features of primary data. Solving a given topical scientific and applied problem will make it possible to improve the efficiency of automation of short-term prediction processes.

2. Literature review and problem statement

The process of short-term prediction typically includes a stage of an a priori estimation of parameters of the state of a decision-making object, which is characterized by a situation where an analyst has an access to predictive information from multiple sources (or obtained by different methods). This leads to the need to solve the problem of complexification of forecast estimates, received from several sources [9], and under conditions of objective uncertainty of primary data.

This problem can be attributed to the technologies of so-called "gray" management analysis [10], characterized by partial uncertainty of management information.

The requirement for completeness, timeliness and optimality of the resulting decision is transformed into the need to take into consideration the multi-criteria nature and the uncertainty of source information in the model of a decision-making problem [11].

The problem of complexification, first stated by Laplace, is well substantiated and developed in papers [12, 13].

Since solving the problem of complexification does not imply, by definition, the uniqueness of a solution, modern publications propose several methods for the complexification of forecast estimates [14].

The methods can be both static [15, 16] and dynamic in character, that is, take into consideration the dynamics of accuracy of sources of predictive information [17]. A criterion of selection of complexification weight coefficients in all cases is the accuracy of sources, expressed in the form of statistical characteristics. It is obvious that effectiveness of a particular complexification method can be evaluated only a posteriori and depends on the capabilities of a complexification mechanism to adapt, as well as on the nature of the observed process.

In the case when it is not possible to obtain pointwise forecast estimates as a result of expert examination, the use of known methods for complexification becomes impossible. Reducing interval estimates to pointwise ones in this case does not always adequately take into consideration the specificity of a problem [18]. Comparison of interval alternatives through the utility function, reflecting the inclination of a DM to taking risks [19], does not make it possible to achieve generality of the solution, translating uncertainty of source data into uncertainty of the form of the utility function itself.

Paper [20] proposed a method for direct comparison of interval magnitudes. It allows selecting the best interval estimate based on the preferences of a decision maker. In the case of incomparability, it is recommended to postpone the decision due to the danger of committing an error of the second kind.

That is why there is the need to develop methods for processing interval data in order to take into consideration uncertainty until a decision is made.

Thus, an analysis of publications showed that the search for ways to adapt existing methods of complexification to the interval form of forecast estimates is a relevant task.

3. The aim and objectives of the study

The aim of the present research is to develop new, and adapt existing, methods of complexification of forecast estimates that are capable of taking into consideration the uncertainty of primary data. This will make it possible to improve the methodical base of DSS in the process of short-term prediction.

To accomplish the aim, the following tasks have been set:

- to analyze the prerequisites and patterns in the formalization of uncertainty of primary expert data in the interval form;

- to improve methods of forecast estimates complexification through interval expansion, which would allow obtaining consolidated predictions based on the interval forecast estimates;

- to perform critical analysis and develop recommendations regarding practical application of the proposed methods.

4. Analysis of prerequisites and patterns in the formalization of primary data uncertainty in the interval form

The concept of uncertainty in modern science is inextricably associated with the process of operation of any non-isolated systems under actual conditions. This fact led to the emergence of a modern paradigm, in accordance with which uncertainty is considered as a fundamental property of the system itself and not only of the external environment.

Sources of uncertainty are predetermined by so-called NON-factors [21] and define a variety of its forms (Fig. 1).

The approaches applied to processing data with uncertainty can be divided into three main groups [23] (Table 1):

- probabilistic-statistical approach;

- approach based on a fuzzy set theory;

- approach based on the interval analysis.

Selection of the appropriate approach is predetermined by the uncertainty sources and the form of representation of primary data.


Fig. 1. Classification tree of uncertainty [22]

Table 1

Mathematical apparatus of formalization of data uncertainty

Knowledge about the process | Mathematical apparatus
Complete parametric uncertainty | Functional (symbolic) analysis
Boundaries of parameters are known | Interval analysis
Membership functions of parameters are known | Fuzzy set theory
Statistical characteristics of parameters are known | Probabilistic-statistical analysis
Complete parametric certainty | Numerical analysis

The basis for considering the expert forecast estimates in the interval form is formed by the following circumstances:

1) in the process of short-term prediction, estimates in the interval form can be synthesized in a natural way, that is, as a result of fulfilling a prediction task [24];

2) results of measuring the parameters of the system, direct or indirect, performed with errors (strictly speaking, the results of all measurements), can be represented in the interval form [25];

3) if at least one parameter of a model is given in the interval form, all parameters of the model must be reduced to the interval form, as the least complex form of describing parametric uncertainty, in order to preserve data homogeneity;

4) interval models are more preferable than the probabilistic-statistical ones in the case of making one-moment single decisions [26];

5) the apparatus of interval analysis has proved its effectiveness in solving various scientific and practical tasks [27];

6) interval algorithms typically do not require specialized tools for software implementation.

5. Brief information on interval arithmetic and interval analysis

By interval $[a] = [\underline{a}, \overline{a}]$ we imply a closed bounded subset of $\mathbb{R}$ of the form

$$[\underline{a}, \overline{a}] = \{x \in \mathbb{R} \mid \underline{a} \le x \le \overline{a}\}$$ [28],

which can be described by the following characteristics:

$\underline{a} = \inf[a]$ is the left end of interval $[a]$;

$\overline{a} = \sup[a]$ is the right end of interval $[a]$;

$\mathrm{mid}[a] = \dfrac{\underline{a} + \overline{a}}{2}$ is the middle (median) of interval $[a]$;

$\mathrm{wid}[a] = \overline{a} - \underline{a}$ is the width of interval $[a]$.

For the two intervals $[a] = [\underline{a}, \overline{a}]$ and $[b] = [\underline{b}, \overline{b}]$

in classical interval arithmetic ($[a],[b] \in \mathbb{IR}$), the following operations are defined:

$$[a] + [b] = [\underline{a} + \underline{b},\ \overline{a} + \overline{b}];$$ (1)

$$[a] - [b] = [\underline{a} - \overline{b},\ \overline{a} - \underline{b}];$$ (2)

$$[a]\,[b] = [\min\{\underline{a}\underline{b}, \underline{a}\overline{b}, \overline{a}\underline{b}, \overline{a}\overline{b}\},\ \max\{\underline{a}\underline{b}, \underline{a}\overline{b}, \overline{a}\underline{b}, \overline{a}\overline{b}\}];$$ (3)

$$[a] / [b] = [a] \cdot [1/\overline{b},\ 1/\underline{b}],\quad 0 \notin [b].$$ (4)

Interval arithmetic operations have the following properties:

$$([a] + [b]) + [c] = [a] + ([b] + [c]);$$ (5)

$$([a][b])[c] = [a]([b][c]);$$ (6)

$$[a] + [b] = [b] + [a];$$ (7)

$$[a][b] = [b][a];$$ (8)

$$([a] + [b])[c] \subseteq [a][c] + [b][c].$$ (9)

The distance between two intervals $[a],[b] \in \mathbb{IR}$ is determined by the magnitude

$$\mathrm{dist}([a],[b]) = \max\{|\underline{a} - \underline{b}|,\ |\overline{a} - \overline{b}|\} = \rho([a],[b])$$ (10)

and has the following properties:

$$\mathrm{dist}([a],[b]) \ge 0;$$ (11)

$$\mathrm{dist}([a],[b]) = 0 \ \text{when}\ [a] = [b];$$ (12)

$$\mathrm{dist}([a],[b]) = \mathrm{dist}([b],[a]);$$ (13)

$$\mathrm{dist}([a],[c]) \le \mathrm{dist}([a],[b]) + \mathrm{dist}([b],[c]).$$ (14)
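For illustration, the operations (1)-(4), the characteristics mid and wid, and the distance (10) can be coded directly. The following Python sketch only restates the classical rules quoted above; the class name and helper methods are illustrative and not taken from the paper.

```python
# Minimal sketch of classical interval arithmetic (1)-(4) and the distance (10).
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float   # left end, inf[a]
    hi: float   # right end, sup[a]

    def mid(self) -> float:
        # middle (median) of the interval
        return (self.lo + self.hi) / 2

    def wid(self) -> float:
        # width of the interval
        return self.hi - self.lo

    def __add__(self, other: "Interval") -> "Interval":      # formula (1)
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other: "Interval") -> "Interval":      # formula (2)
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other: "Interval") -> "Interval":      # formula (3)
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other: "Interval") -> "Interval":  # formula (4)
        if other.lo <= 0 <= other.hi:
            raise ZeroDivisionError("0 must not belong to the divisor interval")
        return self * Interval(1 / other.hi, 1 / other.lo)

    def scale(self, w: float) -> "Interval":
        # multiplication by a non-negative scalar weight, as used in complexification
        return Interval(w * self.lo, w * self.hi)

def dist(a: Interval, b: Interval) -> float:
    # distance between two intervals, formula (10)
    return max(abs(a.lo - b.lo), abs(a.hi - b.hi))

# example: [1, 2] + [3, 5] = [4, 7]; dist([1, 2], [3, 5]) = 3
print(Interval(1, 2) + Interval(3, 5), dist(Interval(1, 2), Interval(3, 5)))
```

Weighted sums of intervals of the kind used below for complexification reduce to the add and scale operations of such a class.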

The key difference between classical interval arithmetic and interval analysis is in the following.

In classic interval arithmetic, the distribution law is not observed, there are no inverse elements, similar terms cannot be reduced within its frameworks [27]. This leads to that the technique of symbol transformations is lost during formalization of operations with intervals.

The main objective of interval analysis, by contrast, is not automation of computing, but rather finding the region of possible result values, taking into consideration structures of functions and data, assigned in symbolic form.

Within this approach, interval magnitudes are considered at the intermediate stages of calculations and analysis. Only at the last stage of decision-making, if necessary, are they transformed into pointwise solutions. This makes it possible to preserve the completeness of information on the set of possible solutions up to the last moment [25].


From the formal point of view, description of uncertainty by an interval is a particular case of its description by a fuzzy set. In interval analysis, the membership function of a fuzzy set has a specific form: it is equal to 1 within some interval and to 0 outside it (Fig. 2):

$$\mu(x) = \begin{cases} 1, & \underline{x} \le x \le \overline{x}, \\ 0, & x < \underline{x},\ x > \overline{x}. \end{cases}$$

Such a membership function is described by only two parameters (the interval boundaries). This simplicity of description makes the mathematical apparatus of interval analysis more transparent than the apparatus of fuzzy set theory in the general case. This, in turn, allows a researcher to move further than when using membership functions of arbitrary form [29].

Fig. 2. Membership function of an interval number
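As a minimal illustration of Fig. 2, the interval membership function can be written in two lines; the function name is arbitrary.

```python
# Interval membership function from Fig. 2: 1 inside the interval, 0 outside.
def interval_membership(x: float, lo: float, hi: float) -> float:
    return 1.0 if lo <= x <= hi else 0.0

print(interval_membership(2.5, 2.0, 3.0), interval_membership(3.5, 2.0, 3.0))  # 1.0 0.0
```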

6. Development of interval extensions of the methods for complexification of forecasting estimates

Problem statement. Assume that at the current moment t=T a researcher has available interval forecast estimates of an OTS parameter for moment t=T+1, obtained from different sources (or by different methods), N in total:

$$[\tilde{x}_{i}] = [\underline{\tilde{x}}_{i},\ \overline{\tilde{x}}_{i}],\quad i = 1, \ldots, N.$$ (15)

It is required to synthesize a consolidated interval forecast estimate by the complexification of interval estimates of sources.

Assumptions and constraints. Let us state main assumptions and constraints of the set problem:

1. The width of the intervals does not exceed 20 % of the mean values of the intervals, which corresponds to a realistic technical assignment for expert examination. This prevents the degeneration of the mathematical model of complexification from an interval model into a purely analytical one with complete parametrical uncertainty (Table 1).

2. The basis is a paradigm of the interval analysis, taking into consideration, in addition to the rules of classic interval arithmetic, the physical sense and logic of analytic transformations of the mathematical model of complexification.

3. It is necessary to ensure that each of the developed interval methods of complexification, in the extreme case (when interval estimates narrow down to pointwise ones), is reduced to the appropriate method of complexification of point estimates.

4. Forecast estimates are considered non-biased until the opposite is substantiated.

5. The history of multiple evaluation is available for accumulation and statistical processing.

6. The consolidated interval forecast estimate belongs to the set of superpositions of the initial particular estimates:

$$[\tilde{x}_{\Sigma}] = \sum_{i=1}^{N} w_{i}\,[\tilde{x}_{i}],$$ (16)

where $w_{i}$ are the weight factors of the complexification model, $i = 1, \ldots, N$.

Approaches to solution. Let us discuss the approaches to solving a problem of interval expansion of the methods for complexification.

1. Averaging of forecast estimates. Obviously, the simplest variant is the selection of weight factors wi equal to:

$$w_{i} = \frac{1}{N},\quad i = 1, \ldots, N.$$ (17)

In this case, the consolidated interval forecast estimate is equal to

$$[\tilde{x}_{\Sigma}] = \frac{1}{N}\sum_{i=1}^{N}[\tilde{x}_{i}].$$ (18)

It is obvious that at

$$\mathrm{wid}[\tilde{x}_{i}] \to 0,\quad i = 1, \ldots, N,$$

formula (18) is reduced to the formula for the simple mean [9, 17].

2. Weighed complexification. In the case when we know the results of previous estimation, that is, the magnitudes of absolute deviations in the interval form for moment t=T,

$$[\Delta_{i}] = [\tilde{x}_{i}] - x_{T} = [\underline{\Delta}_{i},\ \overline{\Delta}_{i}],\quad i = 1, \ldots, N,$$ (19)

the problem of quantitative comparison of interval numbers (intervals) occurs.

The essence is to determine a quantitative measure of preference of one interval number over the other. Application of classic interval arithmetic in this case does not eliminate the problem, but rather aggravates it, since the difference between interval numbers is itself an interval number. Strictly speaking, the difference between two two-parameter mathematical objects can be expressed as an object with a number of parameters that is not less than two.

However, prospects for practical application of interval analysis compel researchers to seek approaches to solving this problem.

For example, the author of paper [27] formalized the problem of comparing interval numbers as follows.

Let us plot on the coordinate axes the segments corresponding to the intervals $[a] = [\underline{a}, \overline{a}]$ and $[b] = [\underline{b}, \overline{b}]$ (Fig. 3).

Fig. 3. Graphical interpretation of the problem of comparing two interval numbers (according to Voshchinin)

The ratios of the areas

$$\mu_{1} = \frac{S_{1}}{S_{1} + S_{2}} \quad \text{and} \quad \mu_{2} = \frac{S_{2}}{S_{1} + S_{2}}$$

are proposed to be considered as the levels of reliability of the hypotheses $H_{1}: a > b$ and $H_{2}: a < b$, respectively, where

$$S_{1} = \begin{cases} \tfrac{1}{2}(\overline{a} - \underline{b})^{2}, & \underline{a} < \underline{b}, \\ (\overline{a} - \underline{a})\left[\tfrac{1}{2}(\underline{a} + \overline{a}) - \underline{b}\right], & \underline{a} > \underline{b},\ \overline{a} < \overline{b}, \\ (\overline{a} - \underline{a})(\overline{b} - \underline{b}) - \tfrac{1}{2}(\overline{b} - \underline{a})^{2}, & \overline{a} > \overline{b}, \end{cases} \qquad S_{2} = (\overline{a} - \underline{a})(\overline{b} - \underline{b}) - S_{1}.$$ (20)

It is obvious that the magnitudes $\mu_{1}$ and $\mu_{2}$ act as a measure of validity of the hypotheses on the reciprocal arrangement of the two numbers within the intervals; however, they cannot be used as a quantitative measure of the relation between these numbers.

In addition, it should be taken into consideration that

$$\mu(H: a > b) \neq \mu(H: a - b > 0),$$ (21)

that is, "extra" arithmetic operations with interval numbers can distort the result.

The second way was proposed in papers [20, 30] and is associated with correction of the interval logic. By generalizing certain close, but not strictly identical, variants of logical relations between interval numbers, it is possible to obtain a coherent logical system, which, however, fails in some particular cases.

Another option for lax formalization of the problem of comparing interval numbers is to use the magnitude of the distance between interval numbers (10) as a comparison measure. In this case, it becomes fundamentally possible to construct and analyze a graph with interval numbers in its vertices; however, lax compliance with the distribution logic makes practical application of this approach difficult.
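To make the comparison measure (20) concrete, the sketch below estimates $S_1$, $S_2$ and the reliabilities $\mu_1$, $\mu_2$ numerically, directly from the geometric construction of Fig. 3 (the rectangle $[a]\times[b]$ cut by the line $a=b$) rather than from the closed-form piecewise expressions; the function name and the example intervals are illustrative only.

```python
# Numerical estimate of the reliabilities mu1 (H1: a > b) and mu2 (H2: a < b)
# from the geometric construction of Fig. 3: S1 is the part of the rectangle
# [a] x [b] where a > b.  Midpoint-rule integration replaces the piecewise
# closed form (20).
def hypothesis_reliability(a, b, n=100_000):
    """a = (a_lo, a_hi), b = (b_lo, b_hi); returns (mu1, mu2)."""
    a_lo, a_hi = a
    b_lo, b_hi = b
    rect = (a_hi - a_lo) * (b_hi - b_lo)           # full area of the rectangle
    dx = (a_hi - a_lo) / n
    s1 = 0.0
    for k in range(n):
        x = a_lo + (k + 0.5) * dx                  # sweep over [a]
        s1 += max(0.0, min(x, b_hi) - b_lo) * dx   # length of the part of [b] below x
    s2 = rect - s1
    return s1 / rect, s2 / rect

mu1, mu2 = hypothesis_reliability((2.0, 5.0), (3.0, 6.0))
print(round(mu1, 3), round(mu2, 3))   # ~0.222 0.778: "a < b" is the more reliable hypothesis
```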

The proposed approach. To determine a quantitative measure of the proximity of interval errors to zero, we introduce an even function $u(\Delta)$ that is non-negative on the entire real axis and monotonically decreasing in $|\Delta|$ (Fig. 4).


Fig. 4. Example of function of preferences of DM relative to absolute forecast error

The function shown in Fig. 4 represents the preferences of a DM relative to the values of forecast errors. Having a particular form of the dependence $u(\Delta)$, it becomes possible to introduce a quantitative indicator of the proximity of an interval estimate to zero. The height of the rectangle that is equivalent in area to the area under the function $u(\Delta)$ on the error interval can be accepted as such a quantitative indicator (Fig. 5):

$$u_{i}^{*} = \frac{1}{\overline{\Delta}_{i} - \underline{\Delta}_{i}} \int_{\underline{\Delta}_{i}}^{\overline{\Delta}_{i}} u(\Delta)\, d\Delta.$$ (22)

Fig. 5. Graphical interpretation of the measure of proximity of an interval estimate to zero

It is obvious that in this case the set of interval errors (19) can be ranked quantitatively with the help of the indicators $u_{i}^{*}$, $i = 1, \ldots, N$.

After normalization of the indicators $u_{i}^{*}$, we obtain the weight factors $w_{i}$ of the complexification system as follows:

$$w_{i} = \frac{u_{i}^{*}}{\sum_{j=1}^{N} u_{j}^{*}},\quad i = 1, \ldots, N.$$ (23)

It is possible to make sure that at $u(\Delta) = 1/|\Delta|$ and at

$$\mathrm{wid}[\Delta_{i}] \to 0,\quad i = 1, \ldots, N,$$

the factors (23) appear to be equal to the corresponding factors for pointwise weighed complexification [9, 17].

Fig. 6 shows some forms of the dependence $u(\Delta)$, among which a DM can select a suitable one for a particular study.

Fig. 6, c shows that selection of the appropriate form, for example, the piecewise-linear function

$$u(\Delta) = \begin{cases} k(\Delta^{*} - |\Delta|), & |\Delta| \le \Delta^{*}, \\ 0, & |\Delta| > \Delta^{*}, \end{cases}$$

makes it possible to exclude from the complexification model those sources whose error exceeds the assigned value $\Delta^{*}$; this transfers the complexification model into the class of selective models [15].

Fig. 6. Forms of functions of DM preferences relative to prediction errors: a — inversely proportional; b — inversely quadratic; c — piecewise-linear

3. Dynamic complexification. In situations when estimation is carried out periodically, there arises the possibility to accumulate and evaluate the error statistics for each source and thus to take into consideration the dynamics in accuracy of each source. The idea of this approach for pointwise forecast estimates is outlined in [2, 17] and allows interval extension.

To do this, we form for each source a temporal series of interval values of variances of forecast estimates $\{[\Delta^{2}_{ij}]\}$, where $i$ is the number of the time interval and $j$ is the source number.

By analyzing the temporal series of interval values of variances of forecast estimates, it is possible to identify trends in the dynamics of variance of the forecast estimates for each source. To solve this problem, the mathematical apparatus described, for example, in [31] for constructing interval-statistical models can be applied.

Application of the interval model of exponential smoothing is probably more promising within the framework of the examined problem [32].

In both cases, as a result of the analysis of the temporal series of interval data, the interval estimate of variance at moment t=T+1 can be obtained. Having applied the above-described comparison procedure, based on the selection of a preference function $u(\Delta^{2})$, it is possible to determine the complexification coefficients taking into consideration the dynamics of accuracy of the sources.

It is easy to make sure that, when choosing the procedure described in [32], at $u(\Delta^{2}) = 1/\Delta^{2}$ and

$$\mathrm{wid}[\tilde{x}_{i}] \to 0,$$

the interval variant of dynamic complexification is reduced to the pointwise one [17].
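A possible shape of the dynamic step is sketched below. Since the interval exponential-smoothing procedure of [32] is not reproduced in this paper, the sketch makes a simplifying assumption: the lower and upper bounds of each source's interval squared errors are smoothed separately to obtain an interval variance estimate for the next step.

```python
# Hedged sketch of the dynamic step: smooth the lower and upper bounds of each
# source's interval squared errors separately (an assumption made here for
# illustration; the interval exponential-smoothing model itself is described in [32]).
def smooth_interval_series(series, alpha=0.3):
    """series: list of (lo, hi) interval squared errors; returns the smoothed interval."""
    lo, hi = series[0]
    for next_lo, next_hi in series[1:]:
        lo = alpha * next_lo + (1 - alpha) * lo
        hi = alpha * next_hi + (1 - alpha) * hi
    return lo, hi

# interval squared-error history of one source over four estimation steps
history = [(0.4, 1.2), (0.3, 0.9), (0.5, 1.4), (0.2, 0.8)]
print(smooth_interval_series(history))   # interval variance estimate for t = T + 1
# The comparison procedure with a preference function of the variance is then
# applied to these interval estimates to obtain the weights, as in the static case.
```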

The proposed methods can be included in the methodological support of DSS (Fig. 7).

Software implementation of the proposed methods does not cause any difficulty and can be conducted even in a spreadsheet editor.

An example of the complexification of estimates obtained from five sources is given in Table 2.

Table 2 shows that a consolidated forecast estimate shifts towards the more accurate sources. Thus, the property of adaptability of a complexification model manifests itself. The model is adjusted at each step of prediction. Structural setting involves selection of the form of a DM preference function; parametric setting involves selection of the coefficients of the function itself.


Fig. 7. Process diagram of methods for the complexification of interval forecast estimates in the course of short-term prediction of OTS indicators

Table 2

Example of complexification of forecast estimates from five sources

| Source number i | 1 | 2 | 3 | 4 | 5 |
| Forecast estimates at moment t=T+1 | [9, 10] | [7, 8] | [8, 10] | 7 | [4, 5] |
| Average consolidated forecast estimate at moment t=T+1 | [7, 8] | | | | |
| Forecast estimates at moment t=T | [9, 10] | [7, 8] | [6, 8] | [4, 5] | 6 |
| Factual value at moment t=T | 8.1 | | | | |
| Absolute interval error at moment t=T | [-1.9, -0.9] | [0.1, 1.1] | [0.1, 2.1] | [3.1, 4.1] | 2.1 |
| $u(\Delta) = 1/\lvert\Delta\rvert$: $u_i^*$ | 0.75 | 2.40 | 1.52 | 0.28 | 0.48 |
| $w_i$ | 0.138 | 0.442 | 0.281 | 0.052 | 0.087 |
| Consolidated forecast estimate at moment t=T+1 | [7.29, 8.52] | | | | |
| $u(\Delta) = 1/\Delta^2$: $u_i^*$ | 0.58 | 9.09 | 4.76 | 0.079 | 0.23 |
| $w_i$ | 0.040 | 0.617 | 0.323 | 0.005 | 0.015 |
| Consolidated forecast estimate at moment t=T+1 | [7.36, 8.67] | | | | |
| $u(\Delta) = 2 - \lvert\Delta\rvert$ for $\lvert\Delta\rvert < 2$, $0$ for $\lvert\Delta\rvert \ge 2$: $u_i^*$ | 0.60 | 1.40 | 0.90 | 0 | 0 |
| $w_i$ | 0.207 | 0.482 | 0.311 | 0 | 0 |
| Consolidated forecast estimate at moment t=T+1 | [7.72, 9.04] | | | | |
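The first block of Table 2 (the inversely proportional preference function $u(\Delta)=1/|\Delta|$) can be reproduced with a few lines of Python; the integral in (22) is taken numerically, point estimates are treated as degenerate intervals, and all names are illustrative.

```python
# Recomputation of the first block of Table 2 for u(delta) = 1/|delta|.
# The error intervals are assumed not to contain zero (cf. property (24) below).
def u_star(err_lo, err_hi, u=lambda d: 1.0 / abs(d), n=10_000):
    """Mean value of the preference function over the absolute-error interval, formula (22)."""
    lo, hi = sorted((abs(err_lo), abs(err_hi)))
    if hi == lo:                                  # degenerate (pointwise) error
        return u(lo)
    h = (hi - lo) / n
    return sum(u(lo + (k + 0.5) * h) for k in range(n)) * h / (hi - lo)

# interval forecast estimates for t = T + 1 and absolute interval errors at t = T (Table 2)
forecasts = [(9, 10), (7, 8), (8, 10), (7, 7), (4, 5)]
errors    = [(-1.9, -0.9), (0.1, 1.1), (0.1, 2.1), (3.1, 4.1), (2.1, 2.1)]

scores  = [u_star(lo, hi) for lo, hi in errors]            # ~ 0.75, 2.40, 1.52, 0.28, 0.48
weights = [s / sum(scores) for s in scores]                # formula (23)
consolidated = (sum(w * lo for w, (lo, _) in zip(weights, forecasts)),
                sum(w * hi for w, (_, hi) in zip(weights, forecasts)))

print([round(s, 2) for s in scores])
print([round(w, 3) for w in weights])
print([round(c, 2) for c in consolidated])                 # ~ [7.29, 8.52]
```

Substituting $u(\Delta)=1/\Delta^{2}$ or the piecewise-linear form for the lambda should reproduce the second and third blocks of the table in the same way.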

7. Discussion of results: critical analysis and recommendations on the practical implementation of the proposed methods

First, it is necessary to note the following features that manifest themselves when using the proposed methodological apparatus.

1. Strictly speaking, the complexification operation is effective only when the estimates of all sources are unbiased. It is not possible to ensure this in most practical problems, apart from the cases when the methodical base of the source is open. However, the adaptive features of the complexification system allow gaining in accuracy in the case of repeated use of the complexification procedure [17].

2. The important problem of sources independence is beyond the scope of this paper. This issue should be attributed obviously to the short-term prediction technology itself. We will note that the technology for the estimation of OTS parameters implies a preliminary data analysis only, within which it is appropriate to verify the independence of sources (that is, to measure the correlation between their estimates).

3. When using the interval values of variances, it is necessary to remember that at squaring a zero-containing interval number, accuracy is lost arithmetically. In this case, it seems promising to divide interval estimates into two groups in order to conduct separate analysis of zero-containing intervals.

4. In the case of using a preference function u(Δ) with a vertical asymptote (for example, Fig. 6, a, b), it should be taken into consideration that its property

$$\lim_{\Delta \to 0^{-}} u(\Delta) = \lim_{\Delta \to 0^{+}} u(\Delta) = \infty$$ (24)

makes the indicator $u_{i}^{*}$ non-informative for the intervals containing zero.

In practice, this effect can be easily counterbalanced either analytically (by an argument shift) or algorithmically (by selecting another kind of preference function u(Δ) for zero-containing intervals).

The merits of the developed methods include their following features:

1. The proposed mathematical apparatus makes it possible to synthesize the model of complexification in a general way, aggregating the classes of hybrid [33] and selective models in a single analytical form.

2. Implementation of algorithms that realize the proposed methods is easy and its results are visual, which is important in the process of managerial decision-making.

8. Conclusions

1. Approaches to processing data with uncertainty were analyzed. It was shown that most problems of short-term prediction with uncertain original data can be formalized in the interval form. The advantages of interval analysis for solving problems of complexification of interval forecast estimates were identified. In particular, the paradigm of interval analysis allows taking into consideration, in addition to the rules of classical interval arithmetic, the logic and physical sense of analytic transformations of a mathematical model of complexification.

2. Interval extensions of the methods for complexification of forecast estimates were obtained. They make it possible to get consolidated predictions based on interval forecast estimates and to reduce the laboriousness of the expert estimation procedure. The analytic function of DM preferences was proposed, which allows aggregating the classes of hybrid and selective prediction models in a single analytical form.

3. Recommendations on the practical implementation of the proposed methods were compiled. Specifically, recommendations for parametric setting of preference functions depending on the location of interval estimates were formulated. By using the example, it was shown how a consolidated forecast estimate is shifted towards more accurate sources, which illustrates the adaptive properties of the interval models of complexification.

References

1. Oklander M. A., Yashkina O. I. Kontseptsiya formuvannia systemy marketynhovykh doslidzhen innovatsiy mashynobudivnoho pidpryiemstva // Ekonomist. 2013. Issue 11 (325). P. 52-56.

2. Romanenkov Y., Vartanian V. Formation of prognostic software support for strategic decision-making in an organization // Eastern-European Journal of Enterprise Technologies. 2016. Vol. 2, Issue 9 (80). P. 25-34. doi: 10.15587/1729-4061.2016.66306

3. Yashkina O. Natsionalna systema stratehichnykh marketynhovykh doslidzhen naukovo-tekhnolohichnoho rozvytku // Ekonomist. 2013. Issue 1. P. 26-29.

4. Malitskyi B. A., Popovych O. S., Soloviov V. P. Metodychni rekomendatsiyi shchodo provedennia prohnozno-analitychnoho doslid-zhennia v ramkakh Derzhavnoi prohramy prohnozuvannia naukovo-tekhnolohichnoho ta innovatsiynoho rozvytku Ukrainy. Kyiv: Feniks, 2004. 52 p.

5. Shostak I. V., Danova M. A., Romanenkov Yu. A. Informacionnaya tekhnologiya podderzhki prinyatiya ekspertnyh resheniy v nacional'nyh Forsayt-issledovaniyah // Komunalne hospodarstvo mist. 2015. Issue 123. P. 58-67.

6. Cuhls K. Foresight in Germany // The Handbook of Technology Foresight. Cheltenham: Edward Elgar, 2008. P. 131-153.

7. Johnston R., Sripaipan C. Foresight in Industrialising Asia // The Handbook of Technology Foresight. Cheltenham: Edward Elgar, 2008. P. 333-356.

8. Bugas D. Modelling the expert's preferences in decision-making under complete uncertainty // Eastern-European Journal of Enterprise Technologies. 2016. Vol. 5, Issue 4. P. 12-17. doi: 10.15587/1729-4061.2016.81306

9. Bidyuk P. I., Gasanov A. S., Vavilov S. E. Analiz kachestva ocenok prognozov s ispol'zovaniem metoda kompleksirovaniya // Systemni doslidzhennia ta informatsiyni tekhnolohiyi. 2013. Issue 4. P. 7-16.

10. Buravcev A. V. Seriy upravlencheskiy analiz // Perspektivy Nauki i Obrazovaniya. 2017. Issue 5 (29). P. 74-79.

11. Kryuchkovskiy V. V., Usov A. V. Teoreticheskiy analiz interval'noy neopredelennosti poleznosti resheniy // Trudy Odesskogo politekhnicheskogo universiteta. 2009. Issue 1-2. P. 180-186.

12. Bates J. M., Granger C. W. J. The Combination of Forecasts // Journal of the Operational Research Society. 1969. Vol. 20, Issue 4. P. 541-568. doi: 10.1057/jors.1969.103

13. Newbold P., Granger C. W. J. Experience with Forecasting Univariate Time Series and the Combination of Forecasts // Journal of the Royal Statistical Society. Series A (General). 1974. Vol. 137, Issue 2. P. 131. doi: 10.2307/2344546

14. Sineglazov V. M., Chumachenko E. I., Gorbatyuk V. S. Metod resheniya zadachi prognozirovaniya na osnove kompleksirovaniya ocenok // Induktyvne modeliuvannia skladnykh system. 2012. Issue 4. P. 214-223.

15. Vasil'ev A. A. Ob'edinenie prognozov ekonomicheskih pokazateley na osnove bives-ocenki s vesovoy funkciey H'yubera // Aktu-al'nye problemy gumanitarnyh i estestvennyh nauk. 2015. Issue 10-4. P. 44-47.

16. Molev M. D., Zanina I. A., Stuzhenko N. I. Sintez prognoznoy informacii v praktike ocenki ekologo-ekonomicheskogo razvitiya regiona // Inzhenerniy vestnik Dona. 2013. Issue 4. P. 59.

17. Romanenkov Yu. A., Vartanyan V. M., Revenko D. S. Kompleksirovanie prognoznyh ocenok v sisteme monitoringa pokazateley sostoyaniya biznes-processa // Systemy upravlinnia, navihatsiyi ta zviazku. 2014. Issue 2. P. 94-101.

18. Sternin M. Yu., Shepelev G. I. Sravnenie poliinterval'nyh ocenok v metode OIO // Intelligent Support of Decision Making. International book series «Information science & computing». 2009. Issue 10. P. 83-88.

19. Shepelev G. I., Sternin M. Yu. Ob adekvatnosti tochechnyh kriteriev zadacham ocenki i sravneniya interval'nyh al'ternativ // Iskusstvenniy intellekt i prinyatie resheniy. 2014. Issue 2. P. 78-88.

20. Levin V. I. Uporyadochenie intervalov i zadachi optimizacii s interval'nymi parametrami // Kibernetika i sistem. analiz. 2004. Issue 3. P. 14-24.

21. Nechetkie gibridnye sistemy: teoriya i praktika / Batyrshin I. Z. et. al.; N. G. Yarushkina (Ed.). Moscow: Fizmatlit, 2007. 207 p.

22. Kamolov E. R. Osnovnye vidy i tipy neopredelennosti informacii, harakternye dlya slozhnyh biotekhnologicheskih sistem // Molodoy ucheniy. 2017. Issue 27. P. 36-39.

23. Shishin V. V. Metody analiza eksperimental'nyh dannyh s razlichnymi vidami neopredelennosti // Sovremennye problemy nauki i obrazovaniya. 2012. Issue 6. P. 11.

24. Revenko D. S., Vartanyan V. M. Razrabotka metoda formirovaniya interval'nyh dannyh na osnove ekspertnoy informacii // Ekonomika ta upravlinnia pidpryiemstvamy mashynobudivnoi haluzi. 2009. Issue 1. P. 24-30.

25. Bardachev Yu. N., Kryuchkovskiy V. V., Malomuzh T. V. Metodologicheskaya predpochtitel'nost' interval'nyh ekspertnyh ocenok pri prinyatii resheniy v usloviyah neopredelennosti // Visnyk Kharkivskoho natsionalnoho universytetu imeni V. N. Karazina. Seriya: Matematychne modeliuvannia. Informatsiyni tekhnolohiyi. Avtomatyzovani systemy upravlinnia. 2010. Issue 890. P. 18-28.

26. Podruzhko A. A., Podruzhko A. S., Kiricev P. N. Interval'nye metody resheniya zadach kalibrovki i klassifikacii // Trudy Instituta sistemnogo analiza Rossiyskoy Akademii nauk. Dinamika neodnorodnyh sistem. 2009. Vol. 44. P. 173-186.

27. Voshchinin A. P. Interval'niy analiz dannyh: razvitie i perspektivy // Zavodskaya Laboratoriya. 2002. Vol. 68, Issue 1. P. 118-126.

28. Kochkarov R. A. Interval'nye zadachi na predfraktal'nyh grafah // Novye informacionnye tekhnologii v avtomatizirovannyh sistemah. 2015. Issue 18. P. 255-264.

29. Orlov A. I. Osnovnye idei statistiki interval'nyh dannyh // Nauchnyy zhurnal KubGAU. 2013. Issue 94. P. 55-70.

30. Levin V. I. Sravnenie intervalov i optimizaciya v usloviyah neopredelennosti // Vestnik Tambovskogo universiteta. Seriya: Estestvennye i tekhnicheskie nauki. 2002. Issue 3. P. 383-389.

31. Vartanyan V. M., Shah L. G., Romanenkov Yu. A. Postroenie i analiz interval'nyh nestatisticheskih modeley // Tekhnologicheskie sistemy. 2003. Issue 3 (19). P. 19-24.

32. Parametricheskiy sintez modeli eksponencial'nogo sglazhivaniya dlya statisticheskih ryadov interval'nyh dannyh / Vartanyan V. M., Romanenkov Yu. A., Kashcheeva V. Yu., Revenko D. S. // Otkrytye informacionnye i komp'yuternye integrirovannye tekhnologii. 2009. Issue 44. P. 232-240.

33. A Hybrid Short-Term Forecasting Model of Passenger Flow on High-Speed Rail considering the Impact of Train Service Frequency / Lai Q., Liu J., Luo Y., Ma M. // Mathematical Problems in Engineering. 2017. Vol. 2017. P. 1-9. doi: 10.1155/2017/1828102
