
wEscore: quality assessment method of multichannel image visualization

with regard to angular resolution

D.S. Sidorchuk1

1 Institute for Information Transmission Problems of Russian Academy of Sciences (Kharkevich Institute), 127051 Moscow, Bolshoy Karetny pereulok 19, Russia

Abstract

This work considers the problem of quality assessment of multichannel image visualization methods. One approach to such an assessment, the Escore quality measure, is studied. This measure, initially proposed for decolorization methods evaluation, can be generalized for the assessment of hyperspectral image visualization methods. It is shown that Escore does not account for the loss of local contrast at the supra-pixel scale. The sensitivity to the latter in humans depends on the observation conditions, so we propose a modified wEscore measure which includes the parameters allowing for the adjustment of the local contrast scale based on the angular resolution of the images. We also describe the adjustment of wEscore parameters for the evaluation of known decolorization algorithms applied to the images from the COLOR250 and the Cadik datasets with given observational conditions. When ranking the results of these algorithms and comparing it to the ranking based on human perception, wEscore turned out to be more accurate than Escore.

Keywords: hyperspectral image visualization, decolorization, Escore, local contrast.

Citation: Sidorchuk DS. wEscore: quality assessment method of multichannel image visualization with regard to angular resolution. Computer Optics 2022; 46(1): 113-120. DOI: 10.18287/2412-6179-CO-911.

Acknowledgements: This work was supported by Russian Science Foundation (Project No. 20-61-47089).

Introduction

We consider the multichannel image visualization (MIV) problem: an input multichannel image should be converted into a single- or three-channel image that preserves the maximum amount of information while remaining perceivable by the human eye. The MIV problem statement allows any number of input channels, which makes it a generalization of such problems as hyperspectral image visualization, where the original channels number in the hundreds; multispectral image visualization, where they number in the tens; and decolorization, where three RGB channels are the input and a single-channel grayscale image is the output.

MIV is relevant, for example, in remote sensing (RS) image processing: along with the development of automatic methods, the visual evaluation of multichannel images by humans remains necessary [1]. Special software exists for RS image analysis, which includes modules for MIV [2, 3]. MIV is also important in mass spectrometry imaging: researchers need a tool that provides a unified visual representation of the molecular and structural organization of the tissues under study based on large sets of images of molecular distributions [4, 5]. Another important field of MIV application is medicine: for example, in [6], an MIV method was proposed that helps a specialist differentiate cancer lesions on human skin.

The early MIV methods were based on an independent pixel-by-pixel transformation from the original multidimensional space to a resulting space of lower dimensionality [7]. Linear dimensionality reduction methods such as PCA [7, 8] were employed to find such transformations. The preservation of local contrast in such methods is inherently limited [9]: any contrasting detail can merge completely with the background in the resulting image. To solve this problem, another class of methods has been introduced, which preserves local contrasts by constructing the resulting image as a whole from a gradient field [10 - 13].

Modern MIV methods are based on nonlinear dimensionality reduction techniques that define the transformations for all pixels in a single functional optimization process. The latter makes it possible to take into account, among other things, the spatial location of the pixels. Examples of such techniques include t-SNE [14], UMAP [5], manifold alignment [15], and others [16, 17].

When developing image processing methods, including MIV, formal criteria are required to evaluate the quality of their output. To evaluate MIV methods, psychophysiological experiments are carried out [18 - 20]. This approach, however, is labor-intensive and is not applicable when developing a new method: the development process involves many iterations of comparing its different versions. There is no single standard for the automatic quality assessment of MIV methods [21]. The common approach is based on calculating various statistics over the resulting images, such as entropy [21 - 23], standard deviation, the average absolute value of the gradient [23], etc. [22]. Works that employ this approach assume that the greater the value of the computed statistic, the more information is preserved in the resulting image. However, the maximum values of such statistics can correspond to a noisy image [24]. Another approach, relevant for hyperspectral and multispectral visualization, is to compare the result of the visualization to a reference image (full-reference image quality assessment) consisting of the RGB channels of the same scene obtained independently [21, 25]. A significant disadvantage of this method is the penalty for preserving in the resulting image the boundaries of objects which are indistinguishable in the RGB range but distinguishable in the hyperspectral range. In fact, the preservation of such objects is a useful feature of visualization, so it should positively affect the evaluation of an algorithm.

Now let us consider separately the problem of evaluating decolorization methods. Decolorization, besides being relevant in its own right, is also useful for studying other MIV evaluation problems: unlike in the general MIV case, both the input and output images of decolorization algorithms can be directly perceived by the human visual system, which makes such images much easier to work with.

Turning to the techniques proposed specifically for assessment of decolorization methods, first let us consider Color Contrast Preserving Ratio (CCPR) [19]. CCPR approximates human vision preferences by the cardinality ratio of two sets: the set of pixel pairs contrasting in the original color image, and its subset which includes pixels contrasting in the resulting image. Thus, the recall of contrast preservation between pixels is evaluated.

However, the parasitic contrast which can appear as a result of visualization is not estimated. Hence, the maximum CCPR value can correspond to a noisy image, just as the maximum value of the statistical MIV measures (entropy, etc.) discussed previously. Later, taking this disadvantage into account, the Escore was proposed [18]. It was introduced as the harmonic mean of CCPR and the new CCFR (Color Content Fidelity Ratio) measure. The latter evaluates the precision of contrast preservation between the pixels. The Escore values better match human visual perception, and presently this measure is actively employed [26 - 28].

The Escore measure can be generalized to the case of more input and output channels, so it should be considered not only for the decolorization algorithms assessment but also for the evaluation of any MIV method in the future. Our previous study of the Escore showed that the insufficient evaluation of the local contrast preservation in Escore results in estimations that contradict human perception for some visualizations [9]. Hence, in [9], we proposed a modified quality measure, which includes the assessment of local contrast preservation based on the calculation of the difference between the neighboring pixels.

In this work, we continue the study begun in [9]. We will show that while the idea of paying more attention to local contrast was correct, the way it is evaluated when assessing the quality of MIV methods must be corrected. This is because human vision perceives local contrasts at different scales depending on the angular resolution of the observed image. For example, a person observing an image displayed at a distance of 60 cm on a modern monitor at a one-to-one scale is able to perceive the contrast not only between neighboring pixels but also between pixels spatially separated from each other. Under such conditions, the estimation of MIV algorithms based on the difference between neighboring pixels only can contradict the perceived image quality. We propose a new parameterized modification of the Escore that allows for the adjustment of the local contrast scale according to the angular resolution of the images in the considered problem. The work describes the adjustment of the parameters for images with an angular resolution of 115 pixels per degree. Based on the data of the psychophysiological experiment [29], a comparison of the new adjusted measure with the previous modifications of Escore was performed. We show that the ranking based on the new modification corresponds to the ranking based on human perception more accurately than the other measures.

1. Escore state-of-the-art

The Escore [18] decolorization quality measure is based on two values: the Color Contrast Preserving Ratio (CCPR) and the Color Content Fidelity Ratio (CCFR). CCPR is the ratio of the number of truly contrasting pixel pairs (i.e., pairs contrasting both in the resulting and in the input image) to the number of all pairs contrasting in the input image. CCFR is the ratio of the number of truly contrasting pairs to the number of all pairs contrasting in the resulting image. To write this formally, let us denote the set of contrasting pixel pairs in the original image I as Γ = {(p, q): Δc(p, q) > k}, where k is an adjustable contrast threshold and Δc(p, q) is the color contrast value calculated in the CIE Lab color space [30]:

Δc(p, q) = √((L_p − L_q)² + (a_p − a_q)² + (b_p − b_q)²). (1)

Let us denote the set of contrasting pixel pairs in the resulting image G as Θ = {(p, q): Δg(p, q) > k}, where Δg(p, q) = |G_p − G_q| is the grayscale contrast calculated as the difference between the values at pixel p and at pixel q of the grayscale image G. Then, CCPR and CCFR are expressed as follows:

CCPR(I, G) = #(Θ ∩ Γ) / #Γ, (2)

CCFR(I, G) = #(Θ ∩ Γ) / #Θ, (3)

where the symbol # denotes the cardinality of a set. The Escore measure is defined as the harmonic mean of CCPR and CCFR:

Escore(I, G) = 2 · CCPR(I, G) · CCFR(I, G) / (CCPR(I, G) + CCFR(I, G)). (4)

The Escore measure introduced in these terms can be easily modified for the quality assessment of hyperspectral image visualization. For this, when introducing the set of originally contrasting pixel pairs Γ, the hyperspectral contrast Δm(p, q), e.g., the Euclidean distance between spectra, should be used instead of the color contrast Δc(p, q). This issue deserves a separate study and is not the subject of the current research.

In the original work [18], the selection of the contrasting pairs (p, q) forming the sets Γ and Θ is not discussed. In the code provided by the authors, the calculations are performed as follows: for CCFR, pairs are formed from neighboring pixels, while for CCPR, pairs are randomly generated from arbitrarily arranged pixels.
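The pair-selection scheme above can be sketched as follows. This is our reading of the authors' implementation, not their code: function and parameter names are our choices, and the random-pair count is an illustrative default.

```python
import numpy as np

def escore(lab, gray, k=5.0, n_random=10_000, seed=0):
    """Sketch of the Escore computation [18]: CCPR over randomly drawn
    pixel pairs, CCFR over neighboring pixel pairs, Escore as their
    harmonic mean (eq. 4). `lab` is an HxWx3 CIE Lab image, `gray`
    an HxW grayscale result."""
    lab = np.asarray(lab, dtype=float)
    gray = np.asarray(gray, dtype=float)
    rng = np.random.default_rng(seed)
    h, w = gray.shape
    lab_f, g_f = lab.reshape(-1, 3), gray.ravel()

    # CCPR: pairs of arbitrarily located pixels (eq. 2)
    p, q = rng.integers(0, h * w, size=(2, n_random))
    dc = np.linalg.norm(lab_f[p] - lab_f[q], axis=1)   # color contrast, eq. 1
    dg = np.abs(g_f[p] - g_f[q])                       # grayscale contrast
    gamma = dc > k                                     # contrasting in the input
    ccpr = np.sum(gamma & (dg > k)) / max(np.sum(gamma), 1)

    # CCFR: neighboring pixel pairs, right and bottom neighbors (eq. 3)
    dc_n = np.concatenate([
        np.linalg.norm(np.diff(lab, axis=1), axis=-1).ravel(),
        np.linalg.norm(np.diff(lab, axis=0), axis=-1).ravel()])
    dg_n = np.concatenate([
        np.abs(np.diff(gray, axis=1)).ravel(),
        np.abs(np.diff(gray, axis=0)).ravel()])
    theta = dg_n > k                                   # contrasting in the result
    ccfr = np.sum(theta & (dc_n > k)) / max(np.sum(theta), 1)

    s = ccpr + ccfr
    return 2 * ccpr * ccfr / s if s else 0.0           # eq. 4
```

On a decolorization that preserves every contrast exactly (e.g., the grayscale image equal to the L channel of an image with zero a and b), both ratios equal 1 and the sketch returns an Escore of 1.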

However, the accuracy of the grayscale visualization method in terms of preserving the contrast of distant pixels is of less importance than the local contrast preservation. This is inferred from the fact that the ability of the human visual system to differentiate between various levels of an achromatic stimulus is reduced when the boundary between areas of different levels is blurred [31]. On the other hand, the accuracy of the visualization methods in terms of the preservation of any spectrally close pixels (consistency) in the original image is not compatible with the property of local contrast preservation [9]. This means that for any consistent method, even sharp boundaries can be lost in the resulting visualization. Thus, any visualization method which is accurate in terms of consistency is potentially inaccurate in terms of the local contrast preservation.

Hence, the CCPR measure of contrast preservation calculated over pairs of arbitrarily located pixels is irrelevant to the perceived visualization quality. To address this, a modified quality measure, hereinafter referred to as dEscore, was proposed in [9]. It differs from Escore in that the rule of pairing neighboring pixels is applied not only to CCFR but also to CCPR. According to this rule, the first pixel p with coordinates (x, y) of each pair is selected among all pixels sequentially, and the second pixel of the pair is the neighboring pixel either to the right, q = (x + 1, y), or below, q = (x, y + 1). Thus, each pixel of the image, except for the boundary pixels, generates two pairs (p, q). It has been shown that the proposed modification improves the estimation of local contrast preservation for some visualizations.
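The dEscore pairing rule described above can be written compactly. This is a minimal sketch; the function name and the flat-index representation of pixels are our choices.

```python
import numpy as np

def neighbor_pairs(h, w):
    """dEscore pairing rule: every pixel (x, y), except those on the
    right and bottom image borders, is paired with its right neighbor
    (x + 1, y) and its bottom neighbor (x, y + 1). Pixels are returned
    as pairs of flat indices into an h-by-w image."""
    idx = np.arange(h * w).reshape(h, w)
    right = np.stack([idx[:, :-1].ravel(), idx[:, 1:].ravel()], axis=1)
    bottom = np.stack([idx[:-1, :].ravel(), idx[1:, :].ravel()], axis=1)
    return np.concatenate([right, bottom])
```

For an h-by-w image this yields h·(w−1) + (h−1)·w pairs, matching the "two pairs per non-boundary pixel" count above.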

Continuing the study begun in [9], we found that the evaluation based on pairs of neighboring pixels does not always agree with the perceived quality of visualization. For example, consider the fragments of a color image visualization in fig. 1. In fig. 1b, only the contrast between neighboring pixels is preserved, while in fig. 1c the contrast at a larger scale is preserved.

The dEscore values for the images in fig. 1b and 1c are 0.7902 and 0.7817, respectively, i.e., dEscore ranks the visualization in fig. 1b above the one in fig. 1c. This result contradicts human perception and indicates a significant drawback of dEscore.

2. wEscore: window Escore modification

In the previous section, both extremes of the visualization estimation approach were criticized: the estimation based on the difference between the arbitrarily placed pixels and the estimation based on the difference between the neighboring pixels. Note that in the implementation provided in [18], the Escore combines both approaches. This may lead to some smoothing of the disadvantages inherent in both extremes. However, we assume that this is not enough and suggest modifying Escore so that both constituent parts, CCPR and CCFR, are focused on measuring local contrast at different scales.

Fig. 1. Examples of the preservation and loss of local contrast: (a) a fragment of the original color image; (b) a decolorized fragment with preservation of contrast between neighboring pixels; (c) a decolorized fragment with preservation of local contrast at the supra-pixel scale

To do this, we introduce the modified visualization quality measures as follows:

wCCPR_wP(I, G) = #(Θ_wP ∩ Γ_wP) / #Γ_wP, (5)

wCCFR_wF(I, G) = #(Θ_wF ∩ Γ_wF) / #Θ_wF, (6)

Θ_w = {(p, q): dist(p, q) ≤ w, Δg(p, q) > k}, (7)

Γ_w = {(p, q): dist(p, q) ≤ w, Δc(p, q) > k}, (8)

wEscore_wP,wF(I, G) = 2 · wCCPR_wP(I, G) · wCCFR_wF(I, G) / (wCCPR_wP(I, G) + wCCFR_wF(I, G)), (9)

where wP and wF are the window sizes within which CCPR and CCFR are calculated, respectively, and dist(p, q) is the Euclidean distance between two points on the plane. The new modified measure wEscore differs from Escore only by introducing the wP and wF constraints on the generation of the (p, q) pairs. The advantage of the proposed wEscore measure is that, by adjusting the window sizes, wEscore can estimate the visualization error exactly at the scale perceived by the human visual system at a given angular resolution of the observed images.
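A direct sketch of wEscore under this definition follows. It is our illustrative implementation of eqs. (5)-(9), not the authors' code: the pair enumeration is brute force over integer offsets, and all names are our choices. With a window of 1, the pair set reduces to the dEscore neighbor pairs.

```python
import numpy as np

def window_pairs(h, w, win):
    """All unordered pixel pairs (p, q) with dist(p, q) <= win,
    enumerated once per pair via integer offsets (brute force)."""
    idx = np.arange(h * w).reshape(h, w)
    p_list, q_list = [], []
    r = int(np.ceil(win))
    for dy in range(0, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx <= 0:
                continue          # count each pair once, skip self-pairs
            if dy >= h or abs(dx) >= w or np.hypot(dy, dx) > win:
                continue          # offset outside the image or the window
            a = idx[:h - dy, max(0, -dx):w - max(0, dx)]   # pixels p
            b = idx[dy:, max(0, dx):w + min(0, dx)]        # q = p + (dy, dx)
            p_list.append(a.ravel())
            q_list.append(b.ravel())
    return np.concatenate(p_list), np.concatenate(q_list)

def wescore(lab, gray, w_p=61, w_f=7, k=5.0):
    """wEscore sketch after eqs. (5)-(9): CCPR and CCFR restricted to
    pixel pairs within the windows w_p and w_f, respectively."""
    lab = np.asarray(lab, dtype=float)
    g = np.asarray(gray, dtype=float).ravel()
    h, w = np.asarray(gray).shape
    lab_f = lab.reshape(-1, 3)

    def contrast_sets(win):
        p, q = window_pairs(h, w, win)
        gamma = np.linalg.norm(lab_f[p] - lab_f[q], axis=1) > k  # Gamma_w
        theta = np.abs(g[p] - g[q]) > k                          # Theta_w
        return gamma, theta

    gam, th = contrast_sets(w_p)
    wccpr = np.sum(gam & th) / max(np.sum(gam), 1)               # eq. 5
    gam, th = contrast_sets(w_f)
    wccfr = np.sum(gam & th) / max(np.sum(th), 1)                # eq. 6
    s = wccpr + wccfr
    return 2 * wccpr * wccfr / s if s else 0.0                   # eq. 9
```

The brute-force enumeration is quadratic in the window radius but, for the window sizes used in this work, remains practical for the dataset's image sizes.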

3. Window size adjustment

To adjust the window sizes, let us define an adjustment dataset and an image markup. The adjustment dataset is a set of color images, each accompanied by a set of grayscale visualizations (decolorizations). The markup is performed as follows: for each visualization, the perceived contrast compared to the original image is labeled as either preserved or lost. Hence, our main criterion for the selection of color images and their visualizations is the possibility of an unambiguous visual judgment of the preservation or loss of local contrasts for each visualization of each color image. Since the contrast sensitivity of the human visual system depends on the angular resolution, the image observation conditions must be defined according to the task at hand and fixed at the adjustment stage.

For a given window size, wEscore can be calculated for each visualization and its original image. This value depends, among other things, on the contrast threshold k. This dependence is not considered in this study: all wEscore values used below are calculated for k = 5. This value is commonly suggested as the threshold of imperceptible contrast [18].

We propose an adjustment procedure that selects the CCPR and CCFR window sizes so that the wEscore rankings closely match the reference ranking corresponding to the markup. We will use the Kendall rank correlation [32] to estimate the match. Kendall's correlation is a rank correlation: its value is calculated from ranks rather than from the numerical values themselves. The Kendall correlation coefficient τ takes values in the interval [−1, 1]: τ = 1 corresponds to a monotonically increasing relationship between the rankings, and τ = −1 to a decreasing one.
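For reference, a minimal Kendall correlation can be computed directly from the sign agreement of all pairs. The sketch below is the tie-free tau-a variant; since the markup here is binary and therefore heavily tied, a tie-corrected variant (tau-b) would typically be used in practice.

```python
from itertools import combinations

def sign(x):
    return (x > 0) - (x < 0)

def kendall_tau(a, b):
    """Kendall tau-a: (concordant - discordant pairs) / (all pairs).
    Returns 1 for identical orderings and -1 for reversed ones."""
    n = len(a)
    s = sum(sign(a[i] - a[j]) * sign(b[i] - b[j])
            for i, j in combinations(range(n), 2))
    return 2 * s / (n * (n - 1))
```

For example, `kendall_tau([1, 2, 3], [1, 3, 2])` gives 1/3: of the three pairs, two are concordant and one is discordant.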

Let us denote the set of wEscore values as follows:

wEscore^{i,j}_{wP,wF} = wEscore_{wP,wF}(I^i, G^i_j), (10)

where i = 1, ..., N is the index of a color image in the adjustment dataset, N is the number of color images in the adjustment dataset, j = 1, ..., N_i is the grayscale visualization index, and N_i is the number of annotated visualizations of the color image I^i. Let us denote the benchmark ranks as follows:

Markup^i = Markup(I^i, {G^i_j}) ∈ {0, 1}^{N_i}. (11)

Now the window size adjustment can be formulated as the problem of maximizing the average Kendall coefficient over the window sizes:

(1/N) Σ_{i=1,...,N} τ(wEscore^i_{wP,wF}, Markup^i) → max_{wP,wF}. (12)

4. Window size adjustment implementation

To test the proposed window size adjustment procedure, we collected an adjustment dataset consisting of eleven color images from the COLOR250 (250 color images) [18] and Cadik (24 color images) [29] datasets. These datasets were chosen because they are employed in many decolorization works and because each color image in them comes with a set of pre-calculated grayscale images. Examples of color images with the corresponding sets of annotated visualizations are shown in fig. 2 - 7. The markup is indicated by red and green circles on the grayscale images: red circles correspond to visualizations with perceived lost contrasts, and green circles to visualizations with perceived preserved contrasts.

The image observation conditions during the markup process were as follows: the distance between the monitor and the observer was 60 cm; the display resolution was 2560 by 1440 pixels with a 27-inch diagonal. These values correspond to an angular resolution of 115 pixels per degree. The brightness of the monitor was set to 100 cd/m². Note that the conditions under which a reader views the images in this work may differ from those described above, which may lead to a discrepancy between the reader's perception and the markup.
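One common way to relate such viewing conditions to an angular resolution is via the physical pixel pitch and the small-angle geometry, as sketched below. This is a generic textbook computation with names of our choosing; conventions for "pixels per degree" vary, so this sketch need not reproduce the exact figure reported above.

```python
import math

def pixels_per_degree(view_dist_mm, diag_inch, res_w, res_h):
    """Pixels subtended by one degree of visual angle for a display
    observed at a given distance, computed via the pixel pitch."""
    diag_px = math.hypot(res_w, res_h)              # diagonal in pixels
    pitch_mm = diag_inch * 25.4 / diag_px           # physical pixel size
    degree_mm = 2 * view_dist_mm * math.tan(math.radians(0.5))
    return degree_mm / pitch_mm
```

The value grows linearly with the viewing distance: moving twice as far from the same display doubles the number of pixels packed into one degree of visual angle.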

Fig. 2. Fragment of the image "217.png" from the COLOR250 set; possible contrast loss near the lake above the letter "P": (a) color image; (b) visualization with perceived contrast loss; obtained by Lu2014 algorithm [18]; (c) visualization with perceived contrast preservation; obtained by Sokolov algorithm [10]

Fig. 3. Image "8.png" from the Cadik set; possible contrast loss around green ring segments: (a) color image; (b) visualization with perceived contrast loss; obtained by Lu2012 algorithm [19]; (c) visualization with perceived contrast preservation; obtained by Sokolov algorithm [10]

In the previous section, we formulated the window size adjustment as the optimization problem (12). To solve it, a brute-force search over wP and wF was performed in the areas where we expect an optimum for the selected images and the given angular resolution. Under these conditions, the loss of contrast between neighboring pixels (fig. 2b, fig. 3b) and between pixels located at a medium distance relative to the image size (fig. 6c) is distinguishable, while the loss of contrast between pixels located at the opposite ends of the image is indistinguishable. The average width of the images in the adjustment set is 300 pixels. Let us consider two areas: the area W1 of medium wP values and small wF values, W1 = {(wP, wF): wP = 1:10:101, wF = 1:3:7}, and the area W2 of small wP values and medium wF values, W2 = {(wP, wF): wP = 1:3:7, wF = 1:10:101}.

As a result of the brute-force search in W1 and W2, the maximum was found at wP = 61, wF = 7. At this point, the value of the average Kendall coefficient given by (12) was 0.7668. These window sizes are the result of the wEscore adjustment.
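The brute-force search over the two grids can be sketched as a plain grid search maximizing the average Kendall coefficient of eq. (12). The `score_fn(i, w_p, w_f)` interface below is an assumption we introduce for the sketch: it stands for computing the wEscore values of image i's visualizations at the given window sizes.

```python
import numpy as np
from itertools import combinations

def _tau(a, b):
    # minimal Kendall tau-a (no tie correction)
    sign = lambda x: (x > 0) - (x < 0)
    n = len(a)
    s = sum(sign(a[i] - a[j]) * sign(b[i] - b[j])
            for i, j in combinations(range(n), 2))
    return 2 * s / (n * (n - 1))

def adjust_windows(score_fn, markups, grid_p, grid_f):
    """Window-size adjustment sketch (eq. 12): pick the (w_p, w_f)
    pair maximizing the Kendall correlation between the per-image
    scores and the reference markup, averaged over all images."""
    best_tau, best_w = -np.inf, None
    for w_p in grid_p:
        for w_f in grid_f:
            avg = float(np.mean([_tau(score_fn(i, w_p, w_f), m)
                                 for i, m in enumerate(markups)]))
            if avg > best_tau:
                best_tau, best_w = avg, (w_p, w_f)
    return best_w, best_tau
```

With the two grids W1 and W2 defined above, the search evaluates a few dozen (wP, wF) combinations, each requiring one wEscore pass per annotated visualization.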

Fig. 8 shows the dependence of the average Kendall coefficient on wP at wF = 1 and wF = 7. The range of wP values is chosen to cover the values corresponding to dEscore (wP = 1, wF = 1) and Escore (wP = 600, wF = 1). The value wP = 600 was chosen because, as described above, in Escore the CCPR pairs are generated from pixels located randomly throughout the image; a window radius of wP = 600 guarantees that, for any pixel, the CCPR window completely covers any image of the adjustment dataset, the largest of which is 321 by 481 pixels.


Fig. 4. Image "123.png" from the COLOR250 set; possible contrast loss on the sleeves and helmet stripes: (a) color image; (b) visualization with perceived contrast loss; obtained by Lu2012 algorithm [19]; (c) visualization with perceived contrast preservation; obtained by Grundland algorithm [33]; (d) visualization with perceived contrast loss; obtained by Sokolov algorithm [10]

Fig. 5. Image "27.png" from the COLOR250 set; possible contrast loss between radial bands of different colors: (a) color image; (b) visualization with perceived contrast loss; obtained by Lu2014 algorithm [18]; (c) decolorization with perceived contrast loss; obtained by CIE Y channel selection; (d) visualization with perceived contrast preservation; obtained by Sokolov algorithm [10]

Fig. 6. Fragment of the image "196.png" from the COLOR250 set; possible contrast loss between flower petals and grass in the background: (a) color image; (b) visualization with perceived contrast loss; obtained by Grundland algorithm [33]; (c) visualization with perceived contrast loss; obtained by Sokolov algorithm [10]; (d) visualization with perceived contrast preservation; obtained by Lu2012 algorithm [19]

Fig. 7. Fragment of the image "227.png" from the COLOR250 set; possible contrast loss between the sea and land areas: (a) color image; (b) visualization with perceived contrast preservation; obtained by Gooch algorithm [34]; (c) visualization with perceived contrast preservation; obtained by CIE Y channel selection; (d) visualization with perceived contrast loss; obtained by Smith algorithm [35]; (e) visualization with perceived contrast preservation; obtained by Sokolov algorithm [10]; (f) visualization with perceived contrast preservation; obtained by Grundland algorithm [33]

5. Comparison of quality measures for visualization algorithms

In the previous sections, we proposed a new wEscore measure and described the adjustment procedure. This measure has two predecessors, Escore and dEscore, so the performance of these measures should be compared.

The quality measures will be compared using the data of the psychophysiological experiment published in [29]. The results of this experiment are a recognized standard and have already been used to validate the Escore [18].


In this experiment, the images were shown to observers who sat approximately 70 cm away from the monitor at its native resolution of 1280 by 1024 pixels. Thus, the angular resolution of the observed images was 106 pixels per degree, which is close to the angular resolution of the images used for the wEscore adjustment.

In this experiment, 7 decolorization algorithms were evaluated: 6 algorithms by various authors ([33 - 38]) and the baseline "convert the input image to the CIE XYZ color coordinates and use the luminance channel CIE Y as the resulting image". Each algorithm was applied to a set of 24 color images, so a set of 7 grayscale images corresponds to each color image.

Among these 24 color images, one was used in the adjustment: the 8.png image (fig. 3a). However, the sets of visualizations used for the adjustment and in the psychophysiological experiment [29] do not intersect. Thus, the color images and their corresponding sets of grayscale visualizations from [29] can be used to correctly compare the adjusted wEscore with the other quality measures.

Fig. 8. Average Kendall coefficient values corresponding to different combinations of the CCPR and CCFR window sizes. The combination wP = 1, wF = 1 corresponds to dEscore; wP = 61, wF = 7 corresponds to the adjusted wEscore; wP = 600, wF = 1 corresponds to Escore. Contrast threshold: k = 5

The experiment was designed based on the 2AFC (two-alternative forced choice) approach: for each color image, the grayscale images were compared pairwise. The number of comparison pairs generated in this way is too large to be shown to a single observer without fatigue compromising the decisions. Therefore, each observer compared decolorizations for only 8 randomly selected color images out of 24.

The experiment included two parts: an accuracy experiment (choosing between two grayscale alternatives with the original color image shown) and a preference experiment (without the original color image). Of the 119 observers, 60 took part in the accuracy experiment and 59 in the preference experiment. We use only the results of the accuracy experiment, since for the MIV problem the preservation and distinguishability of the maximum amount of detail present in the original image is more important than a pleasant viewing experience of the grayscale images without a reference color image to compare against.

In [29], the results of pairwise comparisons gathered from all observers were converted into averaged reference ranks of decolorization algorithms for each color image using the law of comparative judgments [39]. Following [18], we compare the reference ranks with the automatic ranks generated from the quality measures under study via Kendall's correlation coefficient.

The Kendall coefficient averaged over all 24 images was 0.4167 for Escore, 0.3770 for dEscore, and 0.4881 for wEscore. Thus, the highest correlation between the measure and the reference ranks is achieved for the evaluation based on the adjusted wEscore measure.

Also, following [18], we conducted an analysis based on the observers' ranking agreement when comparing decolorizations for each of the color images in the set (u-score) [39]. A high u-score corresponds to a color image whose decolorizations are rated uniformly by all observers; a low u-score corresponds to a color image whose decolorizations are rated differently by different observers.

From the color images of the original set, subsets of N ∈ {4, ..., 24} images with the maximum possible u-score were formed. The graphs in fig. 9 show the average Kendall coefficients for Escore, dEscore, and wEscore for each subset. The wEscore measure outperforms the other measures for subsets of all sizes except 4 and 6, where the results of Escore and wEscore are equal.


Fig. 9. Average Kendall's coefficient values for Escore, dEscore, and wEscore, calculated for subsets of images with maximum u-score. Contrast threshold: k = 5

Conclusion

In this work, we propose wEscore, a new quality measure for the evaluation of visualization algorithms. The original Escore and dEscore, its previously proposed modification, do not consider the dependence of the sensitivity of the human visual system to local contrasts on the angular resolution of the observed image. The wEscore measure, unlike the previous versions, includes parameters for the sizes of the windows in which pairs of pixels are generated for contrast estimation. We proposed a procedure for window size adjustment: it can be used to obtain the optimal values of the wEscore parameters for a given angular resolution. Using this procedure, we adjusted the wEscore parameters for the quality evaluation of decolorization algorithms at an angular resolution of 115 pixels per degree. The contrast threshold for wEscore was set to 5. The optimal parameters were wP = 61, wF = 7. We compared the adjusted wEscore with Escore and dEscore using the psychophysiological experiment data from [29]. The wEscore rankings of the decolorization algorithms were the closest to the benchmark rankings based on the observers' perception.

References

[1] Zhizhin MN, Elwidge K, Poyda AA, Godunov AI, Velikhov VE, Erokhin GN, Alsynbaev KS, Bryksin VM. Using remote sensing data to monitor hydrocarbon production [In Russian]. Informatsionnye Tekhnologii i Vychislitel'nye Sistemy 2014; 3: 97-111.

[2] ENVI - Image processing and analysis solution. Source: (https://www.ittvis.com/envi/).

[3] ERDAS Imagine. Source: (https://www.hexagongeospatial.com/products/power-portfolio/erdas-imagine).

[4] Sarycheva A, Grigoryev A, Sidorchuk D, Vladimirov G, Khaitovich P, Efimova O, Gavrilenko O, Stekolshchikova E, Nikolaev E, Kostyukevich Y. Structure-preserving and perceptually consistent approach for visualization of mass spectrometry imaging datasets. Anal Chem 2020; 93(3): 1677-1685. DOI: 10.1021/acs.analchem.0c04256.

[5] Smets T, Verbeeck N, Claesen M, Asperger A, Griffioen G, Tousseyn T, Waelput W, Waelkens E, De Moor B. Evaluation of distance metrics and spatial autocorrelation in uniform manifold approximation and projection applied to mass spectrometry imaging data. Anal Chem 2019; 91(9): 5706-5714. DOI: 10.1021/acs.analchem.8b05827.

[6] Bratchenko IA, Alonova MV, Myakinin OO, Moryatov AA, Kozlov SV, Zakharov VP. Hyperspectral visualization of skin pathologies in visible region. Computer Optics 2016; 40(2): 240-248. DOI: 10.18287/2412-6179-2016-40-2-240-248.

[7] Ready P, Wintz P. Information extraction, SNR improvement, and data compression in multispectral imagery. IEEE Trans Commun 1973; 21(10): 1123-1131. DOI: 10.1109/TCOM.1973.1091550.

[8] Tyo JS, Konsolakis A, Diersen DI, Olsen RC. Principal-components-based display strategy for spectral imagery. IEEE Trans Geosci Remote Sens 2003; 41(3): 708-718. DOI: 10.1109/TGRS.2003.808879.

[9] Sidorchuk DS, Volkov VV, Nikonorov AV. Comparison of the nonlinear contrast-preserving visualization method for multispectral images with well-known decolorization algorithms [In Russian]. Information Processes 2020; 20(1): 41-54.

[10] Sokolov V, Nikolaev D, Karpenko S, Schaefer G. On contrast-preserving visualisation of multispectral datasets. International Symposium on Visual Computing 2010: 173-180.

[11] Socolinsky DA, Wolff LB. A new visualization paradigm for multispectral imagery and data fusion. Proc IEEE Computer Society Conference on Computer Vision and Pattern Recognition 1999: 319-324. DOI: 10.1109/CVPR.1999.786958.

[12] Sidorchuk DS, Konovalenko IA, Gladilin SA, Maksimov YI. Noise estimation for multispectral visualization. Sen-sornye Sistemy 2016; 30(4): 344-350.

[13] Sidorchuk DS, Volkov VV. Fusion of radar, visible and thermal imagery with account for differences in brightness and chromaticity perception. Sensornye Sistemy 2018; 32(1): 14-18. DOI: 10.7868/S0235009218010031.

[14] Zhang B, Yu X. Hyperspectral image visualization using t-distributed stochastic neighbor embedding. Proc SPIE 2015; 9815: 981504. DOI: 10.1117/12.2205840.

[15] Liao D, Qian Y, Zhou J. Visualization of hyperspectral imaging data based on manifold alignment. 22nd Int Conf on Pattern Recognition 2014: 70-75. DOI: 10.1109/ICPR.2014.22.

[16] Myasnikov EV. Nonlinear mapping methods with adjustable computational complexity for hyperspectral image analysis. Proc SPIE 2015; 9875: 987508. DOI: 10.1117/12.2228831.

[17] Myasnikov EV. Fast techniques for nonlinear mapping of hyperspectral data. Proc SPIE 2017; 10341: 103411D. DOI: 10.1117/12.2268707.

[18] Lu C, Xu L, Jia J. Contrast preserving decolorization with perception-based quality metrics. Int J Comput Vis 2014; 110(2): 222-239.

[19] Lu C, Xu L, Jia J. Contrast preserving decolorization. IEEE Int Conf on Computational Photography 2012: 1-7. DOI: 10.1109/ICCPhot.2012.6215215.

[20] Hayes AE, Finlayson GD, Montagna R. RGB-NIR image fusion: metric and psychophysical experiments. Proc SPIE 2015; 9396: 93960U. DOI: 10.1117/12.2079224.

[21] Liao D, Chen S, Qian Y. Visualization of hyperspectral images using moving least squares. 24th Int Conf on Pattern Recognition 2018: 2851-2856. DOI: 10.1109/ICPR.2018.8546018.

[22] Coliban RM, Marincas M, Hatfaludi C, Ivanovici M. Linear and non-linear models for remotely-sensed hyperspectral image visualization. Remote Sens 2020; 12(15): 2479. DOI: 10.3390/rs12152479.

[23] Kang X, Duan P, Li S, Benediktsson JA. Decolorization-based hyperspectral image visualization. IEEE Trans Geosci Remote Sens 2018; 56(8): 4346-4360. DOI: 10.1109/TGRS.2018.2815588.

[24] Gabarda S, Cristobal G. Quality evaluation of blurred and noisy images through local entropy histograms. Proc SPIE 2007; 6592: 659214. DOI: 10.1117/12.721952.

[25] Tang R, Liu H, Wei J, Tang W. Supervised learning with convolutional neural networks for hyperspectral visualization. Remote Sens Lett 2020; 11(4): 363-372. DOI: 10.1080/2150704X.2020.1717014.

[26] Sowmya V, Govind D, Soman KP. Significance of incorporating chrominance information for effective color-to-grayscale image conversion. Signal Image Video Process 2017; 11(1): 129-136. DOI: 10.1007/s11760-016-0911-8.

[27] Zhao H, Zhang H, Jin X. Efficient image decolorization with a multimodal contrast-preserving measure. Comput Graph 2018; 70: 251-260. DOI: 10.1016/j.cag.2017.07.009.

[28] Zhang X, Liu S. Contrast preserving image decolorization combining global features and local semantic features. Vis Comput 2018; 34(6): 1099-1108.

[29] Cadik M. Perceptual evaluation of color-to-grayscale image conversions. Comput Graph Forum 2008; 27(7): 1745-1754. DOI: 10.1111/j.1467-8659.2008.01319.x.

[30] McLaren K. XIII-The development of the CIE 1976 (L* a* b*) uniform colour space and colour-difference formula. J Soc Dye Colour 1976; 92(9): 338-341. DOI: 10.1111/j.1478-4408.1976.tb03301.x.

[31] Van der Horst GJC, Bouman MA. Spatiotemporal chromaticity discrimination. J Opt Soc Am 1969; 59(11): 1482-1488. DOI: 10.1364/JOSA.59.001482.

[32] Kendall MG. Rank correlation methods. London: Charles Griffin and Co Ltd; 1948.

[33] Grundland M, Dodgson NA. Decolorize: Fast, contrast enhancing, color to grayscale conversion. Pattern Recognit 2007; 40(11): 2891-2896. DOI: 10.1016/j.patcog.2006.11.003.

[34] Gooch AA, Olsen SC, Tumblin JE, Gooch BS. Color2gray: salience-preserving color removal. ACM Trans Graph 2005; 24(3): 634-639. DOI: 10.1145/1073204.1073241.

[35] Smith K, Landes PE, Thollot J, Myszkowski K. Apparent greyscale: A simple and fast conversion to perceptually accurate images and video. Comput Graph Forum 2008; 27(2): 193-200. DOI: 10.1111/j.1467-8659.2008.01116.x.

[36] Bala R, Braun KM. Color-to-grayscale conversion to maintain discriminability. Proc SPIE 2003; 5293: 196-202. DOI: 10.1117/12.532192.

[37] Neumann L, Cadik M, Nemcsics A. An efficient perception-based adaptive color to gray transformation. Proc Third Eurographics Conf on Computational Aesthetics in Graphics, Visualization and Imaging 2007: 73-80.

[38] Rasche K, Geist R, Westall J. Detail preserving reproduction of color images for monochromats and dichromats. IEEE Comput Graph Appl 2005; 25(3): 22-30. DOI: 10.1109/MCG.2005.54.

[39] Thurstone LL. A law of comparative judgment. Psychol Rev 1927; 34(4): 273-286. DOI: 10.1037/h0070288.

Author's information

Dmitry Sergeevich Sidorchuk (b. 1992) graduated from Moscow Institute of Physics and Technology (MIPT) in 2015, majoring in Applied Mathematics and Informatics. Currently works as a researcher at the Vision Systems Laboratory of the Institute for Information Transmission Problems (IITP RAS). Research interests are visual systems and image processing. E-mail: ds-sidorchuk@yandex.ru .

Code of State Categories Scientific and Technical Information (in Russian - GRNTI): 28.17.33. Received April 19, 2021. The final version - July 22, 2021.
