Abstract

A wide color gamut (WCG) display has great color rendering capability and offers the opportunity to achieve a pleasing and realistic appearance in terms of image quality. To take full advantage of the large display gamut, a new gamut extension algorithm (GEA) is proposed based on a new color appearance scale, vividness. The performance of the new GEA was investigated via a psychophysical experiment together with five commonly used GEAs. In addition, two different uniform color spaces (UCSs) were also studied including the CAM02-UCS color space and a space, Jzazbz, designed for high dynamic range (HDR) and WCG applications. The results showed that the newly proposed GEA, i.e. the vividness-extension (VE) algorithm, outperformed all the other GEAs and the Jzazbz space was a promising UCS for evaluating gamut extension.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

As described in the first part of this paper series [1], a large-scale project was conducted to study gamut mapping between sRGB space and a wide display gamut. This paper mainly focuses on gamut extension algorithms (GEAs) dedicated to mobile phone and display applications and forms the second part of the series.

Morovic stated [2] that a simple reverse version of a gamut compression algorithm (GCA) could be adopted to perform gamut extension. However, there is not much evidence that a direct inversion can give the best model performance since the case of extension is different from that of compression: the aim of a GEA is to provide reproductions that are both pleasant and natural, while a GCA is intended to give a satisfactory appearance reproduction against the original. Therefore, there is often a trade-off between fidelity and preference in the application of GEAs.

GEAs can be divided into two main categories: global GEAs and local GEAs. A global GEA is also called color-by-color mapping, where any color in the source gamut is mapped onto a single point in the destination gamut. Most research has concentrated on this category. For example, Sakurai et al. [3] examined the effect of GEAs with various expansion directions from the point of view of image preference. In their study, gamut mapping was performed from sRGB to a display gamut that was 50% larger. Their results showed that all the algorithms studied were effective and that image preference correlated well with lightness contrast. Moreover, a moderate chroma enhancement (by a factor of 1.2-1.4) was found to give the best performance. Laird et al. [4] investigated several GEAs using two gamut pairs, i.e. from a simulated small gamut to sRGB and from sRGB to a relatively large display gamut. Their results showed that the preferred algorithm depended on image content, especially for images that included skin tones. In addition, the algorithm that balanced chroma and lightness modulations as a function of the input lightness consistently outperformed true color mapping (TCM), confirming the benefit of appropriate mapping for wide-gamut displays. The TCM method keeps the tristimulus values of the reproduction unchanged from those of the original image. Ward et al. [5] presented a hybrid color mapping (HCM) algorithm designed to protect a fixed region in the CIE 1976 u'v' diagram while exploiting the larger gamut of the target display. HCM was based on the hypothesis that, outside a certain set of critical colors, observers are less particular and prefer a more saturated appearance if available. This indicates that natural (memory) colors, which often lie in a low chroma region, should be preserved, while more saturated, unnatural colors can be extended to achieve a pleasing color appearance.
Another algorithm based on a similar hypothesis was presented by Song et al. [6], who set a defined region to preserve skin color from distortion. Their results showed that their hybrid color mapping (HCM) algorithm was preferred to the TCM and same drive signal (SDS) methods for images containing natural tones together with saturated colors. In the SDS method, the RGB values of the reproduction are the same as those of the source image. Both the TCM and SDS methods are usually adopted as anchors when comparing different GEAs.

In addition to color information, local GEAs consider spatial information. Li et al. [7] proposed a multilevel gamut extension algorithm consisting of two sequential parts: a global gamut mapping was first applied using a hue-varying extension function, and a local gamut mapping was then performed using the human contrast sensitivity function [8] to avoid the over-contrast problem by retaining local contrast information. Zamir et al. [9] proposed a spatial GEA based on the modification of image contrast in CIELAB color space [10] to circumvent issues with their original GEA implemented in RGB space [11]. An image energy function [12] was adopted to modify image contrast via a contrast coefficient γ: a positive coefficient increases and a negative coefficient decreases the image contrast. By increasing the image contrast, gamut extension could be achieved while keeping the perceived colors as close to the original as possible. In this algorithm, the source image in CIELAB space was treated as a combination of three independent color channels (or images), i.e. the L*, a* and b* channels, each of which could be considered a 2D intensity image. The image energy function was applied to the a* and b* channels individually, while no modification was made to the L* channel. This processing was repeated until a steady state was obtained for both the a* and b* channels, with the image gamut extended in each iteration. The degree of extension of the final output was controlled by the contrast coefficient γ in the image energy function and, in fact, it always exceeded the destination gamut up to a certain threshold level (Te). A reverse version of the image energy function, with a negative γ, was then applied to perform gamut compression. An inherent problem was that, in the sequential processing of the a* and b* channels, a hue shift was inevitable due to the different degrees of extension of the a* and b* channels. In addition, reproductions using this algorithm may appear over-saturated or under-saturated if the threshold level (Te) is not well defined.

Comparing global and local GEAs, the former operate purely in the color domain and provide an easy and fast modification of the color gamut. Local GEAs, however, are usually performed in the spatial domain or are combined with global methods as a supplementary tool for improving image quality. It is generally believed that a spatial gamut compression algorithm (GCA) can provide a better reproduction than a global algorithm [13], because detail that is clearly perceptible in the original image can sometimes be mapped such that it is no longer perceptible, or even lost, leading to image deterioration with some global GCAs. In gamut extension, however, this problem is generally avoidable, since the difference between a color pair from the source gamut is usually enlarged when mapped onto a larger destination gamut. This is quite different from the compression condition.

Based on the above findings, it is clear that most GEAs were developed with the goal of increasing the chroma of a source color in a constant hue plane. This is in accordance with the custom of describing a typical three-dimensional color space such as CIELAB using lightness, hue and chroma. However, 'chroma' was found to be somewhat difficult for color-naïve observers to understand and led to inconsistent results in color assessment experiments [14]. Hence, Berns [15] proposed three new scales for CIELAB color space, i.e. vividness, depth and clarity, to better describe human perceptions. Each of these scales represents a Euclidean distance from a color: to the white for depth, to the black for vividness, and to its background for clarity. Figure 1 shows the definition of vividness in a constant hue plane. Similarly, Cho et al. [16,17] developed four new color appearance scales, i.e. saturation, vividness, blackness, and whiteness, and found via a psychophysical experiment that their saturation scale agreed closely with the Berns depth scale, and that blackness had a strong negative correlation with the Berns vividness scale. They concluded that changes in these variables were more representative of our daily experience of color description. Hence, it is possible that these more easily perceived scales could lead to a better description of color appearance. In Part I of this paper series [1], the vividness scale was found to be a good indicator of image quality and human perception in the field of gamut compression. Therefore, it was again adopted to develop a new GEA.
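To make the scales concrete: in CIELAB, Berns's vividness and depth are plain Euclidean distances to the black and white points, which can be computed directly (clarity is omitted here since it depends on the background color):

```python
import math

def vividness(L, a, b):
    """Berns vividness: Euclidean distance from the color to CIELAB
    black (L* = 0, a* = 0, b* = 0)."""
    return math.sqrt(L ** 2 + a ** 2 + b ** 2)

def depth(L, a, b):
    """Berns depth: Euclidean distance from the color to CIELAB
    white (L* = 100, a* = 0, b* = 0)."""
    return math.sqrt((100 - L) ** 2 + a ** 2 + b ** 2)
```

Note that increasing either lightness or chroma increases vividness, which is why a vividness-based extension moves colors "up and out" simultaneously rather than only outward in chroma.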

Fig. 1 Illustration of the vividness direction. Points along a vector extending away from the black point have different vividness values.

With the above in mind, a new GEA is proposed that uses the vividness scale as an alternative to the chroma scale used by many other algorithms. The performance of this new GEA is compared with that of other algorithms, including both spatial and global methods. In addition, a newly proposed UCS for high dynamic range (HDR) and wide color gamut (WCG) applications, Jzazbz, is evaluated together with the CIE-recommended CAM02-UCS via a psychophysical experiment. The results were analyzed to verify the effectiveness of the vividness scale.

2. Development of the new GEA

In our previous work [1], the vividness scale was verified to be effective for gamut compression. Therefore, the mapping direction of the new algorithm was selected based on this idea. The new vividness-extension GEA (referred to as VE) includes three steps:

Step 1: mapping towards the lightness axis

The first step is to map colors towards the lightness axis. When a line is drawn from the source color to the black point of the destination gamut, it may intersect the lower boundary (the boundary from the gamut black to the cusp, defined as the color of maximum chroma in a given hue plane) more than once. This is caused by the lack of smoothness of the lower boundary. Hence, Step 1 is used to provide a smooth and straight lower boundary.

Figure 2 illustrates a mapping of colors having a smaller lightness value than that of the cusp towards the lightness axis. In detail, colors near the lower boundary of the source gamut are first compressed on to a constructed boundary. Figure 2 shows the newly constructed boundary, plotted as a dash-dotted line, which is obtained by connecting the black point and the cusp in the source gamut. Equation (1) performs this task.

Fig. 2 Mapping towards the lightness axis. P is the source color and P′ is the mapped color after Step 1. Pb and Pb′ are the intersections of line EP with the source and the newly constructed gamut boundaries, respectively. E is the mapping center on the lightness axis that has the same lightness value as point P.

$$\overline{EP'}=\begin{cases}\overline{EP}, & \overline{EP}\le 0.6\times\overline{EP_b'}\\[4pt] 0.6\times\overline{EP_b'}+\dfrac{\overline{EP}-0.6\times\overline{EP_b'}}{\overline{EP_b}-0.6\times\overline{EP_b'}}\times 0.4\times\overline{EP_b'}, & \overline{EP}>0.6\times\overline{EP_b'}\end{cases}\tag{1}$$

Note that Eq. (1) is only used when the lower boundary has a convex shape. It was taken from the SGCK algorithm [18], except that the constant 0.6 was optimized to give the best fit to the present results.

Overall, this step provides a smooth and straight lower boundary, and thus simplifies the mathematical calculation in Step 2.
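The geometry of this step can be sketched as a one-dimensional knee function over distances measured along the line from E through P. This is an illustrative reading of Eq. (1), not the authors' code; the names `d`, `d_src` and `d_new` (for |EP|, |EPb| and |EPb′|) are chosen here for clarity:

```python
def knee_compress(d, d_src, d_new, knee=0.6):
    """Knee-function compression toward the constructed lower boundary.

    d:     distance |EP| from the mapping center E to the color P
    d_src: distance |EPb|  to the source gamut boundary along the line
    d_new: distance |EPb'| to the newly constructed boundary
    Colors within knee*d_new are left unchanged; beyond the knee they are
    compressed linearly so the source boundary maps onto the new boundary.
    """
    t = knee * d_new
    if d <= t:
        return d
    return t + (d - t) / (d_src - t) * (1.0 - knee) * d_new
```

By construction, a color on the source boundary (d = d_src) lands exactly on the constructed boundary (d_new), while colors inside 60% of the constructed boundary are untouched.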

Step 2: vividness extension

Step 2 is the most important step in the algorithm. Three requirements are given below:

  • 1) The source black, i.e. the black of the source gamut, is mapped onto the destination black, i.e. the black of the destination gamut, and the source white is mapped onto the destination white. This ensures no contrast inversion.
  • 2) Most of the colors should be extended to have a larger vividness value so as to result in better image quality. However, this is not applicable to dark colors, or colors that have relatively small vividness values, which otherwise would lead to image contrast reduction.
  • 3) The vividness extension should be smoothly applied to avoid any color discontinuity, especially when the source color is near the source cusp.

Figure 3 illustrates the extension of the source color in the vividness direction. This non-linear vividness extension follows the equation:

$$\overline{EP'}=\overline{EP_{hh}}\times\left(\frac{\overline{EP}}{\overline{EP_h}}-k\times v\times\mu\right)\tag{2}$$
$$v=\frac{\overline{EP_b}}{\overline{EP_w}},\qquad k=\frac{\overline{PP_h}}{\overline{P_lP_h}}\qquad\text{and}\qquad\mu=1-\frac{C_{P'}}{C_{CUSP}}\tag{3}$$
where Pb and Pw are the black and the white in the source gamut. The value v is the offset that ensures a black-to-black mapping, i.e. the source black (Pb) is mapped to the destination black (E). However, this offset alone would mean that the source white (Pw) is no longer mapped onto the destination white. A scaling factor k is therefore introduced to ensure a successful white-to-white mapping; it is defined as the length of PPh divided by the total length of PlPh, so that k is zero for colors on the upper boundary. CP′ and CCUSP are the chroma values of P′ and of the cusp of the newly constructed gamut. The factor μ is adopted for colors near the cusp to map them onto the upper boundary of the destination gamut.

Fig. 3 Illustration of vividness extension. The source color P is mapped to P′ along the vividness direction. E is the focal point (black) in the destination gamut. Pl is the intersection of EP with the lower source boundary. Ph is the intersection of EP with the upper source boundary. Phh is the intersection of EP with the upper destination boundary.

After Step 2, the upper boundary of the source gamut is mapped to that of the destination gamut and the source black is mapped on to the destination black. However, the lower boundary of the source gamut is not mapped to that of the destination gamut. This is clearly shown in Fig. 4.

Fig. 4 Illustration of chroma extension. E is the mapping center on the lightness axis that has the same lightness value as P. Pl is the intersection of EP with the source gamut boundary after Step 2 and Pd is the intersection of EP with the destination boundary. Pout is the final output.

Step 3: Chroma extension

Step 2 extends the vividness of the source colors. However, after Step 2 a portion of the destination gamut is not fully utilized in the lower region, where the lightness value is smaller than that of the cusp. Therefore, Eq. (4) is adopted to perform a chroma extension for colors outside the 60% region of the source gamut, and Fig. 4 illustrates this process.

$$\overline{EP_{out}}=\begin{cases}\overline{EP}, & \overline{EP}\le 0.6\times\overline{EP_l}\\[4pt] 0.6\times\overline{EP_l}+\dfrac{\overline{EP}-0.6\times\overline{EP_l}}{0.4\times\overline{EP_l}}\times\left(\overline{EP_d}-0.6\times\overline{EP_l}\right), & \overline{EP}>0.6\times\overline{EP_l}\end{cases}\tag{4}$$

Equation (4) is an inverse version of Eq. (1) that extends the chroma of the source colors. An essential requirement here is that the chroma extension should not result in any color discontinuity caused by the enlarged chroma differences. Hence, a 60% knee function is applied to make the chroma extension smooth. This is quite different from the case of gamut compression, where the color difference is always smaller after the compression.
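Under the same one-dimensional reading as Step 1, the Step 3 extension mirrors the compression knee but stretches toward the destination boundary. Again a hedged sketch of Eq. (4), with illustrative names (`d` for |EP|, `d_src` for |EPl|, `d_dst` for |EPd|), not the authors' implementation:

```python
def knee_extend(d, d_src, d_dst, knee=0.6):
    """60% knee-function chroma extension.

    d:     distance |EP| from the mapping center E to the color
    d_src: distance |EPl| to the source gamut boundary along the line
    d_dst: distance |EPd| to the destination gamut boundary
    Colors within knee*d_src keep their position; beyond the knee they
    are stretched linearly so the source boundary maps to the
    destination boundary, avoiding any color discontinuity.
    """
    t = knee * d_src
    if d <= t:
        return d
    return t + (d - t) / (d_src - t) * (d_dst - t)
```

A color on the source boundary (d = d_src) maps exactly to the destination boundary, and the mapping is continuous and monotonic across the knee point.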

3. Experimental

Several standards have been specially developed for wide color gamut (WCG) applications including DCI-P3 [19], Rec. 2020 [20], Rec. 2100 [21] etc. DCI-P3 is possibly the most widely adopted standard in both the TV and mobile phone industries and is a promising replacement for sRGB [22]. sRGB is the most commonly used RGB space and has also been the de facto standard during the last two decades in nearly all fields in the imaging industry.

In this study, an NEC PA302W display (30-inch LCD) was used to investigate gamut extension. An sRGB gamut simulated using the display was adopted as the source gamut and the display itself was designated as the destination gamut. As shown in Fig. 5, the display gamut was much wider, covering the whole sRGB gamut, and was approximately 11% larger than the DCI-P3 gamut in the CIE 1976 u'v' diagram. Different hue planes for the source gamut (sRGB) and the destination gamut (display) in CIELAB color space are illustrated in Fig. 6. As shown, all colors in the sRGB gamut were achievable on the display.
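The 11% figure depends on the display's measured primaries, which are not reproduced here, but the relative sizes of the standard gamuts can be checked from their published CIE xy primaries. A small sketch comparing the sRGB and DCI-P3 triangles in the u'v' diagram (primaries are the standard values; the display itself is not modeled):

```python
def xy_to_upvp(x, y):
    """Convert CIE 1931 xy chromaticity to CIE 1976 u'v' coordinates."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

def triangle_area(pts):
    """Shoelace formula for the area of a three-primary gamut triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard xy primaries: sRGB (IEC 61966-2-1) and DCI-P3
srgb = [xy_to_upvp(*p) for p in [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]]
p3 = [xy_to_upvp(*p) for p in [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]]

ratio = triangle_area(p3) / triangle_area(srgb)  # DCI-P3 is roughly 25% larger
```

The same calculation with a display's measured primaries would give its coverage relative to DCI-P3.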

Fig. 5 The comparison of the sRGB, DCI-P3 and the display gamuts in the CIE 1976 u'v' diagram.

Fig. 6 Different hue planes for the source gamut (sRGB) and the destination gamut (display), as well as a vertical view of both gamuts.

3.1 Psychophysical experiment

The whole experiment was carried out following the evaluation guidelines provided by CIE Publication 159 [18]. The display was located in a darkened room with a wall reflectance of approximately 4%. Its peak luminance was set at 110 cd/m2 with CIE Illuminant D65 chromaticity. The spatial uniformity was evaluated by dividing the display into 3 by 3 segments; the mean color difference between the center and each segment was 1.21 ΔEab*. The repeatability was 0.92 ΔEab*, averaged from measurements of the display white over a period of 6 hours at approximately 20-second intervals. The GOG model [23] was implemented for display characterization and gave a mean performance of 0.64 ΔEab*, with a range from 0.37 ΔEab* to 1.66 ΔEab*, calculated from the 24 colors of the Macbeth ColorChecker chart.
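For readers unfamiliar with it, the GOG (gain-offset-gamma) characterization applies a per-channel nonlinearity followed by a 3×3 matrix built from the measured primaries. The sketch below shows the general form only; the gain, offset, gamma and matrix values for this display are measurement results not given in the text:

```python
def gog_channel(d, gain, offset, gamma, d_max=255):
    """Gain-offset-gamma transfer for one channel: digital count -> linear
    scalar in [0, 1], clipped at zero before exponentiation."""
    x = gain * (d / d_max) + offset
    return max(x, 0.0) ** gamma

def characterize(rgb_digital, gains, offsets, gammas, primary_matrix):
    """Digital RGB -> XYZ: per-channel GOG, then a 3x3 primary matrix
    (rows of the matrix come from the measured primary tristimulus values)."""
    lin = [gog_channel(d, g, o, gm)
           for d, g, o, gm in zip(rgb_digital, gains, offsets, gammas)]
    return [sum(m * c for m, c in zip(row, lin)) for row in primary_matrix]
```

Fitting the gain, offset and gamma per channel to ramp measurements, as done in display characterization practice, would yield the ΔEab* accuracy figures reported above.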

Six GEAs were included in this study: the Zamir et al. spatial GEA [9], the Ward et al. hybrid color mapping (HCM) [5], the Laird et al. high chroma boost (HCB) algorithm [4], a same driving signal (SDS) method (the RGB values of each pixel in the reproduction were the same as those in the source image), a true color mapping (TCM) method (the XYZ values of each pixel in the reproduction were the same as those in the source image), and our proposed VE method. All these algorithms were introduced in Section 1 and each gave good performance in its own evaluation. Note that the HCB method was applied in the CIE 1976 u'v' chromaticity diagram, and TCM and SDS were implemented without a uniform color space.

Two UCSs were investigated: CAM02-UCS [24], generally regarded as the most uniform color space, and a newly proposed UCS, Jzazbz [25]. CAM02-UCS is a UCS derived by adjusting the parameters of the CIE-endorsed CIECAM02 color appearance model [26]. The Jzazbz UCS is designed for displaying HDR and WCG content and represents a trade-off between uniformity and hue linearity; it is expected to give good performance in image processing applications. Both UCSs were included to test their performance in gamut extension applications.

Figure 7 shows the test images included in this study. They cover different contents, including some typical scenes for display applications. Memory colors [27] including skin, grass, blue sky, fruits, and sea were investigated in Images 7(2), 7(3), 7(4) and 7(6). These are important in image rendering for achieving a "lifelike" appearance under a range of practical viewing conditions, as judged in terms of color and image quality. Building colors were investigated in Image 7(1) and colors with high chroma were investigated in Image 7(5). Some images were taken in the open air, such as Images 7(4), 7(5) and 7(6); these aimed to test the change of dynamic range when processed using a GEA. Image 7(7) was a still image from a popular computer game and was adopted to test the performance of the GEAs for highly chromatic colors. Image 7(1) was repeated to reveal intra-observer variability. Thus, in total, there were 6 GEAs (Zamir, HCM, HCB, TCM, SDS and the present VE method), 2 color spaces (CAM02-UCS, Jzazbz) and 8 images (seven images plus one repeated image) in this study. Since the choice of color space was not applicable to the HCM, TCM and SDS methods, there were 9 GEA-UCS pairs, i.e. 3 (GEAs) × 2 (UCSs) + 3 (GEAs), and 72 images, i.e. 9 (GEA-UCS pairs) × 8 (images), generated in total. Each image included a 5% white border to ensure a fixed adapting white point.

Fig. 7 The seven test images: (1) Building, (2) Lover, (3) Fruit, (4) Bay, (5) Garden, (6) Spring, and (7) Animation. Image 7(1) was repeated.

Eighteen observers, ten males and eight females, with a mean age of 24.8 years and a standard deviation of 3.5 years, participated in this experiment. Figure 8 shows the experimental set-up with two reproductions displayed side-by-side. The background was set at middle grey with an L*a*b* value of [50, 0, 0], which was first calculated using the GOG model and further refined using the measured result. Observers sat at a distance of 70 cm from the display, so the field of view was approximately 70°. Observers were asked to choose which reproduction was preferred. Each observer made C(9,2) = (9 × 8)/2 = 36 pair-wise comparisons per image. In total, 5184 comparisons, i.e. 36 (comparisons) × 8 (images) × 18 (observers), were made.

Fig. 8 The experimental setup.

4. Results and discussion

4.1 Inter- and intra-observer variability

The wrong decision (WD) measure [28] was adopted to reveal both inter- and intra-observer variability. For intra-observer variability, WD was defined as the total number of wrong decisions divided by the total number of choices, where a wrong decision occurred when two repeated assessments disagreed with each other (for example, the first answer might be 'yes' and the second 'no'). For inter-observer variability, WD was calculated as the wrong percentage averaged over all comparison pairs, where the wrong percentage was interpreted as the probability of making a wrong decision, i.e. the number of minority choices divided by the total number of choices. A lower WD indicates better observer performance.
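The two WD measures described above can be sketched as follows; the function and variable names are illustrative, not from the original analysis:

```python
def intra_wd(first_pass, second_pass):
    """Intra-observer wrong decisions (%): fraction of repeated
    judgments that disagree between the two passes."""
    wrong = sum(a != b for a, b in zip(first_pass, second_pass))
    return 100.0 * wrong / len(first_pass)

def inter_wd(per_comparison_choices):
    """Inter-observer wrong decisions (%): for each comparison, the
    minority share of the observers' choices, averaged over comparisons."""
    pcts = []
    for choices in per_comparison_choices:
        top = max(choices.count(c) for c in set(choices))
        pcts.append(100.0 * (len(choices) - top) / len(choices))
    return sum(pcts) / len(pcts)
```

For instance, if three of four observers pick the same reproduction for a given pair, the minority share (and hence that pair's wrong percentage) is 25%.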

Table 1 summarizes both intra- and inter-observer variability; the mean values were 25 and 32, respectively. These values were slightly higher than those derived from the Part I gamut compression experiment [1], where they were 17 and 25, respectively. In the compression experiment, observers were asked to choose which reproduction was more similar to the original, and the original image was presented in the middle with a reproduction on each side. The higher variability here can be considered reasonable, since no reference was presented in this experiment and 'preference' is a more personal concept than 'fidelity', varying from person to person.

Table 1. The % Wrong Decisions representing the intra-observer and inter-observer variability.

4.2 Result of the psychophysical experiment

Z-score values were calculated following Case V of Thurstone’s Law of Comparative Judgment [29]. The results were reported in terms of z-scores (unit normal deviates) by referring to the area under the normal distribution curve to rank all the GEA-UCS pairs studied. A higher number means a better GEA-UCS pair performance.

The standard deviation of the z-score values was assumed to be σ = 1/√2 and the 95% confidence interval (CI) of a z-score value A can therefore be calculated as Eq. (5):

$$CI = A \pm 1.96\times\frac{\sigma}{\sqrt{N}} = A \pm \frac{1.96}{\sqrt{2N}}\tag{5}$$
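The z-score and confidence-interval computation described above can be sketched in a few lines; the clipping of unanimous proportions is a common convention for Case V scaling and is an assumption here, not something stated in the text:

```python
from math import sqrt
from statistics import NormalDist

def z_score(times_preferred, n_judgments):
    """Thurstone Case V: convert a preference proportion into a unit
    normal deviate via the inverse cumulative normal."""
    p = times_preferred / n_judgments
    p = min(max(p, 0.005), 0.995)  # clip unanimous results (assumed convention)
    return NormalDist().inv_cdf(p)

def confidence_interval(z, n, sigma=1 / sqrt(2)):
    """95% confidence interval of a z-score: z +/- 1.96*sigma/sqrt(n),
    with sigma assumed to be 1/sqrt(2)."""
    half = 1.96 * sigma / sqrt(n)
    return z - half, z + half
```

A pair preferred in exactly half of the judgments receives z = 0, i.e. it sits at the baseline, and more strongly preferred pairs receive positive deviates.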

Figure 9 summarizes the overall results in terms of model performance for both GEAs and UCSs. It is clear that the VE algorithm outperformed all the other GEAs tested, indicating that vividness is an effective color appearance scale for gamut extension. A higher vividness value led to a higher image preference, implying that a simultaneous lightness and chroma extension can give a more preferred reproduction. In other words, preference is not related only to chroma: for colors in a low chroma region, a boost in lightness was better than a boost in chroma.

Fig. 9 The overall result of the GEA experiment: z-score is plotted for each GEA and for each UCS.

Table 2 and Fig. 10 report the performance of each GEA and UCS combination. The VE algorithm always had values above zero except for Image 7(6) when combined with CAM02-UCS, indicating that VE outperformed SDS consistently (note that SDS represents the baseline). These results show the superiority of the VE algorithm: it can be safely applied to all types of image without risking image quality deterioration. The other GEAs do not provide such consistency, i.e. their performance depends very much on image content.

Table 2. The ranking of GEA and UCS performance based on 7 images (1 = best, 9 = worst, C = CAM02-UCS, J = Jzazbz)

Fig. 10 The performance of each GEA and UCS combination for each individual image. The results are scaled such that SDS (same driving signal) has a value of 0 and, hence, results for that algorithm are not shown.

Among all the GEAs investigated, the same driving signal (SDS) method seems the most controversial. SDS fully utilized the entire destination gamut and sometimes gave an appealing image impression due to its strong chroma enhancement. However, the hues were shifted substantially, especially in images containing blue sky. Some observers considered such a hue shift artistic and attractive, while others considered it unnatural. Similar results were obtained for the HCM method, which served as a compromise between true color mapping (TCM) and SDS; hence, its performance was a trade-off between those two algorithms. In this study, it was found that both the SDS and HCM methods performed well for outdoor images and animations such as Images 7(1) (Building), 7(5) (Garden), 7(6) (Spring) and 7(7) (Animation), which included objects such as flowers, walls, and some unnatural colors as found in cartoons. However, for memory colors such as vegetables, skin, sky and grass, TCM was preferred over the HCM and SDS methods. This was quite obvious in Image 7(3) (Fruit).

The Zamir et al. method was a spatial GEA and aimed to enlarge the perceptual image contrast. However, some obvious defects were observed including halos in regions where color gradients were large, as illustrated in Fig. 11, and a slight hue shift in the blue region.

Fig. 11 Halos found in areas adjacent to the roof. Left is the original image and right is the reproduction processed with the Zamir et al. algorithm.

The HCB method shared a similar goal: to preserve memory colors in low chroma regions while boosting colors with high chroma values. However, it was somewhat difficult to define a particular memory color region. For example, red flowers had a similar color to that of a tomato, but the flowers needed a chroma boost while the tomato was better reproduced by preserving its chroma.

The proposed VE method adopted the concept of vividness, which was shown to comply with human perception of color appearance. It outperformed all the other GEAs studied for nearly all kinds of image content. The only exception was Image 7(3) (Fruit), which contained many memory colors; for that image, the Zamir et al. GEA performed best owing to its limited chroma enhancement, and the VE algorithm was ranked second since its vividness boost was slightly greater than the observers' expectation.

For the different UCSs, the newly proposed Jzazbz space gave a slightly better performance than CAM02-UCS, suggesting that hue linearity is important in the field of gamut extension, although the difference between the two spaces was not significant. Note that this result differs from that of the compression experiment in Part I of this paper series, where no single UCS was better than any other. This is reasonable, however, since the degree of hue shift is related to chroma: higher chroma leads to a greater hue shift. Hence, hue shifts are easier to detect in a GEA experiment than in a GCA experiment, implying that a UCS that balances hue linearity and uniformity is preferable for gamut extension.

4.3 Image dependency

As introduced in Part I of this paper series [1], one of the major limitations of current gamut mapping algorithms is that their performance depends not only on the source and destination gamuts but also on image content. This means that even if an algorithm performs best for some images, it may still fail badly for others. Hence, a good algorithm should have a consistent performance, e.g. in terms of z-scores, across different test images. This is the so-called effect of image dependency.

Figure 12 demonstrates the image dependency in terms of the different GEAs tested. The image dependency was evaluated using the standard deviation of the z-score values for the different images investigated. A smaller image dependency indicates a more consistent model performance when applied to images having different contents.
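The image-dependency measure just described reduces to a standard deviation over per-image z-scores. A minimal sketch (whether the population or sample form was used is not stated, so the population form is assumed here, and the z-score values below are purely illustrative):

```python
from statistics import pstdev

def image_dependency(z_scores_per_image):
    """Image dependency of one GEA-UCS pair: the standard deviation of
    its z-scores across the test images (smaller = more consistent)."""
    return pstdev(z_scores_per_image)

# Hypothetical z-scores for two algorithms over four images:
consistent = image_dependency([0.30, 0.35, 0.25, 0.30])    # low spread
dependent = image_dependency([0.80, -0.40, 0.60, -0.20])   # high spread
```

An algorithm whose z-scores barely vary across images (the first list) would plot low in Fig. 12, whereas one that swings between positive and negative scores (the second list) would plot high.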

Fig. 12 Image dependency for different GEAs and UCSs.

It can be seen that the image dependency was highly related to the GEAs but had little relation to the UCSs. HCB gave the least image dependency, followed by the Zamir et al. algorithm and the present VE method. The TCM and SDS methods showed the largest image dependency, indicating that the performance of these two algorithms varied considerably with image content. This accords with the earlier results, where SDS and TCM were reported as the most controversial. Jzazbz and CAM02-UCS showed a similar image dependency for all the GEAs tested.

The image dependency results for the GEA experiment were smaller than those for the GCA experiment (around 0.24 for VE in extension versus around 0.38 for VP in compression), indicating that a GEA performing well on one image was more likely to perform well on another image with different content. Thus it is possible for a GEA to be universally acceptable. GCAs, however, had a larger image dependency, indicating a greater possibility of ambiguous performance for images with different contents.

5. Conclusion

This part of the paper series describes a method to test and develop gamut extension algorithms. A new gamut extension algorithm, VE, was successfully developed based on the new color appearance scale, vividness. It was compared with five other commonly used GEAs and was found to give improved performance over all the GEAs tested. In addition, the newly proposed UCS, Jzazbz, was investigated together with CAM02-UCS and its performance was consistently better, indicating that hue linearity is an important aspect in gamut mapping.

From the outcome of this paper series, we have found that a vividness scale is an effective appearance scale for both gamut compression and extension and is of interest for image processing applications.

References and links

1. L. Xu, B. Zhao, and M. R. Luo, “Color Gamut Mapping: PART I. Gamut Compression,” Opt. Express 26(9), 11481–11495 (2018). [CrossRef]   [PubMed]  

2. J. Morovic, Color Gamut Mapping (John Wiley & Sons, 2008).

3. M. Sakurai, T. Nakatsue, Y. Shimpuku, R. L. Heckaman, and M. D. Fairchild, “Evaluation of Gamut-Expansion Algorithms for Wide-Gamut Display,” in SID Symposium Digest of Technical Paper (2009), pp. 1006–1009. [CrossRef]  

4. J. Laird, R. Muijs, and J. Kuang, “Development and evaluation of gamut extension algorithms,” Color Res. Appl. 34(6), 443–451 (2010). [CrossRef]  

5. G. Ward, H. Yoo, A. Soudi, and T. Akhavan, “Exploiting Wide-gamut Displays,” in Proceedings of Color Imaging Conference (2016), pp. 163–167.

6. G. Song, “Skin Color Region Protect Algorithm for Color Gamut Extension,” J. Inf. Comput. Sci. 11(6), 1909–1916 (2014). [CrossRef]  

7. Y. Li, G. Song, and H. Li, “A multilevel gamut extension method for Wide Gamut Displays,” in Proceedings of Electric Information and Control Engineering (2011), pp. 1035–1038.

8. S. Nakauchi, S. Hatanaka, and S. Usui, “Color gamut mapping based on a perceptual image difference measure,” Color Res. Appl. 24(4), 280–291 (2010). [CrossRef]  

9. S. W. Zamir, J. Vazquezcorral, and M. Bertalmío, “Gamut extension for cinema: psychophysical evaluation of the state of the art and a new algorithm,” in proceedings of Human Vision and Electronic Imaging (2015), p. 93940U.

10. Commission Internationale de l’Éclairage (CIE), “Recommendations on Uniform Color Spaces, Color Difference Equations, Psychometrics Color Terms,” in Supplement No. 2 of CIE Publication No. 15 (1971).

11. S. W. Zamir, J. Vazquez-Corral, and M. Bertalmio, “Gamut Mapping in Cinematography Through Perceptually-Based Contrast Modification,” IEEE J. Sel. Top. Signal Process. 8(3), 490–503 (2014). [CrossRef]  

12. M. Bertalmío, V. Caselles, E. Provenzi, and A. Rizzi, “Perceptual color correction through variational techniques,” IEEE Trans. Image Process. 16(4), 1058–1072 (2007). [CrossRef]   [PubMed]  

13. P. Zolliker and K. Simon, “Retaining Local Image Information in Gamut Mapping Algorithms,” IEEE Trans. Image Process. 16(3), 664–672 (2007). [CrossRef]   [PubMed]  

14. M. R. Luo, A. A. Clarke, P. A. Rhodes, A. Schappo, S. A. R. Scrivener, and C. J. Tait, “Quantifying color appearance. Part I. Lutchi color appearance data,” Color Res. Appl. 16(3), 166–180 (2010). [CrossRef]  

15. R. S. Berns, “Extending CIELAB: Vividness,Vab*Dab*Tab*,” Color Res. Appl. 39(4), 322–330 (2014). [CrossRef]  

16. Y. J. Cho, L. C. Ou, and R. Luo, “A Cross-Cultural Comparison of Saturation, Vividness, Blackness and Whiteness Scales,” Color Res. Appl. 42(2), 203–215 (2016). [CrossRef]  

17. Y. J. Cho, L. C. Ou, G. Cui, and R. Luo, “New Color Appearance Scales for Describing Saturation, Vividness, Blackness, and Whiteness,” Color Res. Appl. 42(5), 552–563 (2017). [CrossRef]  

18. Commission Internationale de l’Éclairage (CIE), “Guidelines for the Evaluation of Gamut Mapping Algorithms,” in CIE Publication No.156 (2003).

19. Society of Motion Picture and Television Engineers (SMPTE), “Digital Cinema Quality - Reference Projector and Environment,” in SMPTE RP 431–2 (2011).

20. International Telecommunication Union (ITU), “Parameter values for ultra-high definition television systems for production and international programme exchange,” in ITU-R Recommendation BT.2020 (2012).

21. International Telecommunication Union (ITU), “Image parameter values for high dynamic range television for use in production and international programme exchange,” in ITU-R Recommendation BT. 2100 (2017).

22. International Electrotechnical Commission (IEC), “Multimedia Systems and Equipment-Color Measurement and Management-Part 2-1: Color Management-Default RGB Color Space-sRGB,” in IEC 61966–4 (1999).

23. R. S. Berns, “Methods for Characterizing CRT Displays,” Displays 16(4), 173–182 (1996). [CrossRef]  

24. M. R. Luo, G. Cui, and C. Li, “Uniform Color Spaces Based on CIECAM02 Color Appearance Model,” Color Res. Appl. 31(4), 320–330 (2006). [CrossRef]  

25. M. Safdar, G. Cui, Y. J. Kim, and M. R. Luo, “Perceptually Uniform Color Space for Image Signals Including High Dynamic Range and Wide Gamut,” Opt. Express 25(13), 15131–15151 (2017). [CrossRef]   [PubMed]  

26. N. Moroney, M. D. Fairchild, R. W. G. Hunt, C. Li, M. R. Luo, and T. Newman, “The CIECAM02 Color Appearance Model,” in Color and Imaging Conference (2002), pp. 23–27.

27. C. J. Bartleson, “Memory colors of familiar objects,” J. Opt. Soc. Am. 50(1), 73–77 (1960). [CrossRef]   [PubMed]  

28. K. McLaren, “An Introduction to Instrumental Shade Passing and Sorting and a Review of Recent Developments,” Color. Technol. 92(9), 317–326 (2010).

29. L. L. Thurstone, “A Law of Comparative Judgment,” Psychol. Rev. 34(4), 273–286 (1927). [CrossRef]  


Figures (12)

Fig. 1 Illustration of the vividness direction. Points along a vector away from black are said to have different vividness values.

Fig. 2 Mapping towards the lightness axis. P is the source color and P′ is the mapped color after Step 1. P_b and P_b′ are the intersections of line EP with the source gamut and the newly constructed gamut, respectively. E is the mapping center on the lightness axis that has the same lightness value as point P.

Fig. 3 Illustration of vividness extension. The source color P′ is mapped to P″ along the vividness direction. E is the focal point (black) in the destination gamut. P_l is the intersection of EP′ with the lower source boundary, P_h is the intersection of EP′ with the higher source boundary, and P_hh is the intersection of EP′ with the higher destination boundary.

Fig. 4 Illustration of chroma extension. E is the mapping center on the lightness axis that has the same lightness value as P″. P_l is the intersection of EP″ with the source gamut after Step 2 and P_d is the intersection of EP″ with the destination boundary. P_out is the final output.

Fig. 5 Comparison of the sRGB, DCI-P3 and display gamuts in the CIE 1976 u′v′ diagram.

Fig. 6 Different hue planes for the source gamut (display gamut) and the destination gamut (sRGB), together with a vertical view of both gamuts.

Fig. 7 The seven test images: (1) Building, (2) Lover, (3) Fruit, (4) Bay, (5) Garden, (6) Spring, and (7) Animation. Image (1) was repeated.

Fig. 8 The experimental setup.

Fig. 9 The overall results of the GEA experiment: z-scores plotted for each GEA and each UCS.

Fig. 10 The performance of each GEA and UCS combination for each individual image. The results are scaled such that SDS (same driving signal) has a value of 0; results for that algorithm are therefore not shown.

Fig. 11 Halos found in areas adjacent to the roof. Left: the original image; right: the reproduction processed with the Zamir et al. algorithm.

Fig. 12 Image dependency for the different GEAs and UCSs.

Tables (2)

Table 1 The % wrong decisions representing the intra-observer and inter-observer variability.

Table 2 The ranking of GEA and UCS performance based on the 7 images (1 = best, 9 = worst; C = CAM02-UCS, J = Jzazbz).

Equations (5)

$$\overline{EP'} = \begin{cases} \overline{EP}, & \overline{EP} \le 0.6 \times \overline{EP_b'} \\ 0.6 \times \overline{EP_b'} + \dfrac{\overline{EP} - 0.6 \times \overline{EP_b'}}{\overline{EP_b} - 0.6 \times \overline{EP_b'}} \times 0.4 \times \overline{EP_b'}, & \overline{EP} > 0.6 \times \overline{EP_b'} \end{cases} \tag{1}$$

$$\overline{EP''} = \overline{EP_{hh}} \times \left( \frac{\overline{EP'}}{\overline{EP_h}} \right)^{k \times v \times \mu} \tag{2}$$

$$v = \frac{\overline{EP_b}}{\overline{EP_w}}, \quad k = \frac{\overline{P'P_h}}{\overline{P_l P_h}} \quad \text{and} \quad \mu = 1 - \frac{C_{P'}}{C_{CUSP}} \tag{3}$$

$$\overline{EP_{out}} = \begin{cases} \overline{EP''}, & \overline{EP''} \le 0.6 \times \overline{EP_l} \\ 0.6 \times \overline{EP_l} + \dfrac{\overline{EP''} - 0.6 \times \overline{EP_l}}{0.4 \times \overline{EP_l}} \times \left( \overline{EP_d} - 0.6 \times \overline{EP_l} \right), & \overline{EP''} > 0.6 \times \overline{EP_l} \end{cases} \tag{4}$$

$$CI = A \pm 1.96\,\frac{\sigma}{\sqrt{N}} = A \pm \frac{1.96}{\sqrt{N}} \times \frac{\sqrt{2}}{2} \tag{5}$$
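Eqs. (1) and (4) are 60% knee functions: distances from the mapping center within 60% of the relevant boundary distance are left unchanged, and the remaining interval is linearly rescaled onto the new boundary. The sketch below illustrates the extension form of Eq. (4); it is an illustrative implementation with my own function and parameter names, not the authors' code:

```python
def knee_extend(ep, ep_l, ep_d, knee=0.6):
    """Eq. (4)-style knee extension along a mapping line from center E.

    ep   : distance of the color from E
    ep_l : distance of the source-gamut boundary from E
    ep_d : distance of the destination-gamut boundary from E

    Distances up to knee*ep_l are preserved; the interval
    [knee*ep_l, ep_l] is linearly stretched onto [knee*ep_l, ep_d].
    """
    k = knee * ep_l
    if ep <= k:
        return ep
    return k + (ep - k) / ((1.0 - knee) * ep_l) * (ep_d - k)

# The source boundary maps exactly onto the destination boundary,
# and colors inside the knee are untouched:
print(knee_extend(10.0, 10.0, 14.0))  # 14.0 (boundary -> boundary)
print(knee_extend(6.0, 10.0, 14.0))   # 6.0  (inside the knee, unchanged)
```

The compression of Eq. (1) has the same structure with the roles of the two gamut boundaries swapped, so low-chroma (often memory) colors are protected in both directions.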
