
Perceptually uniform color space for image signals including high dynamic range and wide gamut

Open Access

Abstract

A perceptually uniform color space has long been desired for a wide range of imaging applications. Such a color space should represent a color pixel using three unique and independent attributes (lightness, chroma, and hue), be perceptually uniform over a wide gamut, be linear in iso-hue directions, and predict both small and large color differences as well as lightness in high dynamic range environments. It should also have minimal computational cost for real-time or quasi-real-time processing. Presently available color spaces do not achieve these goals satisfactorily and comprehensively. In this study, a uniform color space is proposed and its performance in predicting a wide range of experimental data is presented in comparison with other state of the art color spaces.

© 2017 Optical Society of America

1. Introduction

The display industry is evolving from ‘standard’ color gamut and dynamic range towards wide color gamut (WCG) and high dynamic range (HDR), along with higher spatial resolution. The brightness and contrast of display devices have improved in recent years and are expected to improve further in the future. Currently, many consumer displays can achieve a peak luminance of 500 cd/m² and some high-end displays available in the market are able to reach a peak luminance above 1000 cd/m² [1]. The achievable luminance range is expected to increase further in the near future. A psychophysical study conducted at Dolby Laboratories suggested that an approximate dynamic range to satisfy 90% of viewers is from 0.005 cd/m² to 3000 cd/m² for the diffuse white [2]. Dolby’s highlight study also suggested that the upper level of the luminance range should be at least 10,000 cd/m² for preferred highlight reproduction [2]. The International Telecommunication Union (ITU) has recommended BT.2020 (or Rec.2020) primaries for the next generation ultrahigh definition (UHD) broadcasting system and its format [1]. The Rec.2020 gamut is wider than display color encodings such as ITU-R BT.709, sRGB, Adobe RGB, or DCI-P3, and covers 99.9% of the Pointer gamut (naturally occurring colors) [1,3]. A color space is desired that is perceptually uniform in a wide gamut such as Rec.2020, can predict a wide range of lightness, and has minimal inter-dependence between its perceptual attributes (lightness, chroma, and hue).

Some applications of a perceptually uniform color space are discussed here. The performance of image processing algorithms (e.g., gamut mapping, lossy image compression, image enhancement, image segmentation, image denoising, etc.) can be enhanced by using a perceptually uniform color space as the color encoding [4]. For example, a study of image compression algorithms using different color spaces found that a more uniform color space can achieve better image compression performance [5]. If the color encoding is perceptually uniform, the color difference between two images of the same scene but different quality or appearance can be calculated using the Euclidean distance between correlates of the color space by employing spatial filtering [6]. There are many more applications of a uniform color space such as image quality modeling [7], image color appearance modeling [8], device characterization [9], color rendering metrics [10], etc.
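As a minimal illustration of the last point (a sketch of ours, not code from the paper; the function name is hypothetical), the per-pixel color difference between two images that are already encoded in a perceptually uniform color space reduces to a Euclidean norm over the three correlates; any spatial filtering would be applied to the images before this step.

```python
import numpy as np

def delta_e_map(img_a, img_b):
    """Per-pixel color difference between two images of the same scene,
    both given as (H, W, 3) arrays in a perceptually uniform color space.
    The Euclidean distance over the three correlates is the difference."""
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    return np.linalg.norm(a - b, axis=-1)
```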

A color space in which equal distances are intended to represent threshold or suprathreshold perceived color differences of equal size is referred to as a uniform color space [11]. In 1976, the Commission Internationale de l’Éclairage (CIE) recommended two different uniform color spaces named CIELAB and CIELUV to predict perceptual color differences for reflective and self-illuminant colors, respectively [12]. Later it was found that the Euclidean distances between correlates of CIE recommended color spaces do not correspond to perceptual color differences and a number of non-Euclidean color difference formulas (e.g., ΔECMC [13], ΔE94 [14], ΔE00 [15] etc.) were developed. Although such formulas can accurately predict perceptual color differences, they cannot be used for perception-based image processing because they do not have an associated color space. In image applications, a color space in which the Euclidean metric provides a color-difference formula would therefore be convenient, even if the formula does not accurately predict perceived color differences [4].

Considerable efforts have been made to develop a uniform color space which can predict perceptual color differences based on the Euclidean distance between its correlates with minimum inter-dependence between them. Hung and Berns [16] and Ebner and Fairchild [17] conducted different experiments to scale constant perceived hue and found that existing models could not avoid inter-dependence between chroma and hue and are not suitable for gamut mapping. The uniform color space IPT was developed in 1998 with the intention of providing improved hue linearity [18]. Lissner and Urban [4] developed a number of empirical color spaces with the idea of transforming an initial color space to new correlates that represent a better uniform color space. They used the Hung and Berns data [16] as a reference of constant perceived hue and the ΔECMC [13], ΔE94 [14], and ΔE00 [15] formulas as measures of perceptual uniformity. They proposed a number of different models, by achieving a tradeoff between perceptual uniformity and hue linearity, to be used depending on the application. Their transformation method was based on lookup tables and interpolation, which increases the computational cost, and the corresponding lookup table data are always needed to implement the color space. The lookup tables were computed for a limited range of lightness (CIE L* ≤ 100), which limits the scope of the developed color spaces, and the resulting spaces are not analytically invertible, which is a disadvantage.

Luo et al. [19] proposed three uniform color spaces based on attributes of the CIE standard color appearance model CIECAM02. They developed three color spaces to predict small color differences, large color differences, and a combination of both, called CAM02-SCD, CAM02-LCD, and CAM02-UCS, respectively. CIECAM02 and CAM02-UCS have been widely used in many applications. There have, however, been reports of unexpected computational failures in CIECAM02 (and hence in CIECAM02-based uniform color spaces) [20]. Revised versions of CIECAM02 and CAM02-UCS (named CAM16 and CAM16-UCS, respectively) have recently been proposed to solve this problem without affecting their performance [20]. Although CAM16-UCS offers good overall perceptual uniformity, it does not preserve hue linearity, particularly in the blue hue region, and is computationally expensive compared with almost all other available models [21]. In addition, none of the above mentioned color spaces were explicitly developed for high dynamic range applications [22].

During recent years, researchers have focused on perception-based encoding of high dynamic range image signals. Fairchild and Wyble [22,23] modified CIELAB and IPT by replacing their power function non-linearity with the Michaelis-Menten equation and optimized their coefficients to predict high dynamic range experimental lightness data (lightness differences above and below diffuse white). They found that the new color spaces (hdr-CIELAB and hdr-IPT) did not perform better than their traditional versions in predicting wide-range lightness and also showed some discrepancies in the prediction of Munsell Chroma. The Society of Motion Picture & Television Engineers (SMPTE) [24] has recommended a non-linear function called the perceptual quantizer (PQ) function to encode high dynamic range luminance. The PQ function (based on the Barten contrast sensitivity function [25]) uses increments of just-noticeable-difference (JND) to avoid visible quantization artifacts, and can uniformly encode a luminance range of 0.001 to 10,000 cd/m². Dolby has recently proposed an HDR encoding space named ICTCP (a previous version was called ICaCb) [26,27]. This color space followed the same structure as IPT [18] but replaced its power function non-linearity with the PQ function. It was claimed that this Dolby model outperformed state of the art HDR encodings proposed by the International Telecommunication Union (ITU) in terms of uniformity and hue linearity. These color spaces need to be tested using a comprehensive and suitable range of experimental data. For example, the reference for perceptual uniformity was the MacAdam ellipses data, which were obtained by a single observer using a light stimulus in a visual colorimeter.
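For orientation, the sketch below shows the PQ non-linearity in both directions using the constants quoted later in this paper (written there as c1 = 3424/2^12, etc.) and the standard SMPTE ST 2084 exponent, i.e., without the 1.7 scaling of the exponent that is introduced in Section 5. It is an illustrative reconstruction of ours, not an excerpt from the standard.

```python
import numpy as np

# PQ constants as quoted in this paper: c1 = 3424/2^12, c2 = 2413/2^7, c3 = 2392/2^7
C1, C2, C3 = 3424 / 2**12, 2413 / 2**7, 2392 / 2**7
N = 2610 / 2**14      # exponent applied to normalized luminance
P = 2523 / 2**5       # standard ST 2084 exponent (scaled by 1.7 in the Jzazbz model)

def pq_encode(luminance):
    """Absolute luminance (0..10000 cd/m^2) -> PQ-encoded signal in [0, 1]."""
    y = np.clip(np.asarray(luminance, dtype=float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**N) / (1.0 + C3 * y**N)) ** P

def pq_decode(signal):
    """PQ-encoded signal in [0, 1] -> absolute luminance in cd/m^2."""
    e = np.asarray(signal, dtype=float) ** (1.0 / P)
    y = np.maximum(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / N)
```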

All of the color spaces referred to above have advantages and disadvantages. A challenge for color scientists is to develop a uniform color space encoding which satisfactorily achieves all the desired objectives, including perceptual uniformity, linearity in iso-hue directions (the blue hue being the most challenging), grey-scale convergence, prediction of small and large color differences, and lightness of pixels with higher luminance than the diffuse white, with minimum computational cost. The current study was intended to address this challenge. The performance of a number of selected color spaces is investigated and a new uniform color space, Jzazbz, is proposed. The performance of the proposed color space in predicting a comprehensive range of experimental data is reported and compared with that of state of the art uniform color spaces.

The following three sections of the paper introduce the criteria for testing the performance of a uniform color space, the corresponding experimental data sets used to evaluate the performance, and a short introduction to the test color spaces investigated in this study. Details of the proposed color space model are then given. Conclusions are drawn after reporting the test results. Finally, appendices are given to provide the reverse model of the proposed color space and to show different experimental data plotted in different test spaces.

2. Test criteria

A number of criteria were considered while developing a perceptually uniform color space for a wide range of applications, and corresponding experimental data were collected to test its performance compared with other spaces. A statistical measure called the standardized residual sum of squares (STRESS) [28] was used to indicate dissimilarity for quantitative analysis of predictions of experimental data in most of the tests. STRESS ranges from 0 to 100, where 0 means 100% agreement. The standard deviation (SD) computed between hue angles of all samples in a color tuple was used to evaluate the prediction of perceived constant hue. Six criteria that are considered to measure the performance of a color space are given below; a computational sketch of several of these metrics follows the list.

  • 1. Perceptual Color Difference prediction is an important property of a uniform color space, where the Euclidean distance between its perceptual attributes can predict perceived color difference. STRESS values were computed between the experimental and the predicted color differences for all sample pairs within a given data set.
  • 2. Perceptual Uniformity of the color spaces was tested in terms of local and global uniformity. Local uniformity means that color discrimination ellipses should be rounder (ideally circles) and global uniformity means that the size of all the ellipses should be similar (ideally equal) to each other. Let N be the number of ellipses in a given data set, and let Ai and Bi represent the major and minor semi-axes of the ith ellipse, respectively. For the local uniformity test, STRESS was computed between a vector containing the ratios ([A1/B1, A2/B2, …, AN/BN]) of the major to minor semi-axis of the given N ellipses and a vector of unity with N entries, i.e., all ellipses become circles. To test global uniformity, STRESS was computed between the areas ([πA1B1, πA2B2, …, πANBN]) of the given N ellipses and a vector of N entries all equal to a constant value represented by the mean area, i.e., all ellipses should be equal in size.
  • 3. Wide-range Lightness prediction was also one of the test criteria i.e., predicting perceptual lightness below and above diffuse white. STRESS was computed between the experimental and predicted lightness for each of the corresponding data sets.
  • 4. Hue Linearity relates to the prediction of experimental iso-hue data. Prediction error may arise for two reasons: (a) if the iso-hue data points do not follow a straight line, or (b) if the iso-hue data points follow a straight line but the line does not converge to the origin (0,0). The standard deviation (SD) was used as a metric to determine the overall hue prediction error. Let there be M color tuples within a given data set. The SD between the hue angles (in units of degrees) predicted by a color space for each color tuple was calculated. The SD was then averaged over the M different color tuples to quantify the overall performance of the test space in predicting constant perceived hue. The smaller the value of SD, the better the performance.
  • 5. Grey-scale Convergence means no inter-dependence between the lightness and chroma attributes of the color space. In an ideal uniform color space the lightness axis should be orthogonal to the chroma axis such that all neutral colors have a chroma of zero. A chroma-ratio (%) metric used for this test is defined below.
    \[
    \text{Chroma ratio} = 100 \times \frac{3\,C_w}{C_r + C_g + C_b}
    \tag{1}
    \]

    where Cw represents the chroma of the CIE D65 white point computed in the test space while Cr, Cg, and Cb represent the chroma of the red, green, and blue vertices (CIE 1931 xy chromaticity) of the sRGB gamut, respectively, also computed in the test space.

  • 6. Computational Cost should be kept to a minimum for practical applications of a color space, especially for real time or quasi-real time image processing.
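The Python sketch below illustrates how the STRESS index of criterion 1, the local and global uniformity measures of criterion 2, and the chroma-ratio of criterion 5 can be computed; it is our illustrative reconstruction (the function names are ours), not code from the paper.

```python
import numpy as np

def stress(de_predicted, dv_visual):
    """Standardized residual sum of squares (STRESS) between predicted color
    differences and visual (experimental) differences [28]; 0 means perfect
    agreement, 100 means no agreement."""
    de = np.asarray(de_predicted, dtype=float)
    dv = np.asarray(dv_visual, dtype=float)
    f = np.sum(de * dv) / np.sum(dv**2)              # optimal scaling factor
    return 100.0 * np.sqrt(np.sum((de - f * dv)**2) / np.sum((f * dv)**2))

def local_uniformity(a_semi, b_semi):
    """Criterion 2 (local): STRESS between the major/minor semi-axis ratios of
    the ellipses and a vector of ones (ideally, all ellipses are circles)."""
    ratios = np.asarray(a_semi, dtype=float) / np.asarray(b_semi, dtype=float)
    return stress(ratios, np.ones_like(ratios))

def global_uniformity(a_semi, b_semi):
    """Criterion 2 (global): STRESS between the ellipse areas and a constant
    vector equal to their mean (ideally, all ellipses have the same size)."""
    areas = np.pi * np.asarray(a_semi, dtype=float) * np.asarray(b_semi, dtype=float)
    return stress(areas, np.full_like(areas, areas.mean()))

def chroma_ratio(c_white, c_red, c_green, c_blue):
    """Criterion 5: grey-scale convergence metric of Eq. (1), in percent."""
    return 100.0 * 3.0 * c_white / (c_red + c_green + c_blue)
```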

Beyond testing for the above criteria, different characteristics of the proposed color space will also be discussed such as the prediction performance of its individual correlates, and representation of the CIE 1931 chromaticity scale and Rec.2020 gamut hull in the uniform color space.

3. Experimental data

A wide range of experimental data were collected to evaluate the performance of the proposed color space compared with other selected color spaces in accordance with the criteria described above. The data were divided into training, reference, and testing sets. The training data were used to derive the model and the testing data to test the models’ performance. The reference data sets were used to analyze the trade-off between uniformity and hue linearity during development of the model.

  • Combined Visual Data (COMBVD) set represents a combination of small color difference data. This data set was used to derive the state of the art color difference formula, ISO/CIE recommended CIEDE2000 [15], and uniform color spaces such as CAM02-UCS [19]. It consists of 3813 pairs of samples with an average color difference of 2.6 in CIELAB units. The COMBVD set consists of four different data sets including RIT-DuPont [29], Witt [30], Leeds [31], and BFD [32], made up of 312, 418, 307, and 2776 pairs of samples, respectively. The COMBVD data set was used as a training data set to derive the proposed uniform color space.
  • Optical Society of America (OSA) data made up of 128 pairs of samples with color differences of large magnitude (about 14 CIELAB units on average) were also used. These data were used by OSA to develop a color difference formula and corresponding uniform color space (called OSA Ljg) [33]. The OSA data set was used as one of the testing sets in the present study.
  • Munsell data set represents measurements of 1625 real Munsell samples; these data were obtained from the RIT website [34]. This data set also represents large color differences (about 10 CIELAB units on average) and was used to test the models’ ability to predict the individual correlates (Value, Chroma, and Hue) of the Munsell color order system.
  • COMBVD ellipses data set is a subset of the COMBVD data set that represents the data in terms of chromatic ellipses (125 color centers). Each ellipse is represented by its semi-major axis (A), semi-minor axis (B), and orientation angle (θ). These ellipses were plotted in different color spaces to visually compare their performance: ideally, all ellipses should be equal-sized circles when plotted in a uniform color space. This data set was used as a reference for perceptual uniformity. Note that this set of ellipses is derived from the COMBVD data and may seem redundant, but it allows uniformity to be divided into two parts, local and global.
  • MacAdam ellipses data set represents MacAdam (observer PGN) just-noticeable-difference (JND) ellipses (25 color centers) [35]. The data have been used as a reference of uniformity while developing spaces like ICTCP and IPT. The data were obtained by one observer using a light stimulus visual colorimeter. The purpose of using the data was to test the uniformity of the models in an expanded color gamut.
  • Hung and Berns [16] conducted an experiment to determine constant perceived hue on a display. The data set used in the current study includes 48 samples for 12 different hues with constant lightness and 4 levels of chroma for each hue. They found that existing color spaces at that time could not predict the iso-hue data, so a new color space was needed for gamut mapping applications. Some color spaces have been developed considering these data as a reference for hue linearity [4, 26, 27]. This data set was used as the reference of constant perceived hue in the current study.
  • Ebner and Fairchild [17] conducted a psychophysical experiment to determine constant hue surfaces in a color space. There are 15 different color tuples with about 21 samples per hue with varying chroma and lightness. These data have previously been used to present the performance of IPT and are used as a test data set in the current study.
  • Xiao et al. [36], conducted an experiment to scale four unique hues (Red, Yellow, Green, and Blue) for color appearance modeling. Nine different combinations of lightness and chroma were used to scale each of the unique hues. In total, 36 samples are included. This data set was also used as a testing data set.
  • Wide-range Lightness data were generated by Fairchild et al. [23], who conducted two different experiments to scale lightness above and below diffuse white (CIE L* = 100). In the Scaling Lightness Experiment 1 (SL1) they used a luminance range from 156 to 3692 cd/m² with Yn = 842 cd/m² (Yn represents the luminance of the reference white), whereas in the Scaling Lightness Experiment 2 (SL2) the luminance range was extended from 0 to 7432 cd/m² with Yn = 997 cd/m². The SL2 data set was used to derive the adapted lightness (Jz) formula of the proposed color space (see later) while the SL1 data set was used as a test data set. Each of the sets includes 19 samples.

Table 1 lists all the experimental data sets (column 1) along with the purpose of their use (column 2), the reference white point (column 3), and the corresponding viewing parameters (used for calculating CIECAM02 and CAT02) including the adapting field luminance (La), taken as 20% (background luminance factor Yb = 20) of the luminance of white (column 4), and the surround conditions (impact of surround (c), chromatic induction factor (Nc), factor for degree of adaptation (F)) (column 5) used to predict that particular data set. The adaptation and surround (average, dim, or dark) parameters and white points used here adequately correspond to the experimental conditions which were used to generate the data sets [18, 23, 36]. Note that only two data sets were used to derive the Jzazbz model; the others are either reference or testing sets.

Table 1. Details of different experimental data sets and corresponding appearance model parameters.

4. Color spaces tested

The five most widely used or best-performing color spaces were tested here, i.e., CIELAB, CIELUV, IPT, CAM16-UCS, and ICTCP. They were all developed with the aim of perceptual uniformity. All of these color spaces have one achromatic component (lightness) and two chromatic components (redness-greenness and yellowness-blueness). CIELAB and CIELUV are the current ISO/CIE standard color spaces recommended to evaluate perceived color differences using their associated Euclidean distances [12]. CIELAB (L*, a*, b*) is recommended for the colorant industries (surface colors) while CIELUV (L*, u*, v*) is recommended for the display (self-luminous colors) industries. IPT (I, P, T), well known for its hue linearity, has been widely used for gamut mapping applications [18]. Another test space is CAM16-UCS (J', aM, bM), which is based on the CAM16 color appearance model [20]. CIECAM02 was recently revised to overcome some mathematical problems in the model and the revised version is CAM16 [20,37]. In this study, CAM16-UCS was selected instead of CAM02-UCS to avoid any computational failures while processing high dynamic range and wide gamut color difference signals. Note that these two spaces are very similar in structure and gave very similar performance. Most of the above mentioned color spaces have been used for standard dynamic range imaging, i.e., up to approximately 500 cd/m².

The fifth test space is ICTCP (I,CT,CP) which is the Dolby proposal for high dynamic range and wide gamut imaging applications [27]. It has already been shown in previous studies [26, 27] that the Dolby model outperformed all state of the art encodings previously proposed by the International Telecommunication Union for HDR and WCG imagery, in terms of hue linearity and JND uniformity. Finally, a new uniform color space, Jzazbz, was developed in the current study for image signals including high dynamic range and wide gamut as well as typical dynamic range. The performance of Jzazbz was compared with the other test spaces based on the prediction of a comprehensive range of experimental data (see Table 1).

5. Development of Jzazbz uniform color space

Different stages of the development of the proposed uniform color space, Jzazbz, are described, and the full model is also given in this section.

5.1 Development of Jzazbz

The Dolby researchers developed the ICTCP color space where the idea was to use a similar structure as IPT but to replace the non-linear function by a perceptual quantizer (PQ) function which can accurately predict a luminance range from 0.001 to 10,000 cd/m² [18, 24, 26, 27]. In our preliminary study [21], the same structure as the Dolby model was used to develop a color space with improved uniformity and this has been extended in the current study. The model under development was initially based on the following three equations:

\[
\begin{bmatrix} L \\ M \\ S \end{bmatrix} =
\begin{bmatrix}
\alpha_{1,1} & \alpha_{1,2} & 1-\alpha_{1,1}-\alpha_{1,2} \\
\alpha_{2,1} & \alpha_{2,2} & 1-\alpha_{2,1}-\alpha_{2,2} \\
\alpha_{3,1} & \alpha_{3,2} & 1-\alpha_{3,1}-\alpha_{3,2}
\end{bmatrix}
\begin{bmatrix} X_{D65} \\ Y_{D65} \\ Z_{D65} \end{bmatrix}
\tag{2}
\]
\[
\{L',M',S'\} = \left( \frac{c_1 + c_2\left(\dfrac{\{L,M,S\}}{10000}\right)^{n}}{1 + c_3\left(\dfrac{\{L,M,S\}}{10000}\right)^{n}} \right)^{p}
\tag{3}
\]
\[
\begin{bmatrix} I_z \\ a_z \\ b_z \end{bmatrix} =
\begin{bmatrix}
\omega_{1,1} & \omega_{1,2} & 1-\omega_{1,1}-\omega_{1,2} \\
\omega_{2,1} & \omega_{2,2} & -\omega_{2,1}-\omega_{2,2} \\
\omega_{3,1} & \omega_{3,2} & -\omega_{3,1}-\omega_{3,2}
\end{bmatrix}
\begin{bmatrix} L' \\ M' \\ S' \end{bmatrix}
\tag{4}
\]
where XD65, YD65, and ZD65 belong to the input CIE XYZ tristimulus space with CIE standard illuminant D65 as white point, and c1 = 3424/2^12, c2 = 2413/2^7, c3 = 2392/2^7, n = 2610/2^14, and p = e × 2523/2^5. The factor e in Eq. (3) and the coefficients (αi,j and ωi,j for all i and j) of the two matrices in Eq. (2) and Eq. (4) are the optimization variables fitted using the training data. Note that there are six degrees of freedom in each of the matrices in Eq. (2) and Eq. (4). The mathematical formulation is similar to that in Froehlich et al. [26].

Equation (2) represents the transformation of CIE XYZ (D65) tristimulus values to cone primaries (L, M, S). Equation (3) represents the perceptual quantizer (PQ) curve, which is a dynamic transform of the cone responses. Finally, Eq. (4) gives the correlates of the opponent color space. These three equations provide a physiologically plausible color space. In the preliminary study [21], the intention was to re-optimize the coefficients (αi,j and ωi,j for all i and j) of both matrices along with the compression factor (p) of the non-linear equation with the aim of increasing perceptual uniformity without increasing computational cost. The color space was optimized using the COMBVD ellipses (125 color centers) data set [15] and the Xiao et al. [36] data set to improve uniformity and hue linearity, respectively. Uniformity (both local and global) was successfully improved compared with ICTCP, and was very similar to that of CAM16-UCS, which was found to be best for perceptual uniformity among the available color spaces but has a very complex structure [21]. However, a large hue shift can be observed in the blue region (SD = 10.9), i.e., the blue hue shifts towards purple with increasing chroma, when the model is tested using the Hung & Berns data [16] plotted in Fig. 1(a). The reason may be that the Xiao et al. data set (used for training [21]) has a smaller color gamut (especially in the blue direction) compared with the Hung & Berns data. Such a large hue shift can be problematic in many applications, especially gamut mapping. It was observed that increasing the value of e beyond 1 (up to a certain value) compresses the space and helps to improve uniformity (especially global uniformity), but as a result the blue hue shifts towards purple at higher chroma levels. It is well known that uniformity and hue linearity adversely affect each other while developing a uniform color space, and in previous work a compromise has always been achieved [4, 26].

Fig. 1 Plots of the Hung & Berns constant hue data; (a) hue linearity using a structure similar to the Dolby model but re-optimizing the coefficients of the matrices in Eq. (2) and Eq. (4), and the power factor in Eq. (3) [21], (b) hue linearity after extending the model using Eq. (5) and re-optimizing with b=1.16, and (c) hue linearity after extending the model using Eq. (6) and re-optimizing with b=1.15 and g=0.66. The solid black lines are drawn based on linear orthogonal fitting whereas the dashed black lines in the blue direction are drawn hypothetically to show the desired linearity.


The main aim of the current study was to minimize the trade-off between uniformity and hue linearity. Note that to improve local and global uniformity, data based on only chromatic differences were used to derive a preliminary model [21] and lightness differences were not considered. Based on the results of the preliminary study [21], it was realized that the color space should be optimized in three dimensions to minimize inter-dependence between the three perceptual attributes lightness, chroma, and hue. Thus three decisions had to be made.

Firstly, instead of using MacAdam ellipses, a three dimensional color difference data set (the COMBVD data set) was used to optimize the color space. As mentioned above, this data set has been considered the most important data set for the derivation of color-difference equations. Most importantly, it has 3813 pairs covering color differences for all directions in color space, i.e., lightness, chroma, hue, and their mixtures. The STRESS between predicted and experimental color differences was used as a measure of the optimization.

Secondly, since optimization of a color space for perceptual uniformity adversely affects hue linearity (particularly the blue hue, which deviates towards purple), an engineering technique was used to minimize this tradeoff. The input XD65 was pre-adjusted with respect to ZD65 (which corresponds to the response of the blue cone) to remove the deviation in the blue hue prior to optimizing the model to improve the perceptual uniformity. The aim was to make sure that even after the optimization for perceptual uniformity the deviation in the blue hue remained in a plausible range (or is represented by an acceptably small curvature). A similar technique was used by Cui et al. [38], who implemented the original idea proposed by Kuehni [39] to improve the performance of the DIN99 color difference formula for blue chromatic differences and proposed two uniform color spaces, DIN99c and DIN99d. The linear equation used for the adjustment of XD65 is given below.

\[
X'_{D65} = b\,X_{D65} - (b-1)\,Z_{D65}
\tag{5}
\]
where the variables XD65 and ZD65 belong to the input CIE XYZ tristimulus space and X'D65 replaces XD65 in Eq. (2). The variable b in Eq. (5) was optimized along with the other optimization variables by minimizing the STRESS between the experimental color difference and its prediction using the Euclidean distance between perceptual correlates of the color space under development. The value of (b − 1) was kept as small as possible, because it adversely affects uniformity, while achieving the desired goal of hue linearity in all directions, especially in the blue region. It was found that the adjustment of XD65 provides robustness against the shift in the blue hue at the cost of losing some uniformity in the green-blue region. The Hung & Berns iso-hue data are plotted in Fig. 1(b) after extension of the model using Eq. (5).

To achieve uniformity similar to that obtained without the use of Eq. (5), particularly in the green-blue region which was most affected, Eq. (5) was extended to also adjust the input YD65 with respect to the input XD65; the modified equation is given below.

\[
\begin{bmatrix} X'_{D65} \\ Y'_{D65} \end{bmatrix} =
\begin{bmatrix} b\,X_{D65} \\ g\,Y_{D65} \end{bmatrix} -
\begin{bmatrix} (b-1)\,Z_{D65} \\ (g-1)\,X_{D65} \end{bmatrix}
\tag{6}
\]
Then a final optimization was performed to achieve maximum uniformity while preserving hue linearity, following the iterative process described below (a computational sketch of the corresponding optimization objective is given after the list).

  • 1. Start with b = 1 + δ (δ is an arbitrarily small number), g = 1, and the other variables in Eq. (2) to Eq. (4) the same as for ICTCP [27].
  • 2. Optimize the variables g, e, and the coefficients (αi,j and ωi,j for all i and j), using COMBVD as the training data.
  • 3. Examine the performance visually and quantitatively to make the trade-off between uniformity (considering the COMBVD ellipses as reference) and the hue linearity (considering the Hung & Berns data as reference).
  • 4. End the optimization if the hue linearity (particularly the deviation in the blue hue) is in the plausible range (i.e., similar to that of IPT) and maximum possible uniformity is achieved. Otherwise, increase the value of δ and go to step-2, and continue until a minimum trade-off is achieved between hue linearity and uniformity.
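To make step 2 concrete, the sketch below shows one way the inner optimization could be set up. This is our illustrative reconstruction, not the authors' optimization code: the COMBVD pairs are replaced by random placeholder arrays that would have to be substituted with the real data, the starting guesses are arbitrary, and the optimizer settings are not those used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

C1, C2, C3 = 3424 / 2**12, 2413 / 2**7, 2392 / 2**7
N_EXP = 2610 / 2**14

def forward(xyz, b, g, e, alpha, omega):
    """Parameterized forward transform of Eqs. (2)-(4) with the pre-adjustment of
    Eq. (6). xyz is an (N, 3) array of absolute D65 tristimulus values; alpha and
    omega hold the 2 free coefficients of each matrix row (6 degrees of freedom each)."""
    x, y, z = xyz.T
    xp = b * x - (b - 1.0) * z                       # Eq. (6)
    yp = g * y - (g - 1.0) * x
    # Third column follows from the row-sum constraints (rows of Eq. (2) sum to 1;
    # rows of Eq. (4) sum to 1, 0, 0).
    a_mat = np.column_stack([alpha, 1.0 - alpha.sum(axis=1)])
    w_mat = np.column_stack([omega, np.array([1.0, 0.0, 0.0]) - omega.sum(axis=1)])
    lms = np.stack([xp, yp, z], axis=1) @ a_mat.T    # Eq. (2)
    lms = np.maximum(lms, 0.0) / 10000.0             # guard against negative cone signals
    p = e * 2523 / 2**5
    lms_pq = ((C1 + C2 * lms**N_EXP) / (1.0 + C3 * lms**N_EXP)) ** p   # Eq. (3)
    return lms_pq @ w_mat.T                          # (Iz, az, bz), Eq. (4)

def stress(de, dv):
    f = np.sum(de * dv) / np.sum(dv**2)
    return 100.0 * np.sqrt(np.sum((de - f * dv)**2) / np.sum((f * dv)**2))

def objective(theta, xyz1, xyz2, dv, b):
    g, e = theta[0], theta[1]
    alpha = theta[2:8].reshape(3, 2)
    omega = theta[8:14].reshape(3, 2)
    de = np.linalg.norm(forward(xyz1, b, g, e, alpha, omega)
                        - forward(xyz2, b, g, e, alpha, omega), axis=1)
    return stress(de, dv)                            # STRESS against visual differences

# Placeholder stand-ins for the COMBVD pairs and their visual differences.
rng = np.random.default_rng(0)
xyz1 = rng.uniform(5.0, 100.0, size=(50, 3))
xyz2 = xyz1 + rng.normal(0.0, 2.0, size=(50, 3))
dv = rng.uniform(0.5, 5.0, size=50)

theta0 = np.concatenate([[1.0, 1.0],                              # g, e
                         [0.41, 0.58, -0.20, 1.12, -0.02, 0.26],  # illustrative alpha guess
                         [0.5, 0.5, 3.52, -4.07, 0.20, 1.10]])    # illustrative omega guess
res = minimize(objective, theta0, args=(xyz1, xyz2, dv, 1.0 + 0.15),  # b = 1 + delta
               method="Nelder-Mead", options={"maxiter": 2000})
```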

The X'D65 and Y'D65 values obtained from Eq. (6) replace the input values XD65 and YD65 in Eq. (2), while ZD65 remains unaltered. The Hung & Berns constant hue data are plotted in Fig. 1(c) after extension of the model using Eq. (6) and the iterative optimization of the model. Comparing Fig. 1(c) with Fig. 1(b), it can be observed that the implementation of Eq. (6) to improve uniformity (see later results) also improved the overall hue linearity that was achieved using Eq. (5). The average SD values for the three versions plotted in Fig. 1(a-c) are 4.8, 3.7, and 2.7, respectively, while the SD values for the blue tuple are 10.9, 2.9, and 3.1, respectively.

Thirdly, a further aim was to accurately predict perceptual lightness in highlights as well as in typical dynamic range applications. Another simple equation was derived to tune the current lightness correlate, Jz, to predict the experimental lightness in high dynamic range without affecting the performance in predicting the COMBVD data set, for which Iz was optimized. An equation similar to the lightness formula of CAM02-UCS [19] was derived for Jz and is given below.

\[
J_z = \frac{(1+d)\,I_z}{1 + d\,I_z}
\tag{7}
\]
The factor d was optimized using the experimental lightness data (SL2). Note that this optimization did not impact the performance in predicting the COMBVD data but significantly improved the prediction of the wide-range lightness data (see later results). The variable Jz represents the perceptual lightness correlate of the current uniform color space named Jzazbz. While testing the validity of the color space for a range of input values, it was found that a small value (d0 = 1.62954995328214×10⁻¹¹) should be subtracted from Jz in Eq. (7) to account for the offset caused in lightness for the input [0, 0, 0]. The factor d0 is important particularly in the reverse model to avoid computational failure, i.e., the occurrence of complex numbers for Jz < d0. Note that using such a precise value of d0 does not mean that the value of Jz needs to be given with such high precision. The full model of the proposed color space is given in the following subsection. MATLAB code is given in the Code 1 file (Ref [40]).

5.2 Full model of Jzazbz

The inputs to the model are the absolute values of the CIE XYZ tristimulus values (with reference to CIE standard illuminant D65 and CIE 1931 standard colorimetric observer). Outputs of the model will be an achromatic component (lightness (Jz)) and two opponent color components (redness-greenness (az) and yellowness-blueness (bz)) of the Jzazbz uniform color space. The following five equations represent the forward transformation from CIE XYZ to Jzazbz color space.

\[
\begin{bmatrix} X'_{D65} \\ Y'_{D65} \end{bmatrix} =
\begin{bmatrix} b\,X_{D65} \\ g\,Y_{D65} \end{bmatrix} -
\begin{bmatrix} (b-1)\,Z_{D65} \\ (g-1)\,X_{D65} \end{bmatrix}
\tag{8}
\]
\[
\begin{bmatrix} L \\ M \\ S \end{bmatrix} =
\begin{bmatrix}
0.41478972 & 0.579999 & 0.0146480 \\
-0.2015100 & 1.120649 & 0.0531008 \\
-0.0166008 & 0.264800 & 0.6684799
\end{bmatrix}
\begin{bmatrix} X'_{D65} \\ Y'_{D65} \\ Z_{D65} \end{bmatrix}
\tag{9}
\]
\[
\{L',M',S'\} = \left( \frac{c_1 + c_2\left(\dfrac{\{L,M,S\}}{10000}\right)^{n}}{1 + c_3\left(\dfrac{\{L,M,S\}}{10000}\right)^{n}} \right)^{p}
\tag{10}
\]
\[
\begin{bmatrix} I_z \\ a_z \\ b_z \end{bmatrix} =
\begin{bmatrix}
0.5 & 0.5 & 0 \\
3.524000 & -4.066708 & 0.542708 \\
0.199076 & 1.096799 & -1.295875
\end{bmatrix}
\begin{bmatrix} L' \\ M' \\ S' \end{bmatrix}
\tag{11}
\]
\[
J_z = \frac{(1+d)\,I_z}{1 + d\,I_z} - d_0
\tag{12}
\]
where b = 1.15, g = 0.66, c1 = 3424/2^12, c2 = 2413/2^7, c3 = 2392/2^7, n = 2610/2^14, p = 1.7×2523/2^5, d = −0.56, and d0 = 1.6295499532821566×10⁻¹¹.
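A compact Python/NumPy sketch of Eqs. (8)–(12) follows; it mirrors the constants above, but it is our illustration rather than the authors' reference implementation (the official MATLAB code is provided as Code 1 [40]), so any discrepancy should be resolved in favor of the equations and the MATLAB code.

```python
import numpy as np

# Constants of Eqs. (8)-(12)
B, G, D, D0 = 1.15, 0.66, -0.56, 1.6295499532821566e-11
C1, C2, C3 = 3424 / 2**12, 2413 / 2**7, 2392 / 2**7
N = 2610 / 2**14
P = 1.7 * 2523 / 2**5

M_XYZ_TO_LMS = np.array([[ 0.41478972, 0.579999, 0.0146480],
                         [-0.20151000, 1.120649, 0.0531008],
                         [-0.01660080, 0.264800, 0.6684799]])
M_LMS_TO_IAB = np.array([[0.5,       0.5,       0.0     ],
                         [3.524000, -4.066708,  0.542708],
                         [0.199076,  1.096799, -1.295875]])

def xyz_to_jzazbz(xyz_d65):
    """Forward model: absolute CIE XYZ (D65 white point, CIE 1931 observer),
    shape (..., 3), in cd/m^2 -> (Jz, az, bz)."""
    xyz = np.asarray(xyz_d65, dtype=float)
    x, y, z = xyz[..., 0], xyz[..., 1], xyz[..., 2]
    xp = B * x - (B - 1.0) * z                                  # Eq. (8)
    yp = G * y - (G - 1.0) * x
    lms = np.stack([xp, yp, z], axis=-1) @ M_XYZ_TO_LMS.T       # Eq. (9)
    lms = np.maximum(lms, 0.0)        # illustrative guard for out-of-gamut inputs
    t = (lms / 10000.0) ** N
    lms_pq = ((C1 + C2 * t) / (1.0 + C3 * t)) ** P              # Eq. (10)
    iab = lms_pq @ M_LMS_TO_IAB.T                               # Eq. (11)
    iz = iab[..., 0]
    jz = (1.0 + D) * iz / (1.0 + D * iz) - D0                   # Eq. (12)
    return np.stack([jz, iab[..., 1], iab[..., 2]], axis=-1)
```

As a quick sanity check, xyz_to_jzazbz([0.0, 0.0, 0.0]) returns a Jz that is essentially zero (of the order of floating-point error), which is exactly what the d0 offset is designed to ensure.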

It is immediately apparent that the proposed color space is invertible and the inverse model is given in Appendix A. The formulas to compute chroma, hue angle, and perceptual color difference in the current uniform color space are given in Eq. (13), Eq. (14), and Eq. (15), respectively.

\[
C_z = \sqrt{a_z^{\,2} + b_z^{\,2}}
\tag{13}
\]
\[
h_z = \arctan\!\left(\frac{b_z}{a_z}\right)
\tag{14}
\]
\[
\Delta E_z = \sqrt{(\Delta J_z)^2 + (\Delta C_z)^2 + (\Delta H_z)^2}
\tag{15}
\]
where \(\Delta H_z = 2\sqrt{C_{z,1}\,C_{z,2}}\,\sin\!\left(\dfrac{\Delta h_z}{2}\right)\).
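The following short sketch (ours; the function name is hypothetical) evaluates Eqs. (13)–(15) for a pair of Jzazbz triplets; np.arctan2 is used for Eq. (14) so that the hue angle lands in the correct quadrant.

```python
import numpy as np

def delta_ez(jab_1, jab_2):
    """Color difference of Eq. (15) between two (Jz, az, bz) triplets.
    Hue angles (Eq. (14)) are handled internally in radians."""
    j1, a1, b1 = jab_1
    j2, a2, b2 = jab_2
    c1, c2 = np.hypot(a1, b1), np.hypot(a2, b2)            # Eq. (13)
    h1, h2 = np.arctan2(b1, a1), np.arctan2(b2, a2)        # Eq. (14)
    d_hz = 2.0 * np.sqrt(c1 * c2) * np.sin((h1 - h2) / 2.0)
    return np.sqrt((j1 - j2)**2 + (c1 - c2)**2 + d_hz**2)  # Eq. (15)
```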

6. Results and discussions

A uniform color space named Jzazbz has been developed. The performance of the proposed color space was tested using a wide range of experimental data and it was compared with the five other models tested in the current study. Table 2 summarizes the prediction performance of each color space in terms of STRESS (0–100). The minimum value of the STRESS in each column is marked in bold and underlined while the second best is marked in bold only.

Table 2. Results for test color spaces to predict different experimental data sets presented in STRESS units.

The perceptual color difference data were predicted using the Euclidean distance of perceptual correlates in each color space and the STRESS was computed between the experimental and the predicted color differences. The results in Table 2 show that CAM16-UCS was best at predicting small color difference data, while Jzazbz was second. When predicting large color difference data (e.g., OSA), Jzazbz and CAM16-UCS performed best followed by IPT. CIELUV and ICTCP gave the worst performance on average in predicting both the COMBVD and the OSA data sets representing small and large color differences, respectively. That the current model does not perform better than CAM16-UCS in predicting COMBVD is due to the compromise to accommodate hue linearity as discussed earlier.

To test perceptual uniformity, two different data sets (COMBVD ellipses and MacAdam ellipses) were used. The results in Table 2 show that, in predicting the COMBVD ellipse data for both local and global uniformity, CAM16-UCS performed the best followed by Jzazbz. Both CIELUV and ICTCP again gave poor performance. These results were expected and agree with those for the prediction of the COMBVD color differences. The plots of the COMBVD ellipse data are shown in Fig. 2 for visual comparison. Note that the yellowness-blueness component of ICTCP is plotted inverted to aid the comparison with the other color spaces and is labeled as −CT in all the following figures in this paper. Inspection of Fig. 2 shows that the chromaticity ellipses plotted in CIELAB, CIELUV, IPT, and ICTCP are more elongated and irregular in size compared with those in CAM16-UCS and Jzazbz.

Fig. 2 The COMBVD ellipses data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz.


When predicting the MacAdam ellipse data, the current Jzazbz gave the best performance for both local and global uniformity, followed by IPT for local and ICTCP for global uniformity. CIELAB performed worst overall in predicting the MacAdam ellipse data. The results for the MacAdam ellipses and COMBVD ellipses showed slightly different trends, which may be because the former data set covers a relatively wider gamut. This could also mean that Jzazbz can be used to uniformly encode wide gamut image signals. The MacAdam ellipses are plotted in the color spaces proposed for HDR and WCG applications (ICTCP and Jzazbz) along with the Rec.2020 gamut hull (with peak luminance 1000 cd/m²) in the background in Fig. 3. Inspection of Fig. 3 shows that the chromaticity ellipses in the ICTCP space are more irregular (especially in the green region) compared with Jzazbz.

Fig. 3 The MacAdam ellipses (amplified 10 times) plotted in: (a) ICTCP, and (b) the proposed Jzazbz. The colored background represents the Rec.2020 gamut hull with a peak luminance of 1000 cd/m² and a white point corresponding to CIE standard illuminant D65.


The performance of the lightness correlates of all the test color spaces was investigated to predict the wide-range lightness data (SL1 and SL2), i.e., including color samples with higher luminance than the diffuse white. The SL2 data were also used to derive the lightness formula of Jzazbz. The results in Table 2 show that CIE L* best predicted the wide-range lightness data followed by Jz of Jzazbz and I of IPT. ICTCP gave the worst performance in predicting the wide-range lightness data. CAM16-UCS, which showed promising performance for uniformity, gave a reasonable prediction of the wide-range lightness data. Predictions of SL1 and SL2 using different lightness predictors are plotted in Fig. 4(a) and Fig. 4(b), respectively.

Fig. 4 Plots of lightness predictors; (a) Prediction of SL1 data by different lightness predictors, (b) Prediction of SL2 data by different lightness predictors, (c) Different lightness predictors plotted against CIE L*, and (d) Prediction of Munsell Value by different lightness predictors. All the lightness predictors are scaled to the range of CIE L*.


The behavior of the current lightness predictor (obtained by varying d) and that of the other lightness predictors was compared with CIE L*. Figure 4(c) shows the different lightness predictors plotted against CIE L* considering a luminance range of 0 to 10,000 cd/m² and the chromaticity of CIE standard illuminant D65. Figure 4(c) shows that the behavior of the current Jz is very similar to CIE L* and that Jz can also be tuned to the lightness predictors of different color spaces simply by varying the value of d, i.e., Jz can be approximately tuned to the lightness predictors of CAM16-UCS, IPT, and ICTCP by setting d equal to 0, 0.64, and 1.1, respectively.

The three perceptual correlates (lightness, chroma, and hue) of the test color spaces were then tested individually to predict Munsell Value, Chroma, and Hue. STRESS was computed between the Munsell perceptual correlates and the corresponding predictions of the different test spaces. Note that the Munsell experiment was conducted to scale each correlate individually, so there is no need to combine them together. A set of tristimulus values (CIE XYZ) was obtained using the chromaticity of CIE standard illuminant D65 and by transforming the Munsell Value to luminance (Y) using the fitted quintic parabola given in Eq. (16) [41]. The data were used to predict Munsell Value using the different lightness predictors. Prediction results of lightness are plotted against normalized luminance as shown in Fig. 4(d) and the corresponding STRESS values are given in Table 2. The results showed that CIE L* and Jzazbz gave the best performance followed by CAM16-UCS and IPT in predicting Munsell Value. Again, the lightness correlate of ICTCP performed worst; it did not perform well in predicting either the wide-range lightness data or Munsell Value.

\[
Y = 1.2219\,V - 0.23111\,V^2 + 0.23951\,V^3 - 0.021009\,V^4 + 0.0008404\,V^5
\tag{16}
\]
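A two-line evaluation of Eq. (16) is given below (illustrative; the function name is ours).

```python
import numpy as np

def munsell_value_to_y(v):
    """Eq. (16): Munsell Value -> luminance factor Y (Newhall et al. [41])."""
    v = np.asarray(v, dtype=float)
    return 1.2219*v - 0.23111*v**2 + 0.23951*v**3 - 0.021009*v**4 + 0.0008404*v**5
```

For example, munsell_value_to_y(5) evaluates to about 19.8, the familiar luminance factor of a Munsell Value 5 grey.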

Munsell Chroma and Hue were predicted using the data of real samples of the Munsell color order system measured at RIT [34]. CIE Cab* performed best, followed by the current chroma predictor (Cz), in predicting perceived Munsell Chroma. Finally, the current hue predictor (hz) gave the third best performance, following the best, CAM16-UCS, and the second best, CIELAB, in predicting perceived Munsell Hue. These results are as expected because the Munsell data were used to train both CIELAB and CAM16-UCS but not Jzazbz. The quantitative results for the prediction of perceived Munsell Chroma and Hue are given in Table 2. The results showed that ICTCP performed worst overall in predicting the perceptual correlates of the Munsell color system. The prediction results for Munsell Chroma and Hue at fixed Value (V = 5) are plotted in different color spaces for visual comparison in Appendix B.1. More irregularities can again be observed in the green-blue region of ICTCP (see Appendix B.1), as was mentioned earlier for the MacAdam ellipses data shown in Fig. 3(a).

Another important property of a color space is hue linearity. All the color spaces were also tested for prediction of three different iso-hue data sets, including Hung & Berns [16], Ebner & Fairchild [17], and Xiao et al. [36], as well as the Munsell data. The quantitative results for hue linearity in terms of the mean SD of the hue angle are given in Table 3. Results for the non-linearity of the blue hue (which is known to be poorly fitted by many color spaces) for the Hung & Berns data are also given in Table 3. The Hung & Berns data set, which has had wide acceptance [4, 18, 26, 27], was also used as a reference for hue linearity while developing the current uniform color space. The results based on the Hung & Berns data showed that the two CIE uniform color spaces (CIELAB and CIELUV) and CAM16-UCS (which gave the best performance in predicting color difference data) have very large hue shifts in the blue region with SD values of 13.2, 6.8, and 9.9, respectively. These hue shifts can also be observed in Fig. 5 where the Hung & Berns data are plotted in the six test spaces. Hue non-linear color spaces are not suitable for applications where hue linearity is important, e.g., gamut mapping, image enhancement, etc. The IPT color space gave the best performance, followed by Jzazbz and ICTCP, for the Hung & Berns data. For prediction of the constant blue hue data, which is most critical, Jzazbz outperformed CIELAB, CIELUV, and CAM16-UCS. The Ebner & Fairchild data are also plotted in different color spaces for visual comparison in Appendix B.2. For the Ebner & Fairchild data, large hue shifts can again be observed in the blue hues for CIELAB and CAM16-UCS.

Table 3. Test performance of color spaces for grey convergence and hue linearity based on three different data sets. Values of standard deviation (SD) are in degrees and those of chroma-ratio are in percent.

Fig. 5 The Hung & Berns data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) proposed Jzazbz. The solid black lines are drawn based on linear orthogonal fitting whereas dashed black lines in the blue direction are drawn hypothetically to show the ideal case. The data points are color coded using corresponding sRGB primaries.


When predicting the Xiao et al. data [36], CIELUV performed best overall followed by CIELAB (see Table 3). It was found that the Xiao et al. data set disagrees with the other iso-hue data sets. The Xiao et al. data set is plotted for visual comparison in different color spaces along with orthogonal fitting lines in Appendix B.3. It can also be observed that the unique hue lines do not converge to a single point when plotted in CAM16-UCS and ICTCP. The overall performance of Jzazbz for the four hue linearity data sets indicates that although it does not always perform the best, it markedly outperformed CAM16-UCS and is not far from the overall best (IPT).

The grey-scale convergence of test spaces was also examined by using the chroma-ratio metric which measures percent inter-dependence between luminance and chroma. Results for grey-scale convergence are given in Table 3. CIELAB and CIELUV have zero inter-dependence between luminance and chrominance, which is expected because CIE L* is a function of the luminance channel only. Luminance constancy of IPT, ICTCP and Jzazbz is also in a plausible range (chroma-ratio near to zero) but that of CAM16-UCS is the worst and may not be acceptable in some imaging applications.

It is encouraging that the current Jzazbz model gave better overall performance than the other spaces in predicting the experimental data in all three dimensions, including wide-range luminance and wide color gamut data.

The CIE 1931 chromaticity scale and the Rec.2020 gamut hull with a CIE D65 white point are plotted in Jzazbz as shown in Appendix B.4. From the plots of the Rec.2020 gamut hull (with a luminance of 10,000 cd/m²) in three different planes, it can be seen that the lightness predictor of Jzazbz ranges from 0 to 1 whereas its chromatic correlates range between −0.5 and 0.5. The computational complexity of Jzazbz is much less than that of CAM16-UCS and slightly greater than that of the other test spaces.

Note that two data sets (COMBVD ellipses, MacAdam ellipses) were used for testing the uniformity. The 25 ellipses in the MacAdam data have constant luminance (48 cd/m²), while luminance values for the COMBVD ellipses range from 3 to 78 cd/m². Further, one of the three iso-hue data sets (the Hung & Berns data set) has constant luminance while the other two (the Ebner & Fairchild data set, and the Xiao et al. data set) have varying luminance. So the uniformity and hue linearity were tested using both constant luminance and varying luminance data sets. It is noted that ellipse data and iso-hue data considering colors with lightness L* > 100 were not available to the authors, and such data need to be generated in future experiments for further testing of the test spaces. Also note that the wide-range lightness data (SL1 and SL2) are based on achromatic (reflective or self-luminous) color samples and such data based on chromatic colors should also be generated in future experiments to test color spaces for chromatic self-luminous colors with lightness L* > 100.

7. Conclusions

A simple color space, Jzazbz, is proposed for color and imaging applications that include wide color gamut and high dynamic range. The space was tested by a number of different criteria using comprehensive sets of data. The performance of Jzazbz was compared with that of the five other selected color spaces, including those developed for typical dynamic range, and ICTCP which was developed for high dynamic range and wide gamut imaging applications. The results showed that Jzazbz gave the second best (CAM16-UCS was the best) performance for small color difference data sets and the best for experimental data corresponding to large color differences. The proposed color space gives the most accurate predictions for the MacAdam, the Munsell Value, and wide-range lightness data sets. It also gave reasonably accurate prediction of the hue linearity data sets. Considering its overall performance, it can be used with confidence for all imaging applications.

Future work includes further testing of color spaces by producing color difference and lightness difference data based on chromatic self-luminous colors with lightness L* > 100. The proposed color space should also be tested by applying it in different image processing applications. Development of a color appearance model based on the new uniform color space, Jzazbz, in order to predict color appearance attributes considering adaptation and surround conditions, is also envisaged as future work.

Appendix A

The reverse model of the proposed color space is given here. The three components (lightness (Jz), redness-greenness (az), and yellowness-blueness (bz)) of the Jzazbz uniform color space are the inputs to the reverse model. The output of the reverse model will be absolute CIE tristimulus values (XD65, YD65, ZD65) relative to the white point of CIE standard illuminant D65 and the CIE 1931 standard colorimetric observer. The transformation equations are given below.

\[
I_z = \frac{J_z + d_0}{1 + d - d\,(J_z + d_0)}
\]
\[
\begin{bmatrix} L' \\ M' \\ S' \end{bmatrix} =
\begin{bmatrix}
0.5 & 0.5 & 0 \\
3.524000 & -4.066708 & 0.542708 \\
0.199076 & 1.096799 & -1.295875
\end{bmatrix}^{-1}
\begin{bmatrix} I_z \\ a_z \\ b_z \end{bmatrix}
\]
\[
\{L,M,S\} = 10000 \times \left( \frac{c_1 - \{L',M',S'\}^{1/p}}{c_3\,\{L',M',S'\}^{1/p} - c_2} \right)^{1/n}
\]
\[
\begin{bmatrix} X'_{D65} \\ Y'_{D65} \\ Z'_{D65} \end{bmatrix} =
\begin{bmatrix}
0.41478972 & 0.579999 & 0.0146480 \\
-0.2015100 & 1.120649 & 0.0531008 \\
-0.0166008 & 0.264800 & 0.6684799
\end{bmatrix}^{-1}
\begin{bmatrix} L \\ M \\ S \end{bmatrix}
\]
\[
X_{D65} = \frac{X'_{D65} + (b-1)\,Z'_{D65}}{b}
\]
\[
Y_{D65} = \frac{Y'_{D65} + (g-1)\,X_{D65}}{g}
\]
\[
Z_{D65} = Z'_{D65}
\]
where b = 1.15, g = 0.66, c1 = 3424/2^12, c2 = 2413/2^7, c3 = 2392/2^7, n = 2610/2^14, p = 1.7×2523/2^5, d = −0.56, and d0 = 1.6295499532821566×10⁻¹¹.
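As with the forward model, the Python sketch below is our illustration of the reverse transformation (the authors' implementation is the MATLAB Code 1 [40]); the matrix inverses are computed numerically rather than written out. Applying it to the output of the forward transformation should recover the input XYZ values to within floating-point error.

```python
import numpy as np

B, G, D, D0 = 1.15, 0.66, -0.56, 1.6295499532821566e-11
C1, C2, C3 = 3424 / 2**12, 2413 / 2**7, 2392 / 2**7
N = 2610 / 2**14
P = 1.7 * 2523 / 2**5

M_XYZ_TO_LMS = np.array([[ 0.41478972, 0.579999, 0.0146480],
                         [-0.20151000, 1.120649, 0.0531008],
                         [-0.01660080, 0.264800, 0.6684799]])
M_LMS_TO_IAB = np.array([[0.5,       0.5,       0.0     ],
                         [3.524000, -4.066708,  0.542708],
                         [0.199076,  1.096799, -1.295875]])

def jzazbz_to_xyz(jab):
    """Reverse model: (Jz, az, bz), shape (..., 3) -> absolute CIE XYZ
    (D65 white point, CIE 1931 observer) in cd/m^2."""
    jab = np.asarray(jab, dtype=float)
    jz, az, bz = jab[..., 0], jab[..., 1], jab[..., 2]
    iz = (jz + D0) / (1.0 + D - D * (jz + D0))
    lms_pq = np.stack([iz, az, bz], axis=-1) @ np.linalg.inv(M_LMS_TO_IAB).T
    e = lms_pq ** (1.0 / P)
    lms = 10000.0 * ((C1 - e) / (C3 * e - C2)) ** (1.0 / N)
    xyz_p = lms @ np.linalg.inv(M_XYZ_TO_LMS).T
    xp, yp, zp = xyz_p[..., 0], xyz_p[..., 1], xyz_p[..., 2]
    x = (xp + (B - 1.0) * zp) / B          # Z' equals Z
    y = (yp + (G - 1.0) * x) / G           # uses the recovered X, as in the equations
    return np.stack([x, y, zp], axis=-1)
```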

Appendix B.1

The Munsell data (V=5) [34] are plotted in six different color spaces in this Appendix (see Fig. 6).

Fig. 6 Prediction results of the Munsell data [34] for fixed Value (V = 5) and varying Chroma and Hue; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The data points are color coded using the corresponding sRGB primaries.


Appendix B.2

The Ebner & Fairchild [17] data are plotted in six different color spaces in this Appendix (see Fig. 7).

Fig. 7 The Ebner & Fairchild [17] data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The solid black lines are drawn based on linear orthogonal fitting. The data points are color coded using the corresponding chromaticity and constant lightness (L* = 60).


Appendix B.3

The Xiao et al. [36] data are plotted in six different color spaces in this Appendix (see Fig. 8).

Fig. 8 The Xiao et al. [36] data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The red, green, and blue color codes represent the corresponding unique hues while black represents unique yellow. The solid lines are drawn based on linear orthogonal fitting. The symbol × represents the corresponding white point.


Appendix B.4

The CIE 1931 chromaticity scale and Rec.2020 gamut hull are plotted in Jzazbz space (see Fig. 9).

Fig. 9 (a) The CIE 1931 chromaticity scale plotted in Jzazbz. The Rec.2020 gamut hull (Y = 10,000 cd/m²) is plotted in Jzazbz with the az–bz, az–Jz, and bz–Jz planes shown in (b), (c), and (d), respectively. The white point corresponds to CIE standard illuminant D65.


Funding

Huawei Technologies Co., Ltd., Shanghai, China (reference number: KH20161715) and International Color Consortium (ICC) color management research fund 2016 awarded to the first author (reference number: 2016080100017653).

References and links

1. M. Nilsson, “Ultra high definition video formats and standardization,” (BT Media and Broadcast Research Paper, Version 1.0, 2015).

2. S. Daly, T. Kunkel, X. Sun, S. Farrell, and P. Crum, “41.1: Distinguished paper: Viewer preferences for shadow, diffuse, specular, and emissive luminance limits of high dynamic range displays,” SID Symposium Digest of Technical Papers 44 (1), 563–566 (2013). [CrossRef]  

3. M. R. Pointer, “The gamut of real surface colors,” Color Res. Appl. 5(3), 145–155 (1980). [CrossRef]  

4. I. Lissner and P. Urban, “Toward a unified color space for perception-based image processing,” IEEE Trans. Image Process. 21(3), 1153–1168 (2012). [CrossRef]   [PubMed]  

5. M. Safdar, M. R. Luo, and X. Liu, “Performance comparison of JPEG, JPEG 2000, and newly developed CSI-JPEG by adopting different color models,” Color Res. Appl. 42(4), 460–473 (2017).

6. G. M. Johnson and M. D. Fairchild, “A top down description of S-CIELAB and CIEDE2000,” Color Res. Appl. 28(6), 425–435 (2003). [CrossRef]  

7. K. J. Leeming and P. Green, “Selecting significant colors from a complex image for image quality modeling,” Proc. SPIE 6059, 605907 (2006). [CrossRef]  

8. J. Kuang, G. M. Johnson, and M. D. Fairchild, “iCAM06: A refined image appearance model for HDR image rendering,” J. Vis. Commun. Image R. 18(5), 406–414 (2007). [CrossRef]  

9. Y. J. Kim and S. Park, “CIECAM02-UCS based evaluation of colorimetric characterization modeling for a liquid crystal display using a digital still camera,” Opt. Rev. 17(3), 152–158 (2010). [CrossRef]  

10. A. David, P. T. Fini, K. W. Houser, Y. Ohno, M. P. Royer, K. A. G. Smet, M. Wei, and L. Whitehead, “Development of the IES method for evaluating the color rendition of light sources,” Opt. Express 23(12), 15888–15906 (2015). [CrossRef]   [PubMed]  

11. Commission Internationale de l’Éclairage (CIE), International Lighting Vocabulary, http://eilv.cie.co.at/term/1369

12. Commission Internationale de l’Éclairage (CIE), Colorimetry, CIE Publication No. 15 (CIE Central Bureau, Vienna, Austria, 2004).

13. British Standards Institution (BSI), Method for calculation of small colour differences, BSI Tech. Rep. BS 6923 (BSI, 1988).

14. Commission Internationale de l’Éclairage (CIE), Industrial colour-difference evaluation, CIE Publication No. 116 (CIE Central Bureau, Vienna, Austria, 1995).

15. M. R. Luo, G. Cui, and B. Rigg, “The development of the CIE 2000 colour-difference formula: CIEDE2000,” Color Res. Appl. 26(4), 340–350 (2001). [CrossRef]  

16. P. C. Hung and R. S. Berns, “Determination of constant Hue Loci for a CRT gamut and their predictions using color appearance spaces,” Color Res. Appl. 20(5), 285–295 (1995). [CrossRef]  

17. F. Ebner and M. D. Fairchild, “Finding constant hue surfaces in color space,” Proc. SPIE 3300, 107–117 (1998). [CrossRef]  

18. F. Ebner and M. D. Fairchild, “Development and testing of a color space (IPT) with improved hue uniformity,” in Proceedings of the 6th Color and Imaging Conference, (Society for Imaging Science and Technology, 1998), pp. 8–13.

19. M. R. Luo, G. Cui, and C. Li, “Uniform colour spaces based on CIECAM02 colour appearance model,” Color Res. Appl. 31(4), 320–330 (2006). [CrossRef]  

20. C. J. Li, Z. Li, Z. Wang, Y. Xu, M. R. Luo, G. Cui, M. Melgosa, and M. R. Pointer, “A revision of CIECAM02 and its CAT and UCS,” in Proceedings of the 24th Color and Imaging Conference, (Society for Imaging Science and Technology, 2016), pp. 208–212.

21. M. Safdar, M. R. Luo, and G. Cui, “Investigating performance of uniform color spaces for high dynamic range and wide gamut color difference applications,” in Proceedings of the 24th Color and Imaging Conference, (Society for Imaging Science and Technology, 2016), pp. 88–93.

22. M. D. Fairchild and D. R. Wyble, “hdr-CIELAB and hdr-IPT: Simple models for describing the color of high-dynamic-range and wide-color-gamut images,” in Proceedings of the 18th Color and Imaging Conference, (Society for Imaging Science and Technology, 2010), pp. 322–326.

23. M. D. Fairchild and P. H. Chen, “Brightness, lightness, and specifying color in high-dynamic-range scenes and images,” Proc. SPIE 7867, 78670O (2011). [CrossRef]  

24. Society of Motion Picture & Television Engineers, “High dynamic range electro-optical transfer function of mastering reference displays,” Society of Motion Picture & Television Engineers (SMPTE), ST 2084, 1–14 (2014).

25. P. G. J. Barten, “Formula for the contrast sensitivity of the human eye,” Proc. SPIE 5294, 231–238 (2004). [CrossRef]  

26. J. Froehlich, T. Kunkel, R. Atkins, J. Pytlarz, S. Daly, A. Schilling, and B. Eberhardt, “Encoding Color Difference Signals for High Dynamic Range and Wide Gamut Imagery,” in Proceedings of the 23rd Color and Imaging Conference, (Society for Imaging Science and Technology, 2015), pp. 240–247.

27. Dolby, What is ICTCP-Introduction? White paper, version 7.1 (Dolby, United States, 2016).

28. P. A. García, R. Huertas, M. Melgosa, and G. Cui, “Measurement of the relationship between perceived and computed color differences,” J. Opt. Soc. Am. A 24(7), 1823–1829 (2007). [CrossRef]   [PubMed]  

29. R. S. Berns, D. H. Alman, L. Reniff, G. D. Snyder, and M. R. Balonon-Rosen, “Visual determination of suprathreshold color-difference tolerances using probit analysis,” Color Res. Appl. 16(5), 297–316 (1991). [CrossRef]  

30. K. Witt, “Geometric relations between scales of small colour differences,” Color Res. Appl. 24(2), 78–92 (1999). [CrossRef]  

31. D. H. Kim and J. H. Nobbs, “New weighting functions for the weighted CIELAB colour difference formula,” in Proceedings of the AIC Colour, (AIC, 1997), pp. 446–449.

32. M. R. Luo and B. Rigg, “Chromaticity-discrimination ellipses for surface colours,” Color Res. Appl. 11(1), 25–42 (1986).

33. D. L. MacAdam, “Uniform color scales,” J. Opt. Soc. Am. 64(12), 1691–1702 (1974).

34. RIT Munsell Color Science Laboratory, Munsell renotation data, https://www.rit.edu/cos/colorscience/rc_munsell_renotation.php.

35. D. L. MacAdam, “Visual sensitivities to color differences in daylight,” J. Opt. Soc. Am. 32(5), 247–274 (1942).

36. K. Xiao, S. Wuerger, C. Fu, and D. Karatzas, “Unique hue data for colour appearance models. Part I: Loci of unique hues and hue uniformity,” Color Res. Appl. 36(5), 316–323 (2011).

37. Commission Internationale de l’Éclairage (CIE), A colour appearance model for colour management systems: CIECAM02, CIE Publication No. 159 (CIE Central Bureau, Vienna, Austria, 2004).

38. G. Cui, M. R. Luo, B. Rigg, G. Roesler, and K. Witt, “Uniform colour spaces based on the DIN99 colour-difference formula,” Color Res. Appl. 27(4), 282–290 (2002).

39. R. G. Kuehni, “Towards an improved uniform color space,” Color Res. Appl. 24(4), 253–265 (1999).

40. M. Safdar, G. Cui, Y. J. Kim, and M. R. Luo, “MATLAB code for forward and reverse models of Jzazbz uniform color space,” figshare. [Retrieved: June 21, 2017] https://doi.org/10.6084/m9.figshare.5132461.

41. S. M. Newhall, D. Nickerson, and D. B. Judd, “Final report of the O.S.A. subcommittee on the spacing of the Munsell colors,” J. Opt. Soc. Am. 33(7), 385–418 (1943).

Supplementary Material (1)

Code 1: MATLAB code for forward and reverse models of the Jzazbz uniform color space.



Figures (9)

Fig. 1 Plots of the Hung & Berns constant hue data; (a) hue linearity using a structure similar to the Dolby model but re-optimizing the coefficients of the matrices in Eq. (2) and Eq. (4), and the power factor in Eq. (3) [21], (b) hue linearity after extending the model using Eq. (5) and re-optimizing with b = 1.16, and (c) hue linearity after extending the model using Eq. (6) and re-optimizing with b = 1.15 and g = 0.66. The solid black lines are drawn based on linear orthogonal fitting, whereas the dashed black lines in the blue direction are drawn hypothetically to show the desired linearity.
Fig. 2 The COMBVD ellipses data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz.
Fig. 3 The MacAdam ellipses (amplified 10 times) plotted in: (a) ICTCP, and (b) the proposed Jzazbz. The colored background represents the Rec.2020 gamut hull with a peak luminance of 1000 cd/m2 and a white point corresponding to CIE standard illuminant D65.
Fig. 4 Plots of lightness predictors; (a) prediction of the SL1 data by different lightness predictors, (b) prediction of the SL2 data by different lightness predictors, (c) different lightness predictors plotted against CIE L*, and (d) prediction of Munsell Value by different lightness predictors. All the lightness predictors are scaled to the range of CIE L*.
Fig. 5 The Hung & Berns data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The solid black lines are drawn based on linear orthogonal fitting, whereas the dashed black lines in the blue direction are drawn hypothetically to show the ideal case. The data points are color coded using the corresponding sRGB primaries.
Fig. 6 Prediction results of the Munsell data [34] for fixed Value (V = 5) and varying Chroma and Hue; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The data points are color coded using the corresponding sRGB primaries.
Fig. 7 The Ebner & Fairchild [17] data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The solid black lines are drawn based on linear orthogonal fitting. The data points are color coded using the corresponding chromaticity and constant lightness (L* = 60).
Fig. 8 The Xiao et al. [36] data plotted in six different color spaces; (a) CIELAB, (b) CIELUV, (c) CAM16-UCS, (d) IPT, (e) ICTCP, and (f) the proposed Jzazbz. The red, green, and blue color codes represent the corresponding unique hues, while black represents unique yellow. The solid lines are drawn based on linear orthogonal fitting. The symbol × represents the corresponding white point.
Fig. 9 (a) The CIE 1931 chromaticity scale plotted in Jzazbz. The Rec.2020 gamut hull (Y = 10,000 cd/m2) is plotted in Jzazbz with the az–bz, az–Jz, and bz–Jz planes shown in (b), (c), and (d), respectively. The white point corresponds to CIE standard illuminant D65.

Tables (3)


Table 1 Details of different experimental data sets and corresponding appearance model parameters.


Table 2 Results for test color spaces to predict different experimental data sets presented in STRESS units.


Table 3 Test performance of color spaces for grey convergence and hue linearity based on three different data sets. Values of standard deviation (SD) are in degrees, and values of chroma-ratio are in percent.

Equations (23)


(1) $\mathrm{Chroma\ ratio} = 100\,\dfrac{3\,C_w}{C_r + C_g + C_b}$

(2) $\begin{bmatrix} L \\ M \\ S \end{bmatrix} = \begin{bmatrix} \alpha_{1,1} & \alpha_{1,2} & 1-\alpha_{1,1}-\alpha_{1,2} \\ \alpha_{2,1} & \alpha_{2,2} & 1-\alpha_{2,1}-\alpha_{2,2} \\ \alpha_{3,1} & \alpha_{3,2} & 1-\alpha_{3,1}-\alpha_{3,2} \end{bmatrix} \begin{bmatrix} X_{D65} \\ Y_{D65} \\ Z_{D65} \end{bmatrix}$

(3) $\{L',M',S'\} = \left( \dfrac{c_1 + c_2\left( \{L,M,S\}/10000 \right)^{n}}{1 + c_3\left( \{L,M,S\}/10000 \right)^{n}} \right)^{p}$

(4) $\begin{bmatrix} I_z \\ a_z \\ b_z \end{bmatrix} = \begin{bmatrix} \omega_{1,1} & \omega_{1,2} & 1-\omega_{1,1}-\omega_{1,2} \\ \omega_{2,1} & \omega_{2,2} & -\omega_{2,1}-\omega_{2,2} \\ \omega_{3,1} & \omega_{3,2} & -\omega_{3,1}-\omega_{3,2} \end{bmatrix} \begin{bmatrix} L' \\ M' \\ S' \end{bmatrix}$

(5) $X'_{D65} = b\,X_{D65} - (b-1)\,Z_{D65}$

(6) $\begin{bmatrix} X'_{D65} \\ Y'_{D65} \end{bmatrix} = \begin{bmatrix} b\,X_{D65} \\ g\,Y_{D65} \end{bmatrix} - \begin{bmatrix} (b-1)\,Z_{D65} \\ (g-1)\,X_{D65} \end{bmatrix}$

(7) $J_z = \dfrac{(1+d)\,I_z}{1 + d\,I_z}$

(8) $\begin{bmatrix} X'_{D65} \\ Y'_{D65} \end{bmatrix} = \begin{bmatrix} b\,X_{D65} \\ g\,Y_{D65} \end{bmatrix} - \begin{bmatrix} (b-1)\,Z_{D65} \\ (g-1)\,X_{D65} \end{bmatrix}$

(9) $\begin{bmatrix} L \\ M \\ S \end{bmatrix} = \begin{bmatrix} 0.41478972 & 0.579999 & 0.0146480 \\ -0.2015100 & 1.120649 & 0.0531008 \\ -0.0166008 & 0.264800 & 0.6684799 \end{bmatrix} \begin{bmatrix} X'_{D65} \\ Y'_{D65} \\ Z_{D65} \end{bmatrix}$

(10) $\{L',M',S'\} = \left( \dfrac{c_1 + c_2\left( \{L,M,S\}/10000 \right)^{n}}{1 + c_3\left( \{L,M,S\}/10000 \right)^{n}} \right)^{p}$

(11) $\begin{bmatrix} I_z \\ a_z \\ b_z \end{bmatrix} = \begin{bmatrix} 0.5 & 0.5 & 0 \\ 3.524000 & -4.066708 & 0.542708 \\ 0.199076 & 1.096799 & -1.295875 \end{bmatrix} \begin{bmatrix} L' \\ M' \\ S' \end{bmatrix}$

(12) $J_z = \dfrac{(1+d)\,I_z}{1 + d\,I_z} - d_0$

(13) $C_z = \sqrt{a_z^{\,2} + b_z^{\,2}}$

(14) $h_z = \arctan\!\left( \dfrac{b_z}{a_z} \right)$

(15) $\Delta E_z = \sqrt{(\Delta J_z)^2 + (\Delta C_z)^2 + (\Delta H_z)^2}$

(16) $Y = 1.2219V - 0.23111V^2 + 0.23951V^3 - 0.021009V^4 + 0.0008404V^5$

(17) $I_z = \dfrac{J_z + d_0}{1 + d - d\,(J_z + d_0)}$

(18) $\begin{bmatrix} L' \\ M' \\ S' \end{bmatrix} = \begin{bmatrix} 0.5 & 0.5 & 0 \\ 3.524000 & -4.066708 & 0.542708 \\ 0.199076 & 1.096799 & -1.295875 \end{bmatrix}^{-1} \begin{bmatrix} I_z \\ a_z \\ b_z \end{bmatrix}$

(19) $\{L,M,S\} = 10000 \times \left( \dfrac{c_1 - \left( \{L',M',S'\} \right)^{1/p}}{c_3\left( \{L',M',S'\} \right)^{1/p} - c_2} \right)^{1/n}$

(20) $\begin{bmatrix} X'_{D65} \\ Y'_{D65} \\ Z'_{D65} \end{bmatrix} = \begin{bmatrix} 0.41478972 & 0.579999 & 0.0146480 \\ -0.2015100 & 1.120649 & 0.0531008 \\ -0.0166008 & 0.264800 & 0.6684799 \end{bmatrix}^{-1} \begin{bmatrix} L \\ M \\ S \end{bmatrix}$

(21) $X_{D65} = \left( X'_{D65} + (b-1)\,Z'_{D65} \right) / b$

(22) $Y_{D65} = \left( Y'_{D65} + (g-1)\,X_{D65} \right) / g$

(23) $Z_{D65} = Z'_{D65}$
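
For readers who prefer working code to the equation list above, the following Python sketch implements the forward model, Eqs. (8)–(12), together with the chroma and hue correlates of Eqs. (13)–(14), for a single absolute XYZ (D65) input in cd/m2. The matrix coefficients come from Eqs. (9) and (11); the scalar parameters (b = 1.15 and g = 0.66 as quoted in the Fig. 1 caption, plus the ST 2084-derived constants c1, c2, c3, n [24] and the values of p, d, and d0) are not listed in this back matter and are reproduced here from the published model, so they should be treated as assumptions and checked against the authors' MATLAB implementation in Code 1 [40], which remains the authoritative reference.

```python
import numpy as np

# Scalar parameters of the Jzazbz model. b and g are quoted in the Fig. 1
# caption; the remaining constants are reproduced from the published model
# (c1, c2, c3, n follow SMPTE ST 2084 [24]) and should be verified against
# the authors' MATLAB code (Code 1, [40]).
b, g = 1.15, 0.66
c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128
n = 2610 / 16384
p = 1.7 * 2523 / 32
d = -0.56
d0 = 1.6295499532821566e-11

# Eq. (9): adapted XYZ (D65) to cone-like LMS responses.
M_XYZ_TO_LMS = np.array([
    [ 0.41478972, 0.579999, 0.0146480],
    [-0.2015100,  1.120649, 0.0531008],
    [-0.0166008,  0.264800, 0.6684799],
])

# Eq. (11): compressed L'M'S' to the opponent signals Iz, az, bz.
M_LMS_TO_IAB = np.array([
    [0.5,       0.5,       0.0],
    [3.524000, -4.066708,  0.542708],
    [0.199076,  1.096799, -1.295875],
])


def xyz_to_jzazbz(X, Y, Z):
    """Forward Jzazbz model for absolute D65 tristimulus values in cd/m2."""
    # Eq. (8): pre-adjust X and Y with b and g.
    Xp = b * X - (b - 1) * Z
    Yp = g * Y - (g - 1) * X
    # Eq. (9): cone-like responses.
    LMS = M_XYZ_TO_LMS @ np.array([Xp, Yp, Z])
    # Eq. (10): PQ-style compression of each channel, normalized to 10,000 cd/m2.
    LMSp = ((c1 + c2 * (LMS / 10000) ** n) / (1 + c3 * (LMS / 10000) ** n)) ** p
    # Eq. (11): opponent signals.
    Iz, az, bz = M_LMS_TO_IAB @ LMSp
    # Eq. (12): lightness with additional compression of Iz.
    Jz = (1 + d) * Iz / (1 + d * Iz) - d0
    return Jz, az, bz


def chroma_hue(az, bz):
    """Eqs. (13)-(14): chroma Cz and hue angle hz (in degrees)."""
    Cz = np.hypot(az, bz)
    hz = np.degrees(np.arctan2(bz, az)) % 360
    return Cz, hz


if __name__ == "__main__":
    # D65 white at 100 cd/m2: az and bz should come out very close to zero.
    Jz, az, bz = xyz_to_jzazbz(95.047, 100.0, 108.883)
    print(Jz, az, bz, chroma_hue(az, bz))
```

Given such a forward transform, the Euclidean color difference of Eq. (15) can be computed directly from the (Jz, az, bz) coordinates of two stimuli; the reverse model follows from Eqs. (17)–(23) in the same way.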