School of Computing Science

The Rehabilitation of MaxRGB

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2010-11
Abstract: 

The poor performance of the MaxRGB illumination-estimation method is often used in the literature as a foil when promoting some new illumination-estimation method. However, the results presented here show that in fact MaxRGB works surprisingly well when tested on a new dataset of 105 high dynamic range images, and also better than previously reported when some simple pre-processing is applied to the images of the standard 321-image set [1]. The HDR images in the dataset for color constancy research were constructed in the standard way from multiple exposures of the same scene. The color of the scene illumination was determined by photographing an extra HDR image of the scene with four GretagMacbeth mini ColorCheckers placed in it at 45 degrees relative to one another. With pre-processing, MaxRGB’s performance is statistically equivalent to that of Color by Correlation [2] and statistically superior to that of the Greyedge [3] algorithm on the 321 set (null hypothesis rejected at the 5% significance level). It also performs as well as Greyedge on the HDR set. These results demonstrate that MaxRGB is far more effective than it has been reputed to be, so long as it is applied to image data that encodes the full dynamic range of the original scene.
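As a rough illustration of the estimator under discussion, here is a minimal numpy sketch of the basic MaxRGB principle (the per-channel maximum of a linear image, normalized to a chromaticity). The pre-processing the abstract refers to is omitted, and the scene data are synthetic:

```python
import numpy as np

def max_rgb_estimate(image):
    """Estimate the illuminant chromaticity as the per-channel maximum
    of a linear RGB image (the basic MaxRGB principle; any
    pre-processing such as clipping removal is omitted in this sketch)."""
    rgb_max = image.reshape(-1, 3).max(axis=0)
    return rgb_max / rgb_max.sum()  # normalized (r, g, b) chromaticity

# A synthetic scene lit by a warm (reddish) illuminant
rng = np.random.default_rng(0)
scene = rng.random((64, 64, 3)) * np.array([1.0, 0.8, 0.6])
estimate = max_rgb_estimate(scene)
```

With enough pixels the per-channel maxima approach the illuminant scaling itself, so the estimate recovers the warm cast of the synthetic light.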

Document type: 
Conference presentation
File(s): 

Gaussian-Metamer-Based Prediction of Colour Stimulus Change under Illuminant Change

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2011-06
Abstract: 

Predicting how the LMS cone response to light reflected from a surface changes with changing lighting conditions is a long-standing and important problem. It arises in white balancing digital imagery, and when re-rendering printed material for viewing under a second illuminant (e.g., changing from D65 to F11). Von Kries scaling is perhaps the most common approach to predicting what LMS cone response will arise under a second illuminant given the LMS under a first illuminant. We approach this prediction problem, instead, from the perspective of Logvinenko’s new colour atlas, and obtain better results than with von Kries scaling.
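For reference, the von Kries scaling used as the baseline above amounts to a diagonal, per-channel rescaling by the ratio of the two illuminants' LMS white points. A minimal sketch, where the white points and surface response are illustrative numbers only (not measured D65/F11 values):

```python
import numpy as np

def von_kries_predict(lms_first, white_first, white_second):
    """Predict the LMS cone response under a second illuminant by scaling
    each channel with the ratio of the illuminants' LMS white points
    (the classic von Kries diagonal model)."""
    scale = np.asarray(white_second) / np.asarray(white_first)
    return np.asarray(lms_first) * scale

# Hypothetical LMS white points for the two illuminants
white_d65 = np.array([100.0, 100.0, 100.0])
white_f11 = np.array([103.0, 98.0, 65.0])
surface_lms = np.array([40.0, 55.0, 30.0])
predicted = von_kries_predict(surface_lms, white_d65, white_f11)
```

The abstract's point is that this diagonal model is only an approximation; the atlas-based prediction it proposes is not reducible to three per-channel gains.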

Document type: 
Conference presentation
File(s): 

Intersecting Color Manifolds

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2011-11
Abstract: 

Logvinenko’s color atlas theory provides a structure in which a complete set of color-equivalent material and illumination pairs can be generated to match any given input RGB color. In chromaticity space, the set of such pairs forms a 2-dimensional manifold embedded in a 4-dimensional space. For single-illuminant scenes, the illumination for different input RGB values must be contained in all the corresponding manifolds. The proposed illumination-estimation method estimates the scene illumination by calculating the intersection of the illuminant components of the respective manifolds through a Hough-like voting process. Overall, the performance on the two datasets for which camera sensitivity functions are available is comparable to existing methods. The advantage of formulating illumination estimation in terms of manifold intersection is that it expresses the constraints provided by each available RGB measurement within a sound theoretical foundation.

Document type: 
Conference presentation
File(s): 

XYZ to ADL: Calculating Logvinenko's Object Color Coordinates

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2010-11
Abstract: 

Recently Logvinenko introduced a new object-color space, establishing a complete color atlas that is invariant to illumination [2]. However, the existing implementation for calculating the proposed color descriptors is computationally expensive and does not work for all types of illuminants. A new algorithm is presented that allows for an efficient calculation of Logvinenko’s color descriptors for large data sets and a wide variety of illuminants.

Document type: 
Conference presentation
File(s): 

Metamer Mismatch Volumes

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2012-05
Abstract: 

A new algorithm for evaluating metamer mismatch volumes is introduced. Unlike previous methods, the proposed method places no restrictions on the set of possible object reflectance spectra. Such restrictions lead to approximate solutions for the mismatch volume. The new method precisely characterizes the volume in all circumstances.

Document type: 
Conference presentation
File(s): 

Representing Outliers for Improved Multi-Spectral Data Reduction

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2012-05
Abstract: 

Large multi-spectral datasets such as those created by multi-spectral imaging require substantial data storage. Compression of these data is therefore an important problem. A common approach is to use principal components analysis (PCA) as a way of reducing the data requirements as part of a lossy compression strategy. In this paper, we employ the fast MCD (Minimum Covariance Determinant) algorithm, a highly robust estimator of multivariate mean and covariance, to detect outlier spectra in a multi-spectral image. We then show that by removing the outliers from the main dataset, the performance of PCA in spectral compression significantly increases. However, since outlier spectra are a part of the image, they cannot simply be ignored. Our strategy is to cluster the outliers into a small number of groups and then compress each group separately using its own cluster-specific PCA-derived bases. Overall, we show that significantly better compression can be achieved with this approach.

Document type: 
Conference presentation
File(s): 

Metamer Mismatching in Practice versus Theory

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2016-01
Abstract: 

Metamer mismatching (the phenomenon that two objects matching in color under one illuminant may not match under a different illuminant) potentially has important consequences for color perception. Logvinenko et al. [PLoS ONE 10, e0135029 (2015)] show that in theory the extent of metamer mismatching can be very significant. This paper examines metamer mismatching in practice by computing the volumes of the empirical metamer mismatch bodies and comparing them to the volumes of the theoretical mismatch bodies. A set of more than 25 million unique reflectance spectra is assembled using datasets from several sources. For a given color signal (e.g., CIE XYZ) recorded under a given first illuminant, its empirical metamer mismatch body for a change to a second illuminant is computed as follows: the reflectances having the same color signal when lit by the first illuminant (i.e., reflect metameric light) are computationally relit by the second illuminant, and the convex hull of the resulting color signals then defines the empirical metamer mismatch body. The volume of these bodies is shown to vary systematically with Munsell value and chroma. The empirical mismatch bodies are compared to the theoretical mismatch bodies computed using the algorithm of Logvinenko et al. [IEEE Trans. Image Process. 23, 34 (2014)]. There are three key findings: (1) the empirical bodies are found to be substantially smaller than the theoretical ones; (2) the sizes of both the empirical and theoretical bodies show a systematic variation with Munsell value and chroma; and (3) applied to the problem of color-signal prediction, the centroid of the empirical metamer mismatch body is shown to be a better predictor of what a given color signal might become under a specified illuminant than state-of-the-art methods.
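The construction of an empirical mismatch body described above can be sketched on toy data. Everything below is an illustrative assumption rather than the paper's setup: Gaussian "cone" sensitivities, two made-up illuminants, uniform random reflectances instead of the 25-million-spectrum dataset, and a bounding-box volume in place of the convex hull:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectral world: 31 wavelength samples, three Gaussian sensors.
wl = np.linspace(400, 700, 31)
sens = np.stack([np.exp(-0.5 * ((wl - c) / 40.0) ** 2)
                 for c in (600.0, 550.0, 450.0)])      # 3 x 31 "cones"
illum1 = np.ones(31)                                   # flat first light
illum2 = np.linspace(0.5, 1.5, 31)                     # reddish second light

# Reflectances whose color signals under illum1 are (near-)metameric
# to a target signal...
refls = rng.random((20000, 31))
sig1 = (refls * illum1) @ sens.T                       # N x 3 color signals
target = np.median(sig1, axis=0)
metamers = refls[np.all(np.abs(sig1 - target) < 0.5, axis=1)]

# ...are computationally relit under illum2; the spread of the resulting
# signals approximates the empirical mismatch body (the paper takes the
# convex hull; an axis-aligned bounding box is used here as a crude proxy).
sig2 = (metamers * illum2) @ sens.T
body_extent = sig2.max(axis=0) - sig2.min(axis=0)
proxy_volume = float(np.prod(body_extent))
```

Even this toy version shows the core phenomenon: spectra indistinguishable under the first light fan out into a body of distinct signals under the second.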

Document type: 
Article
File(s): 

Computational Color Prediction versus Least-Dissimilar Matching

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2018-03
Abstract: 

The performance of the color prediction methods CIECAM02, KSM2, Waypoint, Best Linear, MMV center, and relit color signal is compared in terms of how well they explain Logvinenko & Tokunaga’s asymmetric color matching results (“Colour Constancy as Measured by Least Dissimilar Matching,” Seeing and Perceiving, vol. 24, no. 5, pp. 407-452, 2011). In their experiment, 4 observers were asked to determine (3 repeats) for a given Munsell paper under a test illuminant which of 22 other Munsell papers was the least-dissimilar under a match illuminant. Their use of “least-dissimilar” as opposed to “matching” is an important aspect of their experiment. Their results raise several questions. Question 1: Are observers choosing the original Munsell paper under the match illuminant? If they are, then the average (over 12 matches) color signal (i.e., cone LMS or CIE XYZ) made under a given illuminant condition should correspond to that of the test paper’s color signal under the match illuminant. Computation shows that the mean color signal of the matched papers is close to the color signal of the physically identical paper under the match illuminant. Question 2: Which color prediction method most closely predicts the observers’ average least-dissimilar match? Question 3: Given the variability between observers, how do individual observers compare to the computational methods in predicting the average observer matches? A leave-one-observer-out comparison shows that individual observers, somewhat surprisingly, predict the average matches of the remaining observers better than any of the above color prediction methods.

Document type: 
Article
File(s): 

Reducing Worst-Case Illumination Estimates for Better Automatic White Balance

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2012-11
Abstract: 

Automatic white balancing works quite well on average, but seriously fails some of the time. These failures lead to completely unacceptable images. Can the number, or severity, of these failures be reduced, perhaps at the expense of slightly poorer white balancing on average, so as to increase the overall acceptability of a collection of images? Since the main source of error in automatic white balancing is misidentification of the overall scene illuminant, a new illumination-estimation algorithm is presented that minimizes the high-percentile error of its estimates. The algorithm combines illumination estimates from standard existing algorithms and chromaticity gamut characteristics of the image as features in a feature space. Illuminant chromaticities are quantized into chromaticity bins. Given a test image of a real scene, its feature vector is computed, and for each chromaticity bin, the probability of the illuminant chromaticity falling into that bin given the feature vector is estimated. The probability estimation is based on Loftsgaarden-Quesenberry multivariate density function estimation over the feature vectors derived from a set of synthetic training images. Once the probability distribution estimate for a given chromaticity channel is known, the smallest interval that is likely to contain the right answer with a desired probability (i.e., the smallest chromaticity interval whose sum of probabilities is greater than or equal to the desired probability) is chosen. The point in the middle of that interval is then reported as the chromaticity of the illuminant. Testing on a dataset of real images shows that the error at the 90th and 98th percentiles can be reduced by roughly half, with minimal impact on the mean error.
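The interval-selection step described above can be sketched directly: given a discrete probability distribution over chromaticity bins, find the shortest contiguous run of bins whose total probability reaches the desired level, and report its midpoint. The bin distribution below is made up, and the O(n²) search is written for clarity rather than speed:

```python
import numpy as np

def smallest_likely_interval(probs, desired=0.9):
    """Return the midpoint (as a bin index) of the shortest contiguous
    run of bins whose probabilities sum to at least `desired`."""
    n = len(probs)
    cum = np.concatenate(([0.0], np.cumsum(probs)))
    best = (0, n - 1)
    for i in range(n):
        for j in range(i, n):
            # Sum of bins i..j is cum[j + 1] - cum[i]; keep the shortest
            # interval that still carries the desired probability mass.
            if cum[j + 1] - cum[i] >= desired and (j - i) < (best[1] - best[0]):
                best = (i, j)
    return (best[0] + best[1]) / 2.0

# A made-up illuminant-chromaticity distribution peaked around bin 5
p = np.array([0.01, 0.02, 0.05, 0.10, 0.20, 0.30, 0.20, 0.07, 0.03, 0.02])
midpoint = smallest_likely_interval(p, desired=0.9)
```

Reporting the midpoint of the shortest high-probability interval, rather than the single most probable bin, is what targets the high-percentile (worst-case) error rather than the mean.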

Document type: 
Conference presentation
File(s): 

Rank-Based Illumination Estimation

Peer reviewed: 
Yes, item is peer reviewed.
Date created: 
2013-11
Abstract: 

A new two-stage illumination-estimation method based on the concept of rank is presented. The method first estimates the illuminant locally in subwindows using a ranking of the digital counts in each color channel, and then combines the local subwindow estimates, again based on a ranking of the local estimates. The proposed method unifies the MaxRGB and Grayworld methods. Despite its simplicity, the method's performance is found to be competitive with other state-of-the-art methods for estimating the chromaticity of the overall scene illumination.
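A minimal sketch of the two-stage rank idea, assuming square non-overlapping subwindows and illustrative rank parameters (the paper's actual ranks and windowing scheme may differ):

```python
import numpy as np

def rank_based_estimate(image, local_rank=0.95, global_rank=0.5, win=16):
    """Two-stage rank sketch: a high quantile of each channel within each
    subwindow gives a local illuminant estimate; a quantile across the
    local estimates gives the global one. With both ranks at 1.0 this
    reduces to MaxRGB; lower ranks behave more like a Grayworld average."""
    h, w, _ = image.shape
    local_estimates = []
    for y in range(0, h - win + 1, win):
        for x in range(0, w - win + 1, win):
            patch = image[y:y + win, x:x + win].reshape(-1, 3)
            local_estimates.append(np.quantile(patch, local_rank, axis=0))
    est = np.quantile(np.array(local_estimates), global_rank, axis=0)
    return est / est.sum()  # chromaticity of the estimated illuminant

# Synthetic scene under a warm illuminant
rng = np.random.default_rng(0)
scene = rng.random((64, 64, 3)) * np.array([1.0, 0.8, 0.6])
estimate = rank_based_estimate(scene)
```

Ranking twice makes the estimate robust both to isolated bright pixels within a window and to atypical windows within the image.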

Document type: 
Conference presentation
File(s):