
# School of Computing Science

## Independent Component Analysis and Nonnegative Linear Model Analysis of Illuminant and Reflectance Spectra

Principal Component Analysis (PCA), Independent Component Analysis (ICA), Non-Negative Matrix Factorization (NNMF) and Non-Negative Independent Component Analysis (NNICA) are all techniques that can be used to compute basis vectors for finite-dimensional models of spectra. The two non-negative techniques turn out to be especially interesting because the pseudo-inverse of their basis vectors is also close to being non-negative. This means that after truncating any negative components of the pseudo-inverse vectors to zero, the resulting vectors become physically realizable sensor functions whose outputs map directly to the appropriate finite-dimensional weighting coefficients in terms of the associated (NNMF or NNICA) basis. Experiments show that truncating the negative values incurs only a very slight performance penalty in terms of the accuracy with which the input spectrum can be approximated using a finite-dimensional model.
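The idea can be sketched as follows. This toy example uses synthetic Gaussian-bump "spectra" and a bare multiplicative-update NMF rather than the measured illuminant/reflectance data and factorization tools of the paper; it computes a non-negative basis, truncates the negatives of its pseudo-inverse to obtain realizable sensors, and compares the resulting approximation error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-negative "spectra": mixtures of three smooth bumps over 31
# wavelength samples (stand-ins for measured illuminant/reflectance spectra).
wl = np.linspace(400, 700, 31)
true_basis = np.stack([np.exp(-((wl - c) / 60.0) ** 2) for c in (450, 550, 650)])
spectra = rng.random((200, 3)) @ true_basis        # 200 non-negative spectra

# Minimal multiplicative-update NMF: X ~= B @ H with B, H >= 0.
k = 3
X = spectra.T                                      # 31 x 200
B = rng.random((31, k)) + 0.1
H = rng.random((k, 200)) + 0.1
for _ in range(500):
    H *= (B.T @ X) / (B.T @ B @ H + 1e-12)
    B *= (X @ H.T) / (B @ H @ H.T + 1e-12)

# Pseudo-inverse of the basis, then truncate negatives to get realizable sensors.
P = np.linalg.pinv(B)              # exact coefficient extractor, may have negatives
sensors = np.clip(P, 0, None)      # physically realizable (non-negative) version

# Reconstruction error using exact vs truncated coefficients.
err_exact = np.linalg.norm(X - B @ (P @ X)) / np.linalg.norm(X)
err_trunc = np.linalg.norm(X - B @ (sensors @ X)) / np.linalg.norm(X)
print(err_exact, err_trunc)
```

Because the pseudo-inverse coefficients are the least-squares optimum for the basis, the truncated sensors can only do as well or slightly worse, which is the comparison the abstract reports.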


## A Basis for Cones

Why do the human cones have the spectral sensitivities they do? We hypothesize that they may have evolved to their present form because their sensitivities are optimal in terms of their ability to recover the spectrum of incident light. As evidence in favor of this hypothesis, we compare the accuracy with which the incoming spectrum can be approximated by a three-dimensional linear model based on the cone responses and compare this to the optimal approximations defined by models based on principal components analysis, independent component analysis, non-negative matrix factorization and non-negative independent component analysis. We introduce a new method of reconstructing spectra from the cone responses and show that the cones are almost as good as these optimal methods in estimating the spectrum.
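A minimal sketch of one such reconstruction scheme, using Gaussian curves as stand-ins for the measured cone fundamentals and an SVD basis of synthetic spectra as the 3-D linear model: given the three cone responses, pick the unique spectrum in the model's span that produces exactly those responses (a metamer of the input).

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(400, 700, 61)

# Stand-in Gaussian cone sensitivities (the paper uses measured fundamentals).
cones = np.stack([np.exp(-((wl - c) / 50.0) ** 2) for c in (565, 535, 445)])

# A 3-D linear model for spectra: top-3 SVD basis of a synthetic training set.
bumps = np.stack([np.exp(-((wl - c) / 40.0) ** 2)
                  for c in (430, 480, 530, 580, 630, 680)])
train = rng.random((300, 6)) @ bumps
_, _, Vt = np.linalg.svd(train, full_matrices=False)
B = Vt[:3].T                       # 61 x 3 basis

def reconstruct(responses):
    """The unique spectrum in the span of B with the given cone responses."""
    w = np.linalg.solve(cones @ B, responses)
    return B @ w

s = rng.random(6) @ bumps          # a test spectrum
s_hat = reconstruct(cones @ s)
print(np.linalg.norm(s - s_hat) / np.linalg.norm(s))
```

By construction the estimate matches the cone responses exactly; how closely it matches the full spectrum depends on how much of the spectral variation the 3-D model captures.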


## Multispectral Colour Constancy

Does extending the number of channels from the 3 RGB sensors of a colour camera to 6 or 9 using a multispectral camera enhance the performance of illumination-estimation algorithms? Experiments are conducted with a variety of colour constancy algorithms (Maloney-Wandell, Chromagenic, Greyworld, Max RGB, and a Maloney-Wandell extension) measuring their performance as a function of the number of sensor channels. Although minor improvements were found with 6 channels, overall the results indicate that multispectral imagery is unlikely to lead to substantially better illumination-estimation performance.
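Two of the simpler algorithms tested, Greyworld and Max RGB, extend to any channel count without modification, which is what makes the comparison straightforward. A sketch on synthetic data (random reflectances under a random illuminant; the paper uses real multispectral imagery):

```python
import numpy as np

rng = np.random.default_rng(2)

def greyworld(img):
    """Greyworld estimate: the per-channel mean, for any number of channels."""
    return img.reshape(-1, img.shape[-1]).mean(axis=0)

def max_rgb(img):
    """Max-RGB estimate: the per-channel maximum."""
    return img.reshape(-1, img.shape[-1]).max(axis=0)

def angular_error(a, b):
    """Angle in degrees between estimated and true illuminant vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1, 1)))

# Synthetic scene: uniform random reflectances under a coloured illuminant.
for channels in (3, 6):
    illum = rng.random(channels) + 0.5
    img = rng.random((64, 64, channels)) * illum
    print(channels,
          angular_error(greyworld(img), illum),
          angular_error(max_rgb(img), illum))
```

On such idealized data both estimators recover the illuminant well at either channel count; the paper's point is that on real imagery the extra channels buy little.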


## Removing Outliers in Illumination Estimation

A method of outlier detection is proposed as a way of improving illumination-estimation performance in general, and for scenes with multiple sources of illumination in particular. Based on random sample consensus (RANSAC), the proposed method (i) makes estimates of the illumination chromaticity from multiple, randomly sampled sub-images of the input image; (ii) fits a model to the estimates; (iii) makes further estimates, which are classified as useful or not on the basis of the initial model; (iv) and produces a final estimate based on the ones classified as being useful. Tests on the Gehler colorchecker set of 568 images demonstrate that the proposed method works well, improves upon the performance of the base algorithm it uses for obtaining the sub-image estimates, and can roughly identify the image areas corresponding to different scene illuminants.
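Steps (i)-(iv) can be sketched as below. This is a simplified stand-in, not the paper's method: grey-world chromaticity serves as the base sub-image estimator, the "model" is just the median of the initial estimates, and the scene, patch size, and inlier threshold are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic scene: most of the image under one illuminant, a strip under another.
H, W = 100, 100
img = rng.random((H, W, 3))
img[:, :75] *= np.array([0.6, 0.8, 1.2])    # dominant bluish light
img[:, 75:] *= np.array([1.2, 0.8, 0.6])    # minority reddish light

def greyworld_chroma(patch):
    """Base estimator: grey-world (r, g) chromaticity of a sub-image."""
    rgb = patch.reshape(-1, 3).mean(axis=0)
    return rgb[:2] / rgb.sum()

def sample_estimates(n, size=20):
    ests = []
    for _ in range(n):
        y, x = rng.integers(0, H - size), rng.integers(0, W - size)
        ests.append(greyworld_chroma(img[y:y + size, x:x + size]))
    return np.array(ests)

# (i) estimates from randomly sampled sub-images; (ii) fit a model (the median);
model = np.median(sample_estimates(30), axis=0)
# (iii) further estimates, classified as useful if close to the model;
more = sample_estimates(60)
inliers = more[np.linalg.norm(more - model, axis=1) < 0.05]
# (iv) final estimate from the estimates classified as useful.
final = inliers.mean(axis=0)
print(final)
```

The rejected estimates come from patches lit by (or straddling) the minority illuminant, which is also how the method can roughly localize the different scene illuminants.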


## A Parallel-Process Model of Mental Rotation

It is argued that some of the phenomena identified with analog processes by Shepard can be understood as resulting from a parallel-process algorithm running on a processor having many individual processing elements and a restricted communication structure. In particular, an algorithm has been developed and implemented which models human behavior on Shepard's object rotation and comparison task. The algorithm exhibits computation times which increase linearly with the angle of rotation. Shepard found a similar linear function in his experiments with human subjects. In addition, the intermediate states of the computation are such that if the rotation process were to be interrupted at any point, the object representation would correspond to that of the actual object at a position along the rotation trajectory. The computational model presented here is governed by three constraining assumptions: (a) that it be parallel; (b) that the communication between processors be restricted to immediate neighbors; (c) that the object representation be distributed across a large fraction of the available processors. A method of choosing the correct axis of rotation is also presented.
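The two behavioural signatures, computation time linear in angle and valid intermediate states, can be illustrated with a toy serial stand-in for the parallel algorithm (the step size here is an arbitrary choice):

```python
import math

def rotate_in_steps(points, angle_deg, step_deg=5.0):
    """Rotate a 2-D point set by repeated small steps.

    The step count grows linearly with the angle, and every intermediate
    state is itself the object at some position along the rotation trajectory.
    """
    steps = max(1, round(abs(angle_deg) / step_deg))
    theta = math.radians(angle_deg) / steps
    c, s = math.cos(theta), math.sin(theta)
    states = [points]
    for _ in range(steps):
        points = [(x * c - y * s, x * s + y * c) for x, y in points]
        states.append(points)
    return states

states = rotate_in_steps([(1.0, 0.0), (0.0, 1.0)], 90.0)
print(len(states) - 1)   # number of steps, proportional to the angle
```

Interrupting the loop at any point leaves a well-formed partially rotated object, which is the property the abstract emphasizes; the paper's actual model distributes this work across many processors with neighbour-only communication.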


## Synthesis of Acoustic Timbres using Principal Component Analysis

We have developed an alternative method of representing harmonic amplitude envelopes of musical instrument sounds using principal component analysis. Statistical analysis reveals considerable correlation between the harmonic amplitude values at different time positions in the envelopes. This correlation is exploited in order to reduce the dimensionality of envelope specification. It was found that two or three parameters provide a reasonable approximation to the different harmonic envelope curves present in musical instrument sounds. The representation is suited for the development of high-level control mechanisms for manipulating the timbre of resynthesized harmonic sounds.
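The dimensionality reduction can be sketched as follows, using synthetic attack/decay curves as stand-ins for measured instrument envelopes: PCA via the SVD of the mean-centred envelope matrix, keeping two components, so each 50-frame envelope is specified by just two coefficients.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic harmonic amplitude envelopes: correlated mixtures of two
# attack/decay shapes (stand-ins for measured instrument envelopes).
t = np.linspace(0, 1, 50)
shapes = np.stack([t * np.exp(-4 * t),
                   np.exp(-2 * t) * (1 - np.exp(-20 * t))])
envelopes = np.abs(rng.normal(1.0, 0.3, (40, 2))) @ shapes   # 40 envelopes x 50 frames

# PCA via SVD of the mean-centred data.
mean = envelopes.mean(axis=0)
_, _, Vt = np.linalg.svd(envelopes - mean, full_matrices=False)

# Keep 2 components: each envelope is now 2 numbers instead of 50.
k = 2
coeffs = (envelopes - mean) @ Vt[:k].T
approx = mean + coeffs @ Vt[:k]
rel_err = np.linalg.norm(envelopes - approx) / np.linalg.norm(envelopes)
print(rel_err)
```

Resynthesis then amounts to varying the two coefficients, which is what makes high-level timbre control tractable.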


## Color from Black and White

Color constancy can be achieved by analyzing the chromatic aberration in an image. Chromatic aberration spatially separates light of different wavelengths and this allows the spectral power distribution of the light to be extracted. This is more information about the light than is registered by the cones of the human visual system or by a color television camera; and, using it, we show how color constancy, the separation of reflectance from illumination, can be achieved. As examples, we consider grey-level images of (a) a colored dot under unknown illumination, and (b) an edge between two differently colored regions under unknown illumination. Our first result is that in principle we can determine completely the spectral power distribution of the reflected light from the dot or, in the case of the color edge, the difference in the spectral power distributions of the light from the two regions. By employing a finite-dimensional linear model of illumination and surface reflectance, we obtain our second result, which is that the spectrum of the reflected light can be uniquely decomposed into a component due to the illuminant and another component due to the surface reflectance. This decomposition provides the complete spectral reflectance function, and hence color, of the surface as well as the spectral power distribution of the illuminant. Up to the limit of the accuracy of the finite-dimensional model, this effectively solves the color constancy problem.
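The second result, decomposing a known reflected-light spectrum into illuminant and reflectance components under finite-dimensional linear models, can be sketched numerically. The bases below are hypothetical Gaussian bumps, and alternating least squares is used as a generic solver for the bilinear system (the paper derives its own solution method); note the decomposition is only defined up to an overall scale shared between the two components.

```python
import numpy as np

wl = np.linspace(400, 700, 31)

# Hypothetical 3-D linear bases for illuminants and reflectances.
E_basis = np.stack([np.exp(-((wl - c) / 80.0) ** 2) for c in (440, 550, 660)])
R_basis = np.stack([np.exp(-((wl - c) / 70.0) ** 2) for c in (460, 560, 660)])

# Ground truth: reflected light = illuminant * reflectance, pointwise.
a_true = np.array([0.8, 1.0, 0.5])
b_true = np.array([0.4, 0.9, 0.7])
C = (a_true @ E_basis) * (b_true @ R_basis)

# Alternating least squares: C is bilinear in (a, b), so fixing one set of
# weights makes solving for the other a linear least-squares problem.
a = np.ones(3)
for _ in range(100):
    E = a @ E_basis
    b, *_ = np.linalg.lstsq((R_basis * E).T, C, rcond=None)
    R = b @ R_basis
    a, *_ = np.linalg.lstsq((E_basis * R).T, C, rcond=None)

recon = (a @ E_basis) * (b @ R_basis)
rel_err = np.linalg.norm(C - recon) / np.linalg.norm(C)
print(rel_err)
```

Because of the scale ambiguity the recovered weights are compared through the reconstructed spectrum rather than coefficient by coefficient.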


## Color Constancy from Mutual Reflection

Mutual reflection occurs when light reflected from one surface illuminates a second surface. In this situation, the color of one or both surfaces can be modified by a *color-bleeding* effect. In this article we examine how sensor values (e.g., RGB values) are modified in the mutual reflection region and show that a good approximation of the surface spectral reflectance function for each surface can be recovered by using the extra information from mutual reflection. Thus color constancy results from an examination of mutual reflection. Use is made of finite-dimensional linear models for ambient illumination and for surface spectral reflectance. If *m* and *n* are the number of basis functions required to model illumination and surface spectral reflectance respectively, then we find that the number of different sensor classes *p* must satisfy the condition *p* ≥ (2*n* + *m*)/3. If we use three basis functions to model illumination and three basis functions to model surface spectral reflectance, then only three classes of sensors are required to carry out the algorithm. Results are presented showing a small increase in error over the error inherent in the underlying finite-dimensional models.
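The counting condition is easy to check directly (the function name here is illustrative): the smallest integer *p* satisfying *p* ≥ (2*n* + *m*)/3 for *m* = *n* = 3 is indeed 3.

```python
import math

def min_sensor_classes(m, n):
    """Smallest integer p satisfying p >= (2n + m) / 3, where m and n are the
    numbers of illumination and reflectance basis functions respectively."""
    return math.ceil((2 * n + m) / 3)

# Three basis functions each for illumination and reflectance -> three sensors.
print(min_sensor_classes(m=3, n=3))   # 3
```

A fourth reflectance basis function (*n* = 4, *m* = 3) would already push the requirement to four sensor classes.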


## Natural Metamers

Given only a color camera's RGB measurement of a complete color signal spectrum, how can the spectrum be estimated? We propose and test a new method that answers this question and recovers an approximating spectrum. Although this approximation has intrinsic interest, our main focus is on using it to generate tristimulus values for color reproduction. In essence, this provides a new method of converting color camera signals to tristimulus coordinates, because a spectrum defines a unique point in tristimulus coordinates. Color reproduction is founded on producing spectra that are metamers to those appearing in the original scene. Once a spectrum's tristimulus coordinates are known, generating a metamer is a well defined problem. Unfortunately, most color cameras cannot produce the necessary tristimulus coordinates directly because their color separation filters are not related by a linear transformation to the human color-matching functions. Color cameras are more likely to reproduce colors that look correct to the camera than to a human observer. Conversion from camera RGB triples to tristimulus values will always involve some type of estimation procedure unless cameras are redesigned. We compare the accuracy of our conversion strategy to that of one based on Horn's work on the exact reproduction of colored images. Our new method relies on expressing the color signal spectrum in terms of a linear combination of basis functions. The results show that a principal component analysis in color-signal space yields the best basis for our purposes, since using it leads to the most “natural” color signal spectrum that is statistically likely to have generated a given camera signal.
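The conversion pipeline can be sketched as follows. The camera sensitivities, colour-matching functions, and colour-signal training set here are all synthetic Gaussian stand-ins (in particular, the two sensor sets are deliberately not linear transforms of one another, mirroring the problem the abstract describes): from the camera RGB, recover the basis-constrained spectrum consistent with it, then integrate that spectrum against the colour-matching functions to get tristimulus values.

```python
import numpy as np

rng = np.random.default_rng(6)
wl = np.linspace(400, 700, 61)

# Stand-in camera sensitivities and colour-matching functions (Gaussians;
# the real curves are measured, and are not linearly related to each other).
camera = np.stack([np.exp(-((wl - c) / 45.0) ** 2) for c in (610, 540, 460)])
cmf = np.stack([np.exp(-((wl - c) / 55.0) ** 2) for c in (595, 555, 445)])

# 3-component PCA basis of a synthetic colour-signal set.
signal_bumps = np.stack([np.exp(-((wl - c) / 50.0) ** 2)
                         for c in (420, 490, 560, 630, 690)])
signals = rng.random((300, 5)) @ signal_bumps
_, _, Vt = np.linalg.svd(signals, full_matrices=False)
B = Vt[:3].T                       # 61 x 3 colour-signal basis

def recover_spectrum(rgb):
    """The spectrum in the span of B that the camera would see as this RGB."""
    w = np.linalg.solve(camera @ B, rgb)
    return B @ w

s = signals[0]                     # a test colour signal
s_hat = recover_spectrum(camera @ s)
tristimulus = cmf @ s_hat          # tristimulus values of the recovered spectrum
print(np.linalg.norm(tristimulus - cmf @ s) / np.linalg.norm(cmf @ s))
```

By construction the recovered spectrum is a metamer of the input as seen by the camera; the quality of the tristimulus conversion then rests on how "natural" the basis-constrained spectrum is, which is why the choice of basis matters.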


## Experiential Reasoning

At the time I developed Whisper in 1976, I encountered extremely skeptical---often downright hostile---audiences and as a result I (mistakenly) published only an expurgated account of my thoughts about analog representations. I am delighted that times have changed a bit and this symposium seems like an appropriate forum for a little speculation and re-evaluation.
