Studies on determining equiluminant colors

Abstract

Traditionally, heterochromatic flicker photometry has been widely accepted as the method of choice for equating the relative luminance of two colors. Recently, a variety of additional methods have been used in studies of the spatial and temporal properties of chromatic mechanisms. Because accurate luminance equation is essential, we have compared results from flicker photometry, minimum apparent motion, and minimum contrast in the same subjects using the same equipment (a computer-generated raster graphics system). Field size (2°) and mean luminance (27.4 cd/m²) were held constant. The red and green guns of a Mitsubishi RGB monitor generated the colors to be compared. As we reported earlier,¹ significant method-dependent differences are found within a given subject; the direction and magnitude of these differences are idiosyncratic. Here we present data showing that some, although not all, of the subject- and method-dependent variation can be attributed to temporal factors. In particular, the difference between the red-green ratios determined by heterochromatic flicker photometry and those determined by minimum-contrast sensitivity measures may be substantially reduced for some subjects by equating temporal frequency for the two methods.

© 1988 Optical Society of America
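
To make the quantities in the abstract concrete, the following is a minimal, illustrative Python sketch (not the authors' code): it predicts the nominal red and green gun drives that would split the 27.4 cd/m² mean luminance equally between the two guns, and defines the red-green luminance ratio that the flicker, motion, and contrast nulls effectively set. The peak gun luminances and the assumption of a linearized (gamma-corrected) display are hypothetical placeholders.

# Minimal illustrative sketch (not from the study): nominal equiluminant gun
# drives and the red-green luminance ratio on a linearized RGB display.
# The peak gun luminances below are hypothetical placeholders.

R_MAX_CD_M2 = 21.0     # assumed peak luminance of the red gun, cd/m^2
G_MAX_CD_M2 = 68.0     # assumed peak luminance of the green gun, cd/m^2
MEAN_LUMINANCE = 27.4  # mean luminance held constant in the study, cd/m^2

def equiluminant_drives(mean_luminance=MEAN_LUMINANCE):
    """Return (red_drive, green_drive) in [0, 1] such that each gun
    contributes half of the target mean luminance (the photometric prediction)."""
    per_gun = mean_luminance / 2.0
    red_drive = per_gun / R_MAX_CD_M2
    green_drive = per_gun / G_MAX_CD_M2
    if red_drive > 1.0 or green_drive > 1.0:
        raise ValueError("target luminance exceeds the gun range")
    return red_drive, green_drive

def red_green_ratio(red_luminance, green_luminance):
    """Red:green luminance ratio, the quantity a subject effectively sets
    when nulling flicker, apparent motion, or contrast between the fields."""
    return red_luminance / green_luminance

if __name__ == "__main__":
    r, g = equiluminant_drives()
    print(f"red drive {r:.3f}, green drive {g:.3f}")
    # A subject's empirical null may deviate from the photometric ratio of 1.0;
    # the abstract reports that the size and direction of such deviations
    # depend on the subject, the method, and (in part) the temporal frequency.
    print("photometric prediction:", red_green_ratio(13.7, 13.7))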
