
Re: grayscale displays and human vision

Posted by Harry Parker on Jul 07, 2006; 8:23pm
URL: http://imagej.273.s1.nabble.com/grayscale-displays-and-human-vision-tp3702214p3702228.html

Hi John,

I also have an interest in "High Dynamic Range" (HDR) and "Extended Dynamic Range" (EDR) imaging.

ImageJ uses lookup tables (LUTs) to map ranges of intensities down to the 8-bit values that common video cards and digital monitors can understand.

You can adjust the LUT with ImageJ's menu commands; that is what happens when you use Image->Adjust->Brightness/Contrast.

That way you can stretch any portion of the 0-4095 input range across the 256-level output range. If you are trying to visualize subtle intensity differences, that is all you need to do.
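
For example, here is a minimal ImageJ macro sketch (the 800 and 2200 limits are just made-up example values) that maps a chosen sub-range of a 12-bit image onto the 8-bit display:

  // Show only the 800-2200 portion of the data:
  // values <= 800 appear black, values >= 2200 appear white.
  setMinAndMax(800, 2200);

  // Converting to 8-bit uses the current display range, which "bakes in"
  // the stretch and discards data, so do it on a duplicate if at all.
  run("8-bit");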

However, displaying an HDR image "accurately" is trickier.

Human vision can detect a brightness difference of about 1-2%, so standard monitors provide a grayscale in which each intensity step is roughly 1% brighter than the previous one. Note that these are equal ratio differences, not equal absolute differences. (On your monitor, the contrast control scales the overall range, and the brightness control sets the black offset.)
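
For a rough feel of the numbers (assuming the 1% figure and a typical monitor contrast ratio of about 100:1, both round numbers), the count of just-noticeable steps from black to white is about

  ln(100) / ln(1.01) ~ 463

so a linear 256-level scale cannot represent them all, which is why the display encoding is made nonlinear.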

Note the difference in response between your camera and your display: while your camera has a linear response to light, your display does not have a linear response to its input signal. Instead, its input is mapped to give a response tuned to human visual difference detection.

So preserving the "correct" visual presentation of intensities from your camera requires an inverse nonlinear correction. The ImageJ menu command Process->Math->Gamma..., set to about 0.45 and applied to your 12-bit image, provides an approximate correction, but doing it accurately requires calibrating your monitor so you can calculate the correct correction factors.
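
As a sketch in the ImageJ macro language (0.45 is just the usual approximation of 1/2.2, and exactly how Gamma normalizes 16-bit data can depend on the current display range, so the range is set first):

  // Work on a copy so the original 12-bit data stay untouched.
  run("Duplicate...", "title=display_copy");

  // Tell ImageJ the data occupy the 12-bit range (0-4095).
  setMinAndMax(0, 4095);

  // Approximate inverse-gamma correction (0.45 ~ 1/2.2); an accurate
  // factor would come from calibrating your monitor.
  run("Gamma...", "value=0.45");

  // Reduce to 8 bits using the current display range.
  run("8-bit");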

Images from standard color digital cameras are already encoded with this nonlinear "Standard RGB" (sRGB) scale, but scientific cameras with extended ranges, such as yours, are not.
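
For reference, sRGB encoding is roughly a 1/2.4 power law with a short linear segment near black. A macro-language sketch of the per-pixel function (x is a linear intensity normalized to 0-1):

  // sRGB encoding of a linear value x in [0,1] (IEC 61966-2-1).
  function linearToSRGB(x) {
      if (x <= 0.0031308)
          return 12.92 * x;
      else
          return 1.055 * pow(x, 1/2.4) - 0.055;
  }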

Here are some links for further info:

Weber's Law of Just Noticeable Difference

Non-Linear Scale-Factor Methods

Gamma FAQ - by Poynton

Monitor calibration & gamma

Is 24 bits enough?

Tone mapping - Wikipedia, the free encyclopedia

Hope this helps.


--  
Harry Parker  
Senior Systems Engineer  
Dialog Imaging Systems, Inc.

----- Original Message ----
From: John Oreopoulos <[hidden email]>
To: [hidden email]
Sent: Thursday, July 6, 2006 2:01:19 PM
Subject: grayscale displays and human vision

Hello,

I am new to Image J and I have a very general question about how  
grayscale images are displayed on a computer monitor.  I use a 12-bit  
monochrome CCD camera to capture fluorescent microscope images and  
save as .tiff files.  When I open my images in ImageJ, the image is  
displayed as an 8-bit image (0-255) on the monitor.  When I hover over a
pixel in the image with the mouse pointer, the 12-bit value (0-4095)  
that was captured by the camera is listed in the ImageJ toolbar.  I  
looked at the ImageJ documentation and read up a little bit on  
digital displays, and it seems that all standard computer monitors  
will display grayscale images in 8-bit only.  Why is this so?  Is it  
because of hardware limitations and costs?  Is it because the human  
eye can only detect 256 discrete shades of gray?  If this is not the  
case, then is there not some loss of information in the visual image  
when it gets displayed at 8-bit?  Am I losing some of the "true"  
contrast when I look at my 12-bit images in ImageJ?
I did a google search on "12-bit grayscale displays" and found some  
sites that sell special X-ray and MRI monitors with 12-bit or even 16-
bit grayscale resolution.  If these kinds of monitors exist, then  
this means the human eye can in fact detect more than 256 shades of
gray, correct?

Thank you in advance for any replies!

John O