Dear all,
Apologies for the long-winded introduction, but I'll get to my point...
Based on the conclusions drawn in Sun et al. 2004 (Autofocusing in
Computer Microscopy: Selecting the Optimal Focus Algorithm, Microscopy
Research and Technique, 65, 139–149), I would like to run the 'Normalized
Variance' algorithm for autofocusing in microscopy.
My interpretation of the algorithm is as follows (please correct me if I
am wrong!):
a = 1 / ([image height] x [image width] x [image grey level mean
intensity])
b = sum over all pixels of ([pixel grey level] - [image grey level mean
intensity])^2
Normalized Variance = a x b
(The best focus is the image with the maximum value; the value decreases
as defocus increases.)
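
To check my understanding, here is a minimal sketch of how I would compute
this in Python/NumPy (the function name and the assumption of a 2-D
greyscale array are mine, not from the paper):

import numpy as np

def normalized_variance(image):
    # Normalized Variance focus measure, as I read Sun et al. 2004:
    # (1 / (height * width * mean)) * sum over pixels of (pixel - mean)^2
    img = np.asarray(image, dtype=np.float64)
    height, width = img.shape
    mean = img.mean()
    return ((img - mean) ** 2).sum() / (height * width * mean)

# e.g. score each slice of a focus stack and keep the maximum
# ('stack', a list of 2-D arrays, is hypothetical):
# scores = [normalized_variance(frame) for frame in stack]
# best_index = int(np.argmax(scores))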
My questions therefore are as follows:
By clicking Analyze > Measure I am given a 'Mean' value. Am I right in
presuming that this is the image's grey level mean intensity when no ROI
is selected? If it is not, how do I find and store this value so that I
can use it to calculate the 'Normalized Variance'?
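
For that second part, something along these lines is what I have in mind,
using ImageJ's Jython scripting (only a guess on my part; I am not certain
that getStatistics() returns the same 'Mean' that Analyze > Measure
reports for the whole image):

from ij import IJ

imp = IJ.getImage()            # the currently active image
stats = imp.getStatistics()    # assumed: whole-image statistics when no ROI is set
mean_intensity = stats.mean    # the grey level mean I want to store and reuse
print(mean_intensity)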
Does anybody have any suggestions regarding a different/better
autofocusing approach?
Thanks, Andy