Hi Andy,
The definition of the (sample) variance is usually 1/(N-1) * SUM (x - mean)^2;
in your case N = width * height.
You can easily get both the variance and the mean from the
image histogram. Note that the histogram is ROI dependent.
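To make the histogram remark concrete, here is a minimal sketch (plain Python, not the ImageJ API) of computing the mean and the 1/(N-1) variance from a grey-level histogram alone, with no second pass over the pixels. The function name and the list-of-counts input format are illustrative assumptions:

```python
def stats_from_histogram(hist):
    """Mean and sample variance from a histogram, where hist[v] is the
    number of pixels with grey level v."""
    n = sum(hist)  # total pixel count (width * height within the ROI)
    mean = sum(v * c for v, c in enumerate(hist)) / n
    # Sample variance with the 1/(N-1) normalisation quoted above.
    var = sum(c * (v - mean) ** 2 for v, c in enumerate(hist)) / (n - 1)
    return mean, var
```

For example, a 2x2 image with pixel values 0, 0, 2, 2 has the histogram [2, 0, 2], giving mean 1.0 and variance 4/3.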
best regards
Dimiter Prodanov
>------------------------------
>
>Date: Wed, 14 Jun 2006 17:03:54 +0200
>From: Andy Weller <
[hidden email]>
>Subject: Autofocusing and image grey level mean
>
>Dear all,
>
>Apologies for the long-winded introduction, but I'll get to my point...
>
>Based on the conclusions drawn from Sun et al. 2004 (Autofocusing in
>Computer Microscopy: Selecting the Optimal Focus Algorithm, MICROSCOPY
>RESEARCH AND TECHNIQUE, 65, 139–149) I would like to run the 'Normalized
>Variance' algorithm for autofocusing in microscopy.
>
>My interpretation of the algorithm is as follows (please tell me if I am
>wrong?!?):
>
>a = 1 / ([image height] x [image width] x [image grey level mean
>intensity])
>
>b = SUM over all pixels of ([pixel grey level] - [image grey level
>mean intensity])^2
>
>Normalized Variance = a x b
>
>(Where the best focus is defined as having the maximum value; this
>decreases as defocus increases.)
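The a x b formula above can be sketched directly (plain Python rather than an ImageJ macro; the function name and the list-of-rows image format are assumptions for illustration):

```python
def normalized_variance(image):
    """Normalised-variance focus measure: sum the squared deviations
    from the mean over every pixel, then scale by 1/(H * W * mean).
    `image` is a list of rows of grey-level values."""
    h = len(image)
    w = len(image[0])
    n = h * w
    mean = sum(sum(row) for row in image) / n
    total = sum((p - mean) ** 2 for row in image for p in row)
    return total / (n * mean)
```

A flat (fully defocused) image scores 0, and higher contrast about the same mean scores higher, which matches the "maximum value at best focus" behaviour described above.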
>
>My questions therefore are as follows:
>
>By clicking Analyze > Measure I am given a 'Mean' value. I presume this
>is the image's grey level mean intensity if no ROI is selected? If it is
>not, how do I find and store this value so I can use it to determine the
>'Normalized Variance'?
>
>Does anybody have any suggestions regarding a different/better
>autofocusing approach?
>
>Thanks, Andy
>
>------------------------------