Hello,
I have a time series of 2D images. The average intensity of each image varies slightly from the others, probably because of small changes in the distance between the microscope's objective and the imaged plane. The end result is that when you run the time series it flickers. Can anyone suggest a good way to normalize the intensities, so that the average intensity throughout the time series is constant?

Thanks,
Kul

Kul Karcz
Corporate Strategic Research
Resource Sciences Laboratory
ExxonMobil Research and Engineering Co.
Dear Kul,
As differences in light intensity during image recording will affect both the background and the peak intensities, I suggest normalizing on the background of all images first, i.e. generating a normalizing factor for each image based on a reference image, so that all images end up with the same background. It probably makes sense to use the image with the highest background as the reference, since then you won't run the risk of generating overexposed images: the normalization factors will all be smaller than 1. The image with the highest dynamic range might also be a good choice if you can determine that easily; this will preserve the full dynamics of your image stack.

Can you do a calibration with, e.g., coloured photobleaching-resistant beads?

Best regards,
Wolfgang
I would argue that in fluorescence images the background might remain constant even if illumination varies. So for fluorescence I would suggest normalizing to the maximum value (if your fluorophore sources don't vary much with time), or perhaps to the mean intensity of a bright but "biologically inert" region of the image.
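A minimal NumPy sketch of this idea: scale each frame so that the mean of a user-chosen "inert" reference region is constant across the stack. The function name, the toy data, and the ROI are illustrative assumptions, not anything specified in the thread:

```python
import numpy as np

def normalize_to_roi(stack, roi, target=None):
    """Scale each frame of `stack` (t, y, x) so that the mean intensity
    inside the boolean mask `roi` is the same in every frame."""
    means = np.array([frame[roi].mean() for frame in stack])
    if target is None:
        target = means.mean()          # common target: average ROI mean
    factors = target / means           # one multiplicative factor per frame
    return stack * factors[:, None, None]

# toy data: the same scene with a slowly drifting overall gain
rng = np.random.default_rng(0)
base = rng.uniform(50.0, 200.0, size=(16, 16))
stack = np.stack([base * g for g in (0.9, 1.0, 1.1)])

roi = np.zeros((16, 16), dtype=bool)
roi[:4, :4] = True                     # "inert" reference corner
out = normalize_to_roi(stack, roi)
# after normalization, the ROI mean is identical in every frame
```

The same factors could equally be derived from the per-frame maximum instead of an ROI mean, as suggested above.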
Bill

William A. Mohler
Dept. Genetics & DevBio
UConn Health Center
p: 860-679-1833
c: 860-985-2719
Dear commentators!
As far as I can read, the original post didn't mention beads or fluorescence. With the information provided by the original poster, I should like to judge the suggested solutions as not being lege artis.

Best
--
Herbie
------------------------
<http://www.gluender.de>
Dear Colleagues,
The beads in my suggestion are external, of course; I didn't expect them to be in the sample already. But some non-biological signal, as William suggested, should do the same job, regardless of whether it originates from fluorescence or not (the suggestion is also independent of the kind of signal you have, fluorescence or not, as long as you have an image stack).

I have to make a correction to my suggestion: choosing the image with the *highest* background as the reference for normalization is not good, as this will result in multiplying images with lower background by a factor greater than 1 and thus (a) increase the overall background and (b) generate overexposed peaks if there are signals near the maximum of the range. The reference should be the frame with the *lowest* background, of course.

Herbie, wouldn't you rather contribute to the discussion in a more constructive way? How would you solve this problem, which actually might be quite a common one?

Best regards and a sunny Sunday,
Wo
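A NumPy sketch of this corrected background-matching scheme. The thread doesn't say how the background is measured, so a low percentile is used here as a crude estimate (my assumption); with the lowest-background frame as reference, every factor is at most 1, so no pixel is pushed past the original maximum:

```python
import numpy as np

def normalize_backgrounds(stack, bg_percentile=5):
    """Scale every frame so its background matches the frame with the
    *lowest* background.  Background is estimated as a low percentile
    of each frame (an assumption, not from the thread).  All factors
    are <= 1, so the normalization cannot overexpose any frame."""
    bgs = np.array([np.percentile(f, bg_percentile) for f in stack])
    ref = bgs.min()                    # lowest-background frame as reference
    factors = ref / bgs                # each factor <= 1
    return stack * factors[:, None, None], factors

# toy stack: same scene recorded with three different gains
rng = np.random.default_rng(1)
base = rng.uniform(10.0, 100.0, size=(8, 8))
stack = np.stack([base * g for g in (1.2, 1.0, 1.05)])
out, factors = normalize_backgrounds(stack)
# factors are all <= 1; the lowest-background frame keeps factor 1.0
```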
Why doesn't whole-image normalization work, or perhaps normalization of the whole image after masking the objects of interest, if they occupy only a small fraction of the field of view? Maybe I'm missing some basic element of your question.

Would it be possible to include a set of reference objects with known transmission ratios/differences? E.g., something like neutral density filters or a gray-scale step object to perform a calibration (as can already be done in ImageJ), which could then be used to normalize. You'd still need to check a few things to validate this (e.g., flatness across the image), but perhaps it could be done. Is this a transmission or a reflectance problem, and what size scale are you on? Maybe multiple layers of some film-like material of known transmission could be used, e.g., Wratten filters.

Bill Christens-Barry
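A hedged NumPy sketch of "normalize the whole image after masking the objects of interest": the per-frame factor is computed only from pixels outside the mask, so changing objects don't bias it. Mask, data, and names are illustrative:

```python
import numpy as np

def normalize_outside_mask(stack, object_mask):
    """Scale each frame to a common mean computed only over pixels
    OUTSIDE `object_mask`, so the (possibly changing) objects of
    interest don't bias the per-frame normalization factor."""
    bg = ~object_mask
    means = np.array([f[bg].mean() for f in stack])
    target = means.mean()
    return stack * (target / means)[:, None, None]

# toy data: small central object, drifting global gain
rng = np.random.default_rng(2)
base = rng.uniform(20.0, 80.0, size=(12, 12))
mask = np.zeros((12, 12), dtype=bool)
mask[4:8, 4:8] = True                  # the (small) object of interest
stack = np.stack([base * g for g in (0.95, 1.0, 1.08)])
out = normalize_outside_mask(stack, mask)
# the mean outside the mask is now the same in every frame
```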
Thank you all for your suggestions. Let me clarify the setup and my question.

The imaging is done in reflectance, so there are no fluorophores in the system; the "filter" is set to capture light from 537 to 547 nm. The laser (a 543 nm HeNe) was tested recently by the vendor (Leica) and is stable.

The sample is a CaCO3 crystal, and we are interested in how it deforms under an applied load over time (>10 days). We image every 10 minutes. Embedding calibration beads is not easy because of the unique setup.

My question was about how to use ImageJ to normalize the intensities of a set of images, i.e., right now I am mainly interested in how this can be done using the software.

Thanks again,
Kul

Kul Karcz
Corporate Strategic Research
ExxonMobil Research and Engineering Co.
My two cents:
First, find out the type of the intensity change:

- Only the maximum is changing: cut intensities down to a common preselected value.
- An intensity shift: estimate the shift, possibly from the mode or maximum of the smoothed image, and correct it.
- A linear transformation: find the source, since that is not normal. Probably your camera is automatically changing parameters; throw it away and get a controllable one.

If the data are unique and precious, follow this recipe sketch:

1. Decide on a normal mean and a span in multiples of the SD for display (mr, sdr, xr).
2. Smooth each image, then calculate the mean and variance/SD per image from the smoothed version (me, sde); reflectance often delivers strong intensity changes.
3. Scale (linearly) each image to the normal mean and span selected in step 1, using the calculated mean/SD.

Regards
Karsten
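Karsten's recipe can be sketched in NumPy (as an alternative to doing it inside ImageJ). Here `box_smooth` is a crude 3x3 stand-in for a proper smoothing filter, and the target mean/SD (`mr`, `sdr`) are arbitrary display choices, my assumptions rather than values from the thread:

```python
import numpy as np

def box_smooth(img):
    # crude 3x3 box filter with edge padding, standing in for a real
    # smoothing step (e.g. ImageJ's Process > Smooth)
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def normalize_mean_sd(stack, mr=128.0, sdr=20.0):
    """Linearly rescale each frame so the mean/SD of its SMOOTHED
    version hit the chosen display target (mr, sdr)."""
    out = np.empty(stack.shape, dtype=float)
    for k, frame in enumerate(stack):
        sm = box_smooth(frame)
        me, sde = sm.mean(), sm.std()       # per-frame stats from smoothed image
        out[k] = (frame - me) / sde * sdr + mr
    return out

# toy reflectance-like stack with gain drift and offset shifts
rng = np.random.default_rng(3)
base = rng.uniform(0.0, 255.0, size=(10, 10))
stack = np.stack([base * g + s for g, s in ((1.0, 0.0), (1.1, 5.0), (0.9, -3.0))])
out = normalize_mean_sd(stack)
```

Because the rescaling is linear and the box filter is linear too, the smoothed version of every output frame has exactly the target mean and SD.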
Dear Experts,
I noticed a somewhat unusual behaviour of the densitometry function (Ctrl 1/2/3). In the density plot, the range used to display the curves seems to depend on the minimum and maximum values of the measured area.

I noticed this when evaluating a scanned grey wedge in order to calibrate my scanner and find optimal settings for scanning: when I select my ROIs somewhere in the medium densities of the wedge, the density lines still range from min to max in the density plot. When I select the whole wedge as the ROI, I can see a nice set of steps, but still with the plot's y-axis range adapted to the selected range of the wedge, even though it does not use the whole range of 256 grey values.

As long as I only compare data from a single scan and have nice peaks with a common baseline, none of that matters much. But as soon as one wants to compare different scans, this issue imposes a huge problem on data evaluation, as the areas from different densitometries will involve an unknown factor due to the different spread of the y axis in the density plot. Calibrating the image before densitometry does not seem to change anything.

Is there a possibility to force the y axis in the density plots to a range from 0 to 255 grey values, or to an OD range?

Thanks for your help!

Wolfgang
Wolfgang,
The standard plots are scaled to the min/max of your selection. You can have a look at the following macro, which uses a single plot panel for all profiles with a common y-axis scale:

http://rsbweb.nih.gov/ij/macros/SinglePanelGelAnalyzer.txt

If you would consider writing a custom macro, investigate the macro Plot.* functions. You'll probably find the Plot.setLimits() function useful if you want to specify x- or y-axis ranges.

Sincerely,
Jerome