I have a stack of hundreds of images (x, y, time) acquired from a microscope. Ideally there should only be three levels of contrast: hi, mid, and lo (or -1, 0, 1, or whatever labels you prefer).
I wrote an app to track how each pixel changes over time between the three states, but the contrast levels drift from frame to frame (and sometimes within a single frame), so I can't set a fixed threshold (e.g. less than 80 is lo, greater than 120 is hi, and everything in between is mid).
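For concreteness, this is roughly what my current classification looks like (a minimal sketch, assuming the stack is a NumPy array; the 80/120 cutoffs are the example values above):

```python
import numpy as np

# Fixed cutoffs -- this is what breaks, because the contrast
# levels drift between (and sometimes within) frames.
LO_MAX, HI_MIN = 80, 120

def classify_fixed(stack):
    """Map each pixel of an (x, y, time) stack to -1/0/1."""
    states = np.zeros(stack.shape, dtype=np.int8)  # mid = 0 by default
    states[stack < LO_MAX] = -1  # lo
    states[stack > HI_MIN] = 1   # hi
    return states
```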
If I look at a histogram for any given slice, I can clearly see three Gaussian-like peaks, but the positions and shapes of those peaks change every frame.
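To illustrate what I mean by three peaks, I can recover their positions in a single frame by fitting a three-component Gaussian mixture (a sketch, assuming scikit-learn is available; `frame` is one 2-D slice of the stack):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def peak_positions(frame, n_states=3):
    """Fit a Gaussian mixture to one frame's intensities and
    return the component means, sorted as (lo, mid, hi)."""
    pixels = frame.reshape(-1, 1).astype(np.float64)
    gmm = GaussianMixture(n_components=n_states, random_state=0)
    gmm.fit(pixels)
    return np.sort(gmm.means_.ravel())
```

The trouble is that these means (and the peak widths) come out different for every frame, so I can't reuse them across the stack.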
Knowing that there should only be three states, how can I best normalize the stack so that I can reliably track each pixel's state over time?
I attached an image showing two example frames and their corresponding histograms.
tri-level.tif