I mostly work with micro-CT images, so the min/max contrast setting is very important: it defines the range of interest when applying LUTs and exporting PNGs for use in reports.
I have run into the following issue: the run("Scale...", ...) command does not retain the min/max contrast setting for 8-bit stacks when the Z dimension is changed. It works fine for 16-bit stacks, and also when the Z scale is kept at 1.0. The problem occurs both in macros and when Scale... is invoked from the UI.
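The loss is easy to confirm from a macro: reading the display range back after scaling (a quick check using the standard getMinAndMax() macro function) no longer reports the values I set; the range appears reset.

//read back the display range of the scaled stack
getMinAndMax(min, max);
print("display range after scaling: " + min + " - " + max);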
Here is a macro that reproduces the problem:
//first scale a 16-bit image
run("T1 Head (2.4M, 16-bits)");
setMinAndMax(0, 256);
run("Scale...", "x=.5 y=.5 z=.5 width=128 height=128 depth=64 interpolation=Bilinear average process create title=t1-head-16bit-scaled.tif");
setSlice(32);
//result is ok
//then scale an 8-bit image in XY, keeping the Z dimension
run("T1 Head (2.4M, 16-bits)");
setSlice(64);
run("8-bit");
setMinAndMax(0, 128);
run("Scale...", "x=.5 y=.5 z=1.0 width=128 height=128 depth=129 interpolation=Bilinear average process create title=t1-head-8bit-xy0.5-z1.0.tif");
setSlice(64);
//result is ok
//last, scale an 8-bit image in all XYZ dimensions
run("T1 Head (2.4M, 16-bits)");
setSlice(64);
run("8-bit");
setMinAndMax(0, 128);
run("Scale...", "x=.5 y=.5 z=.5 width=128 height=128 depth=64 interpolation=Bilinear average process create title=t1-head-8bit-xyz0.5.tif");
setSlice(32);
//result is wrong: the min/max setting is lost
run("Tile");
Stein
--
ImageJ mailing list:
http://imagej.nih.gov/ij/list.html