Dear list,
I am working on a dual camera spinning disk system and would like to use ImageJ to correct for chromatic shift between the two channels (the two cameras). I have data from fluorescent beads and would like to use this data to calculate the shifts (and potential warping) between the channels and apply the necessary transformations to my data (dual channel, z-stack, time course datasets). Is there a plugin that can do this?

I have read the entry on imagej.net ( http://imagej.net/Chromatic_shift_origins_measurement_and_correction ), but I have not found an automated way to get ImageJ to use the bead data to apply corrections to my actual data. How are you handling this type of problem (having to correct for chromatic shifts between channels)?

Best,

Rafael Buono
Hi Rafael,
we developed such a method recently and I made a slightly premature screen capture showing how to do it. Instead of polishing it further, we decided to upload the screencast to YouTube https://www.youtube.com/watch?v=lPt-WQuniUs and link the relevant scripts and examples on GitHub https://github.com/saalfeldlab/confocal-lens/tree/master/scripts

You will need a dense, textured sample that looks approximately the same over the entire spectrum, and you will need to image it as a 4x4 mosaic of ~60% overlapping tiles. We used beads.

Best,

Stephan
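As a rough illustration of what "~60% overlapping tiles" means for planning the acquisition, the BeanShell snippet below (runnable in Fiji's script editor) prints stage offsets for a 4x4 mosaic. It is simple arithmetic only, not part of the confocal-lens scripts, and the 100 um tile size is a made-up example value: use the actual field of view of your camera.

// Illustrative arithmetic only: stage positions for a 4x4 mosaic
// of tiles that overlap their neighbors by ~60%.
double tileSize = 100.0;  // hypothetical field of view per tile, in um
double overlap  = 0.6;    // ~60% overlap between neighboring tiles
double step     = tileSize * (1.0 - overlap);  // stage step, 40 um in this example

for (int row = 0; row < 4; row++) {
    for (int col = 0; col < 4; col++) {
        print("tile (" + row + "," + col + "): x = " + (col * step) + " um, y = " + (row * step) + " um");
    }
}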
Hi Stephan,
Thank you for the prompt answer, the helpful video, and the cool scripts. I will give it a shot with the next wave of data. A couple of questions:

- Does it need to be done on 4x4 tiles, or is that only necessary if I am going to be doing tiling with my data?
- How do I apply the transforms obtained from the bead data to my actual data?

Best,

Rafael
Hi Rafael,
The 4x4 tiles are required to calculate the distortion, so the answer is yes, you will need to do this. But you then have a calibration for this particular setup of your microscope that can be re-used for all subsequent acquisitions, tiles or not.

Documentation is sparse to the level that it does not exist, so let's start here. The following is an e-mail that I sent out to the Fly Light project scientists who are testing this:

After you have followed the instructions from the video, check this place again https://github.com/saalfeldlab/confocal-lens/tree/master/scripts and open the script export-transforms.bsh in Fiji's script editor. Modify https://github.com/saalfeldlab/confocal-lens/blob/master/scripts/export-transforms.bsh#L13-L21 according to your setup and hit Run. It should print the estimated transformations into the log window, and you can save this as a JSON file (a text file).

Now drop the file apply-lens.bsh into the plugins directory of your Fiji installation. After a restart of Fiji, you should find a menu entry for this script in the Plugins menu. Click it. If your Fiji is not too old, it should open a dialog that asks you for two input image paths (expected are 1. an LSM file with two channels (488 and 594 nm) and 2. an LSM file with three channels (488, 561, and 647 nm)), the respective JSON file for the scope that was used, and an output directory to save the resulting multi-channel file. The output will have all five (or however many) channels in a single file. If your acquisition procedure is different, change the script.

The script currently assumes that the lens models in the JSON file and the channels from the input images are in the same order (in this case 488, 594, 488, 561, 647 nm). You can do subsets or other combinations, but you would have to edit the JSON file accordingly. The JSON files are simple to read, check https://github.com/saalfeldlab/confocal-lens/blob/master/scripts/scope1.json and they contain name tags for each model. That should make editing them easier. The script ignores the name tags and goes by the order only.

Attached (multi-lens-scop1-example.png) is an example showing the top left 200 px corner of all five channels of one stack of the scope 1 calibration sample before and after correction.

The script should be macro-recordable.
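Because the script is macro-recordable, the correction can be batched over many acquisitions from a short BeanShell script. The sketch below is a hypothetical illustration under stated assumptions, not the actual interface of apply-lens.bsh: the command label "Apply Lens" and the parameter keys input1, input2, json, and output are placeholders, so record one interactive run of apply-lens.bsh with ImageJ's Macro Recorder and substitute the exact strings it prints. Only IJ.run() itself is standard ImageJ API.

import ij.IJ;

// Assumed inputs: the JSON file exported with export-transforms.bsh and an
// output directory for the corrected multi-channel files.
String json   = "/path/to/scope1.json";
String outDir = "/path/to/corrected/";

// Pairs of acquisitions: a two-channel LSM and a three-channel LSM per sample.
String[][] pairs = new String[][] {
    { "/data/sample1_488_594.lsm", "/data/sample1_488_561_647.lsm" },
    { "/data/sample2_488_594.lsm", "/data/sample2_488_561_647.lsm" }
};

for (int i = 0; i < pairs.length; i++) {
    // Command label and parameter keys are made-up placeholders; take the
    // real ones from the Macro Recorder after running apply-lens.bsh once.
    IJ.run("Apply Lens",
        "input1=[" + pairs[i][0] + "]" +
        " input2=[" + pairs[i][1] + "]" +
        " json=[" + json + "]" +
        " output=[" + outDir + "]");
}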