The granulometric filtering plugin uses a lot of memory: I want to analyse a large montage image, and reduce crashes due to the memory filling up by automatically collecting garbage during the plugin analysis. Any ideas?
Hi Teresa,
> The granulometric filtering plugin uses a lot of memory: I want to
> analyse a large montage image, and reduce crashes due to the memory
> filling up by automatically collecting garbage during the plugin
> analysis. Any ideas?

Java always automatically calls the garbage collector when the heap is getting full. [1] So while you could, e.g., write a little plugin that repeatedly calls System.gc() in a loop, it would very likely not solve your fundamental problem, which is that Java *is* actually running out of memory. (The only exception to this is a rare case where Java decides that garbage collection is happening too slowly, in which case you should see the message "GC overhead limit exceeded" [2].)

The first thing to do is make sure that ImageJ has a large enough "maximum heap" size:

* Edit > Options > Memory & Threads
* Change "Maximum Memory" to something larger (at most, 1000 MB less than your computer's total RAM).

However, if you run Fiji, it will have already tried to guess a good value (something like 2/3 of available RAM, IIRC). If you are already at the limits of your computer's physical memory, the next step would be to add more.

Alternately, to address things on the "demand" side rather than the "supply" side, the granulometric filtering plugin could be improved to be more space efficient. Algorithmically, there is often a tradeoff between space complexity and time complexity (i.e., use more memory, or take more time to execute?).

Presumably, the plugin you are talking about is this one:
http://rsbweb.nih.gov/ij/plugins/gran-filter.html

If so, I have CCed the author in case he has any additional comments.
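The point above, that System.gc() is only a request and that Java collects on its own as the heap fills, can be seen with a minimal standalone Java sketch (not part of any ImageJ plugin; the class and method names here are purely illustrative):

```java
// Sketch: report JVM heap usage before and after an explicit GC request.
// System.gc() is a hint to the JVM, not a command; the JVM normally
// collects on its own when the heap approaches its maximum size.
public class HeapReport {
    /** Currently used heap, in megabytes. */
    static long usedMB() {
        Runtime rt = Runtime.getRuntime();
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    }

    public static void main(String[] args) {
        // maxMemory() reflects the -Xmx / "Maximum Memory" setting.
        System.out.println("Max heap (MB):  "
                + Runtime.getRuntime().maxMemory() / (1024 * 1024));
        System.out.println("Used before GC: " + usedMB());
        System.gc(); // request a collection; the JVM may or may not comply
        System.out.println("Used after GC:  " + usedMB());
    }
}
```

Repeatedly calling System.gc() in a loop only reclaims objects that are already unreachable; it cannot free images the plugin still holds references to, which is why raising the maximum heap is the more reliable fix.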
Regards,
Curtis

[1] http://stackoverflow.com/questions/8719071
[2] http://www.petefreitag.com/item/746.cfm

On Wed, Apr 24, 2013 at 2:25 AM, Teresa W <[hidden email]> wrote:
> The granulometric filtering plugin uses a lot of memory: I want to
> analyse a large montage image, and reduce crashes due to the memory
> filling up by automatically collecting garbage during the plugin
> analysis. Any ideas?

--
ImageJ mailing list: http://imagej.nih.gov/ij/list.html
Thanks Curtis,
I asked because I noticed that manually running Collect Garbage when the Gran Filter function called up a new image (yes, it is Dimiter Prodanov's function, as you noted) definitely halted a rapid fill-up of my memory, so that I could run the function for longer without crashing. Please note that I am programming-deficient and use ImageJ from the interface only. As a result, some of my fixes are probably not what happens in batch mode; plus, I am running Gran_Filt on a rather large montage image.

To mention also, if Dimiter is reading:

a. The function is really super, BUT it took me about a week to work out how to get sized data back from the output images. I did this by running Analyse Particles from large to small, and subtracting the summed masks of the previous Analyse Particles passes from the next image to be counted. I also set the Analyse Particles size limits at the low end for each filter size, to remove a lot of small particles that somehow appeared in the Gran_Filt output images. Suggestions for post-processing in the .html would help simpletons like me save A LOT of time.

b. I have no idea what the density function graph generated by Gran_Filt refers to. Is it some sort of variance analysis? I feel as if I should understand it from the .html, but....

Teresa
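The large-to-small mask subtraction described above can be sketched in plain Java (no ImageJ dependency; the class and method names are hypothetical and the masks are modeled as boolean arrays purely for illustration):

```java
// Hypothetical sketch of the workflow described in the thread: when counting
// particles from the largest size class downward, remove every pixel already
// counted in earlier (larger) classes before counting the next class.
public class MaskSubtraction {
    /** Returns currentMask with every pixel already in unionOfPrevious removed. */
    static boolean[] exclusiveMask(boolean[] currentMask, boolean[] unionOfPrevious) {
        boolean[] out = new boolean[currentMask.length];
        for (int i = 0; i < currentMask.length; i++) {
            out[i] = currentMask[i] && !unionOfPrevious[i];
        }
        return out;
    }

    /** Accumulates a counted mask into the running union of counted pixels. */
    static void accumulate(boolean[] union, boolean[] mask) {
        for (int i = 0; i < mask.length; i++) {
            union[i] |= mask[i];
        }
    }

    public static void main(String[] args) {
        boolean[] large = {true, true, false, false}; // mask from the largest size class
        boolean[] small = {true, false, true, false}; // mask from a smaller size class
        boolean[] union = new boolean[4];
        accumulate(union, large);                     // large particles counted first
        boolean[] onlySmall = exclusiveMask(small, union);
        // Only index 2 survives: index 0 was already counted in the large class.
        System.out.println(java.util.Arrays.toString(onlySmall));
        // prints [false, false, true, false]
    }
}
```

In the actual ImageJ workflow the same subtraction would be done on binary mask images (e.g. via the Image Calculator), but the bookkeeping, a running union of already-counted pixels subtracted from each successive size class, is the same.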