
Re: Memory issues with gigapixel files

Posted by Peter Haub on Jan 25, 2011; 6:56am
URL: http://imagej.273.s1.nabble.com/Memory-issues-with-gigapixel-files-tp3685899p3685905.html

Hi Rutger,
in case your gigapixel image consists of a number of separate TIFF
images, can you run the different steps of your analysis separately to
locate the step that causes the problem?
Or is your gigapixel image stored in a single TIFF file? What is the
file format, and how do you open it?
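If you can split it up, one way to find the failing step is to script each stage separately, saving the intermediate result to disk and closing everything before the next stage, so that only one image is in memory at a time. A rough macro sketch (the file paths are only placeholders, please adapt):

// Step 1: open, convert to 8-bit, save the intermediate, free memory
open("/path/to/section.tif");
run("8-bit");
saveAs("Tiff", "/path/to/section_8bit.tif");
close();
run("Collect Garbage");   // Plugins > Utilities > Collect Garbage

// Step 2: reopen the intermediate and run the next stage on it
open("/path/to/section_8bit.tif");
// ... nucleus counting or colour deconvolution here ...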
Regards,
Peter

On 24.01.2011 12:48, Rutger Wierda wrote:

> Currently I'm trying to analyse IHC tissue sections with ImageJ, but I'm constantly running into memory problems. Basically I want to know the percentage of positively stained nuclei in a Hematoxylin and DAB staining. For that purpose I count the total number of nuclei by converting to 8-bit (grey scale) and running the nucleus counter plug-in. Thereafter I count the number of DAB-stained nuclei. To do that, I run the colour deconvolution plug-in with the supplied H DAB vectors and run the nucleus counter on the resulting brown image. This is all performed on a 24" iMac with 4 GB RAM and a 3.06 GHz dual-core Intel processor.
>
> The problem is that the image files are so large (~1+ GB TIFF files) that ImageJ constantly runs out of memory, even when I've set the maximum memory to 75% of my RAM. I've since set the maximum memory for ImageJ to 100% of my RAM, which didn't resolve the problem. Eventually I ended up setting the maximum memory to 20 GB, which of course results in massive swapping. Twenty gigabytes of RAM will never fit in my machine, so I'm looking for a more elegant solution to this problem.
>
> Does anyone have suggestions for a better strategy/algorithm, or another solution to the memory problems?
>
> Kind regards,
>
> Rutger Wierda,
> Leiden University Medical Centre,
> The Netherlands.
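P.S. For reference, the workflow described above could be recorded as a macro roughly along these lines. This is only a sketch: I don't know the exact recordable name and options of the nucleus counter plug-in, so the counting step uses a plain threshold plus Analyze Particles instead, and the output window of the Colour Deconvolution plug-in may be named differently on your installation.

open("/path/to/section.tif");
title = getTitle();

// separate the Hematoxylin and DAB components
run("Colour Deconvolution", "vectors=[H DAB]");

// select the DAB (brown) component; the "-(Colour_2)" suffix is an
// assumption about how the plug-in names its output windows
selectWindow(title + "-(Colour_2)");

// count the stained nuclei; Analyze Particles stands in here for the
// nucleus counter plug-in
setAutoThreshold();
run("Convert to Mask");
run("Analyze Particles...", "size=50-Infinity show=Nothing summarize");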