
Memory issues with gigapixel files

Posted by Rutger Wierda on Jan 24, 2011; 11:48am
URL: http://imagej.273.s1.nabble.com/Memory-issues-with-gigapixel-files-tp3685899.html

Currently I'm trying to analyse IHC tissue sections with ImageJ, but I'm constantly running into memory problems. Basically, I want to know the percentage of positively stained nuclei in a Hematoxylin and DAB staining. For that purpose I first count the total number of nuclei by converting the image to 8-bit (grey scale) and running the Nucleus Counter plug-in. Then I count the number of DAB-stained nuclei: I run the Colour Deconvolution plug-in with the supplied H DAB vectors and run the Nucleus Counter on the resulting brown image. This is all performed on a 24" iMac with 4 GB RAM and a 3.06 GHz dual-core Intel processor.
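For reference, the pipeline I have in mind is roughly the following. This is only a sketch in Python using scikit-image (`rgb2hed` for the H-DAB separation, Otsu thresholding plus connected-component labelling as a crude stand-in for the Nucleus Counter plug-in — these are my assumptions, not the plug-in's actual algorithm):

```python
import numpy as np
from skimage.color import rgb2hed
from skimage.filters import threshold_otsu
from skimage.measure import label

def count_nuclei(channel):
    """Crude nucleus count: Otsu threshold, then count connected blobs.
    (A stand-in for the Nucleus Counter plug-in, not the plug-in itself.)"""
    mask = channel > threshold_otsu(channel)
    return int(label(mask).max())

def dab_positive_fraction(rgb):
    """Separate stains by colour deconvolution, then count nuclei per channel."""
    hed = rgb2hed(rgb)                 # H-DAB stain separation (optical density)
    total = count_nuclei(hed[..., 0])  # haematoxylin channel: all nuclei
    dab = count_nuclei(hed[..., 2])    # DAB channel: positively stained nuclei
    return dab / max(total, 1)
```

The ratio of the two counts is the percentage I'm after.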

The problem is that the image files are so large (TIFF files of 1 GB and more) that ImageJ constantly runs out of memory, even with the maximum memory set to 75% of my RAM. I then set ImageJ's maximum memory to 100% of RAM, which didn't resolve the problem either. Eventually I ended up setting the maximum to 20 GB, which of course results in massive swapping. Twenty gigabytes of RAM will never fit in this machine, so I'm looking for a more elegant solution.
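One direction I've been wondering about is processing the image tile by tile instead of holding it in RAM all at once. A minimal sketch of the idea (in Python; the per-tile analysis here is a trivial pixel count just to show the pattern — counting nuclei per tile would additionally need overlapping tiles so nuclei on tile borders aren't split or double-counted):

```python
import numpy as np

def iter_tiles(shape, tile=2048):
    """Yield (row, col) slice pairs covering a 2-D image in tile x tile blocks."""
    h, w = shape[:2]
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            yield slice(r, min(r + tile, h)), slice(c, min(c + tile, w))

def count_dark_pixels_tiled(img, thresh=128, tile=2048):
    """Accumulate a per-tile statistic instead of loading the full image.
    `img` could be a numpy.memmap over the TIFF on disk, so only one
    tile is resident in memory at a time."""
    total = 0
    for rs, cs in iter_tiles(img.shape, tile):
        total += int((img[rs, cs] < thresh).sum())
    return total
```

The result is identical regardless of the tile size chosen, so the tile size can be picked purely to fit the available RAM.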

Does anyone have suggestions for a better strategy/algorithm, or some other way around these memory problems?

Kind regards,

Rutger Wierda,
Leiden University Medical Centre,
The Netherlands.