Memory issues with gigapixel files


Memory issues with gigapixel files

Rutger Wierda
Currently I'm trying to analyse IHC tissue sections with ImageJ, but I'm constantly running into memory problems. Basically I want to know the percentage of positively stained nuclei in a hematoxylin and DAB staining. For that purpose I count the total number of nuclei by converting to 8-bit (greyscale) and running the nucleus counter plug-in. Thereafter I count the number of DAB-stained nuclei: I run the colour deconvolution plug-in with the supplied H DAB vectors and run the nucleus counter on the resulting brown (DAB) image. This is all performed on a 24" iMac with 4 GB RAM and a 3.06 GHz dual-core Intel processor.
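
For reference, the workflow above roughly corresponds to the Jython sketch below (run from ImageJ's script interpreter). The exact plugin command and option strings for Colour Deconvolution and the Nucleus Counter are assumptions on my part and would need to be checked with the macro recorder; paths are placeholders.

from ij import IJ
from ij.plugin import Duplicator

imp = IJ.openImage("/path/to/section.tif")

# total nuclei: count on an 8-bit greyscale copy
total = Duplicator().run(imp)
IJ.run(total, "8-bit", "")
# command/option strings below are guesses; record the real ones
IJ.run(total, "Nucleus Counter", "threshold=[Otsu 3x3] watershed")

# DAB-positive nuclei: deconvolve with the built-in H DAB vectors and
# count on the brown (DAB) channel image
IJ.run(imp, "Colour Deconvolution", "vectors=[H DAB]")  # option string is a guess
dab = IJ.getImage()  # assumes the DAB channel is the active image afterwards
IJ.run(dab, "Nucleus Counter", "threshold=[Otsu 3x3] watershed")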

The problem is that the image files are so large (~1+ GB TIFF files) that ImageJ constantly runs out of memory, even when I've set the maximum memory to 75% of my RAM. I then set the maximum memory for ImageJ to 100% of my RAM, which didn't resolve the problem either. Eventually I ended up setting the maximum memory to 20 GB, which of course results in massive swapping to disk. Twenty gigabytes of RAM will never fit in my machine, so I'm looking for a more elegant solution to this problem.

Does anyone have suggestions for a better strategy/algorithm, or another solution to the memory problems?

Kind regards,

Rutger Wierda,
Leiden University Medical Centre,
The Netherlands.

Re: Memory issues with gigapixel files

Peter Haub
Hi Rutger,
In case your gigapixel image consists of a number of separate TIFF
images, can you run the different steps of your analysis separately to
locate the step causing the problem?
Or is your gigapixel image stored in a single TIFF file? What is the
file format? How do you open it?
Regards,
Peter


Re: Memory issues with gigapixel files

Rutger Wierda
In reply to this post by Rutger Wierda
Dear Peter,

I'm not entirely sure I understand what extra information you want from me, but I'll give it a try:

My TIFFs are originally exported from the microscopy software as 'tiled TIFFs'. However, ImageJ can't cope with tiled TIFF files, so I open them in Photoshop CS5 and save them as regular 8-bit TIFF files with LZW compression (interleaved pixel order, IBM PC byte order). The resulting files are opened in ImageJ (using File > Open). So far this doesn't cause any problems. Then, when I run the Colour Deconvolution plugin (downloaded from the ImageJ website), the problems start. When I run the Nucleus Counter plugin it seems that even more memory is needed. (In the Nucleus Counter, also downloaded from ImageJ's plugin section, I select Otsu 3x3 as the threshold method and tick the watershed filter.)

Maybe I could run the analysis on the separate tiles of the tiled TIFF; however, I have no clue how to extract the tiles from the TIFFs and run them through ImageJ.

I hope this gives you the information you need,
Kind regards,

Rutger

Re: Memory issues with gigapixel files

Gabriel Landini

The colour deconvolution plugin creates three additional 8-bit images. So you
need all that extra space in memory, as well as the original RGB.
Whether you LZW-compress the files makes no difference, because the data
has to be decompressed to be displayed.
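
As a rough back-of-envelope (assuming ImageJ holds the RGB image as packed
ints, 4 bytes per pixel, and that colour deconvolution adds three 8-bit
images of 1 byte per pixel each; numbers are illustrative, not measured):

def min_heap_gb(width_px, height_px):
    pixels = width_px * height_px
    rgb = 4 * pixels          # original RGB image in memory
    deconvolved = 3 * pixels  # three 8-bit deconvolution outputs
    return (rgb + deconvolved) / 1e9

# e.g. a hypothetical 40000 x 30000 pixel slide scan:
print(min_heap_gb(40000, 30000))  # about 8.4 GB before any working copies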

The only options I see are:

If you are able to run the plugin to completion, then save the resulting
images, close them, and work from there without the original (see the rough
sketch after this list).

Cut your image into 4 (or more) tiles and analyse the tiles individually.

Use smaller images (resize them). If you are only counting nuclei, maybe you
can still do that with smaller images.

Buy more RAM. You can't expect to do things comfortably without the memory
needed to hold your data in the first place.
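
For the first option, a rough Jython sketch (the plugin command and option
strings are assumptions; check them with the macro recorder, and the paths
are placeholders):

from ij import IJ, WindowManager

imp = IJ.openImage("/path/to/section.tif")
IJ.run(imp, "Colour Deconvolution", "vectors=[H DAB]")  # option string is a guess
imp.close()  # drop the big RGB original as soon as possible

# save and close every remaining (deconvolved) image, then work from the
# saved 8-bit files one at a time
for image_id in WindowManager.getIDList():
    ch = WindowManager.getImage(image_id)
    IJ.save(ch, "/path/to/out/%s.tif" % ch.getTitle())
    ch.close()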

Cheers
G.

Re: Memory issues with gigapixel files

Peter Haub
In reply to this post by Rutger Wierda
As Gabriel suggests,
either cut your image into smaller ones by copying a ROI and pasting it into
a new, smaller image (you cannot directly extract the TIFF-internal tile
structure into separate images; a rough script sketch follows below),
or resize your image,
or use a 64-bit OS and more memory,
or check whether your microscopy software can save the image as several
smaller images instead of one gigapixel image.
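
A rough Jython sketch of that ROI-copy approach (run from ImageJ's script
interpreter; tile size and paths are placeholders, and it assumes the
original still opens on its own, as it does in your case):

from ij import IJ
from ij.gui import Roi
from ij.plugin import Duplicator

imp = IJ.openImage("/path/to/section.tif")
tile_w, tile_h = 4096, 4096

for y in range(0, imp.getHeight(), tile_h):
    for x in range(0, imp.getWidth(), tile_w):
        w = min(tile_w, imp.getWidth() - x)
        h = min(tile_h, imp.getHeight() - y)
        imp.setRoi(Roi(x, y, w, h))
        tile = Duplicator().run(imp)  # copies just the ROI into a new image
        IJ.save(tile, "/path/to/tiles/tile_%d_%d.tif" % (x, y))
        tile.close()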

  Peter


Re: Memory issues with gigapixel files

Rutger Wierda
In reply to this post by Rutger Wierda
Dear all,

Thanks for all the replies. I thought I would check for other solutions to my problem before buying new memory or a new computer for this project. I'll try the cutting-into-smaller-pieces solution (and buy some RAM, of course). Using ImageMagick I can probably run this as a batch for all the IHC images.
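
Something along these lines, perhaps (a sketch only; it assumes ImageMagick's
"convert" is on the PATH, and the tile size and paths are placeholders):

import glob
import subprocess

for path in glob.glob("/path/to/ihc/*.tif"):
    out = path.replace(".tif", "_tile_%d.tif")
    # "-crop 4096x4096 +repage" splits the image into a grid of tiles
    subprocess.check_call(["convert", path, "-crop", "4096x4096",
                           "+repage", out])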

Regards,
Rutger

Re: Memory issues with gigapixel files

Prodanov Dimiter
In reply to this post by Rutger Wierda
Hi
Why don't you resample your files to make them more manageable?
Best regards,

Dimiter