Flush/clear memory in ImageJ/Fiji

Flush/clear memory in ImageJ/Fiji

mjlm
Hi there,
Is there a way (e.g., a macro command) to clear out unused but still-allocated memory in ImageJ? (I'm using Fiji under Windows 7, 64-bit.) When I open a large file and then close it, the memory used for that file is sometimes not released. When I then try to open another large file, I get a warning of insufficient memory.

I've tried to trigger Java garbage collection with call("java.lang.System.gc"), but it didn't have any effect. My problem usually occurs when opening large files using the LOCI tools.

Thanks,
Matthias

Re: Flush/clear memory in ImageJ/Fiji

Albert Cardona-2


Matthias,

Launch the JVM with different flags. Fine-tuning the JVM is an art all by itself. Here are the parameters that worked for me, along with some explanations:

http://fiji.sc/wiki/index.php/TrakEM2#Running_fiji_for_heavy-duty.2C_memory-intensive.2C_high-performance_TrakEM2_tasks
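
For illustration only (these values are placeholders, not necessarily the exact flags from that page; -Xmx sets the maximum heap and -Xms the starting heap):

java -Xms512m -Xmx4g -XX:+UseConcMarkSweepGC -jar ij.jar

Fiji users can typically pass the same JVM flags through the Fiji launcher rather than invoking java directly.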


Of course, the cheapest of all solutions is to buy more RAM.

Albert

--
http://albert.rierol.net
http://www.ini.uzh.ch/~acardona/

Labeling Stacks

Herbert M. Geller
In reply to this post by mjlm
Hi all,

I am trying to label a stack using the Stack Label command, with the
"Use Text Tool" font.  I selected the Symbol font in the Text Tool
widget, but all I get is white boxes where the symbols should be.

Any help is appreciated.

Herb

--
--------------------------------------
Herbert M. Geller, Ph.D.
Developmental Neurobiology Section
National Heart Lung and Blood Institute, NIH
10 Center Drive MSC 1754
Bldg 10, Room 6D18
Bethesda, MD  20892-1754
Tel: 301-451-9440; Fax: 301-594-8133
e-mail: [hidden email]
Web: http://dir.nhlbi.nih.gov/labs/ldn/index.asp
---------------------------------------

Re: Flush/clear memory in ImageJ/Fiji

ctrueden
In reply to this post by mjlm
Hi Matthias,

> Is there a way (e.g. a macro command) to clear out unused, but still
> allocated memory in ImageJ (I'm using Fiji under Win7 64bit)?

Clicking on the ImageJ status bar triggers a garbage collection, similar to the command you mention. You can try clicking two or three times and see whether the reported memory usage drops.
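
The same check can be scripted. A minimal macro sketch:

print("Before: " + IJ.freeMemory());  // the same usage string the status bar shows
call("java.lang.System.gc");          // request a garbage collection
wait(100);                            // give the collector a moment to run
print("After:  " + IJ.freeMemory());

If the reported usage does not drop even after several requests, the memory is still referenced somewhere and cannot be reclaimed.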

> When I open a large file and then close it, the memory used for that large
> file is sometimes not cleared. When I then try to open another large file,
> I get a warning of insufficient memory.

There are several reasons why this can happen. Sometimes the garbage
collector can't keep up with new memory allocation. But often the issue is a
bug (specifically, a memory leak) in the program.

> My problem usually occurs when opening large files using the LOCI tools.

We would like to fix any problems in the Bio-Formats code, but would need
instructions on how to duplicate the issue. How much memory are you
allocating to ImageJ? In this case, it might be easier to replicate the
issue by reducing ImageJ's available maximum memory. Which OS are you
running? Are you using the "Virtual stack" option? Do you have a sample
dataset you could send that exhibits the problem?
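
For what it's worth, the "Use virtual stack" option can also be enabled from a macro. A recorded-style sketch, with a placeholder path:

run("Bio-Formats Importer", "open=[/path/to/data.lif] use_virtual_stack");

A virtual stack reads planes from disk on demand rather than holding the whole dataset in memory, which usually reduces the memory footprint considerably.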

Regards,
Curtis



Re: Flush/clear memory in ImageJ/Fiji

Unruh, Jay-2
In reply to this post by Albert Cardona-2
So does the typical Fiji launcher disable the Java garbage collector?  In regular ImageJ I am typically successful in clearing out memory with the garbage collector.  Buying more RAM doesn't improve things if you are running 32-bit Windows.  Of course, perhaps I should buy a 64-bit machine, but I find it hard to require all of my users to buy such machines or to come to a central facility to process their images (in that case, many of them may opt out of ImageJ anyway).

On a side note, I have experienced memory leaks/inefficient memory freeing with a recent version of LOCI, even when using the garbage collector.  These are especially problematic when opening Leica files, which require large amounts of RAM to browse, but not necessarily to load.  One of my users could only open a few images before ImageJ crashed and had to be restarted.  I rolled back to the second most recent version and that seemed to fix things.

Jay


Re: Flush/clear memory in ImageJ/Fiji

ctrueden
Hi Jay,

> So does the typical Fiji launcher disable the Java garbage collector?

No, there isn't really an alternative for managing memory inside the JVM. But as Albert indicated, there are lots of ways to tune it:
http://www.oracle.com/technetwork/java/gc-tuning-5-138395.html

> Buying more RAM doesn't improve things if you are running 32-bit Windows.
> Of course, perhaps I should buy a 64-bit machine, but I find it hard to
> require all of my users to buy such machines

Now that 64-bit is becoming ubiquitous and 8GB+ of RAM is commonplace,
you'll have a harder and harder time supporting older 32-bit OSes. Rule #6
of software programming: "Software expands to consume all available
resources." :-)

> On a side note, I have experienced memory leaks/inefficient memory freeing
> with a recent version of LOCI, even when using the garbage collector. These
> are especially problematic when opening Leica files, which require large
> amounts of RAM to browse, but not necessarily to load. One of my users
> could only open a few images before ImageJ crashed and had to be restarted.
> I rolled back to the second most recent version and that seemed to fix things.

We would very much like to fix any memory leaks in Bio-Formats. Would you be
willing to send more specific details? Which kind of Leica files? How much
RAM did you have allocated? Which OS? Is it possible to reliably reproduce
with a macro or script on your systems? Which version do you mean by "second
most recent"?

Thanks,
Curtis



Re: Flush/clear memory in ImageJ/Fiji

Bahram
In reply to this post by ctrueden
Hi Curtis
I have the exact same problem with Fiji and .lif files:

My system:
Win 7 Ultimate, 64-bit, 24GB RAM, Intel Core i7 X980 3.33GHz
LOCI revision 281b1fa, 16.2.2011, release 4.3-DEV
ImageJA 1.45b, 18405 MB of RAM allocated to ImageJ

This happens every single time I open a .lif file (typically very large time-lapse sequences), so now I just restart Fiji after each movie.

cheers
Bahram

Re: Flush/clear memory in ImageJ/Fiji

ctrueden
Hi Bahram,

> I have the exact same problem with Fiji and .lif files:

Unfortunately, I am unable to replicate the problem with today's (fully
updated) version of Fiji on a Windows 7 64-bit system with 2GB of RAM,
1.5GB of which was allocated to Fiji.

I tried repeatedly opening a ~1.1GB LIF file, both with and without the
"Use virtual stack" option checked. After the image windows opened, I
repeatedly browsed back and forth through time, then closed all
windows. Each time, the memory use returned to the baseline level.

Perhaps the problem occurs only with certain LIF files? Or only with LIF
files beyond a certain size? Is it possible to reliably reproduce with a
macro or script on your systems? If so, would you be willing to send such a
macro or script and/or sample data that illustrates the issue?
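
As a starting point, a minimal stress-test macro might look like the following sketch (the path and iteration count are placeholders; add use_virtual_stack to the options string to test virtual mode):

for (i = 0; i < 5; i++) {
    run("Bio-Formats Importer", "open=[/path/to/sample.lif]");
    run("Close All");                // close everything the importer opened
    call("java.lang.System.gc");     // request a garbage collection
    print("Iteration " + i + ": " + IJ.freeMemory());
}

If the reported usage climbs steadily across iterations, that would point to a leak rather than ordinary collector lag.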

Thanks,
Curtis



Re: Flush/clear memory in ImageJ/Fiji

Unruh, Jay-2
Curtis,

Sorry about the delay, but I finally got around to testing this.  I tried out two .lif files: one with a total size of 1.8 GB and another with a total size of 6.8 GB.

The 1.8 GB file worked well under all operating conditions, even on 32-bit Java (1024 MB allocated) when loading an individual image as big as 900 MB.

The 6.8 GB file would not load under 32-bit Java even with a virtual stack.  The individual image I was loading was only 4 MB.  I ran under 64-bit Java with 4 GB of RAM allocated and the file loaded fine, but an examination of the memory usage revealed that it spiked to 1500 MB during loading and stayed at 650 MB after the file loaded, even when I explicitly ran the garbage collector, either by clicking the status bar or by running my own GC plugin.  The memory usage returned to almost zero after I closed the image.  This suggests that it isn't a strict "memory leak" but perhaps a static object that isn't cleared after loading.  Note that I am using the last stable build from the website.

When I load the loci_tools.jar file generated on June 10, 2011 (this is version 4.2.2), the memory only goes as high as 150 MB during loading and settles to 57 MB after loading.  This is still far higher than the size of the file, but not unreasonable given the typical load of the virtual machine.

I hope this helps.  If you really need a large file to test, we will have to arrange some way to transfer a 7 GB file over the network.  I can't share that exact file as it belongs to a user, but I could generate a file of similar size for testing if you need it.

Below is the error message that was generated when loading the 6.8 GB file with 32-bit Java.

Jay

java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Unknown Source)
        at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
        at java.lang.AbstractStringBuilder.append(Unknown Source)
        at java.lang.StringBuffer.append(Unknown Source)
        at java.io.StringWriter.write(Unknown Source)
        at com.sun.org.apache.xml.internal.serializer.ToStream.characters(Unknown Source)
        at com.sun.org.apache.xml.internal.serializer.ToUnknownStream.characters(Unknown Source)
        at com.sun.org.apache.xml.internal.serializer.ToUnknownStream.characters(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transformIdentity(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(Unknown Source)
        at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(Unknown Source)
        at loci.common.xml.XMLTools.getXML(XMLTools.java:161)
        at loci.formats.services.OMEXMLServiceImpl.getOMEXML(OMEXMLServiceImpl.java:337)
        at loci.plugins.in.ImagePlusReader.createFileInfo(ImagePlusReader.java:516)
        at loci.plugins.in.ImagePlusReader.readImage(ImagePlusReader.java:304)
        at loci.plugins.in.ImagePlusReader.readImages(ImagePlusReader.java:236)
        at loci.plugins.in.ImagePlusReader.readImages(ImagePlusReader.java:214)
        at loci.plugins.in.ImagePlusReader.openImagePlus(ImagePlusReader.java:112)
        at loci.plugins.in.Importer.readPixels(Importer.java:134)
        at loci.plugins.in.Importer.run(Importer.java:87)
        at loci.plugins.LociImporter.run(LociImporter.java:79)
        at ij.IJ.runUserPlugIn(IJ.java:183)
        at ij.IJ.runPlugIn(IJ.java:150)


Re: Flush/clear memory in ImageJ/Fiji

ctrueden
Hi Jay,

> The 6.8 GB file would not load under 32-bit Java even with a
> virtual stack.

Thanks for the additional information. My guess is that the Bio-Formats
file reader is parsing a large amount of metadata into OME-XML (a fairly
inefficient structure in some ways). Even with virtual stacks on, we still
cache the metadata for the entire dataset. This can become expensive when
the dataset has many planes with per-plane timestamps or stage positions.

We are working on a refactoring of Bio-Formats that will greatly reduce
this problem, but in the meantime, I am not sure if there is anything you
can do about it. I am CCing the rest of the Bio-Formats team in case they
have any additional insight.
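
One way to test whether metadata parsing alone accounts for the spike is to initialize a reader without reading any pixels, using the Bio-Formats macro extensions. A sketch, with a placeholder path:

run("Bio-Formats Macro Extensions");
print("Before: " + IJ.freeMemory());
Ext.setId("/path/to/data.lif");   // parses metadata only; no pixel data is read
print("After:  " + IJ.freeMemory());
Ext.close();

If memory jumps on setId alone, the OME-XML metadata is the likely culprit.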

> When I load the loci_tools.jar file generated on June 10, 2011 (this is
> version 4.2.2), the memory only goes as high as 150 MB during loading and
> settles to 57 MB after loading. This is still far higher than the size of
> the file, but not unreasonable given the typical load of the virtual
> machine.

Given that memory usage used to be so much lower, there may be a bona fide
bug in the new version. We will investigate and keep you posted.

> If you really need a large file to test, we will have to arrange some
> way to transfer a 7 GB file over the network.

Thanks, we do have some large sample LIF files already (one ~7GB, and one
~6GB) so we can start with those. We will let you know if we need
additional samples.

Regards,
Curtis



Re: Flush/clear memory in ImageJ/Fiji

Melissa Linkert-2
Hi Jay,


I was not able to entirely duplicate this problem using the files that
Curtis mentioned, but I was able to substantially reduce the amount of
memory used in any case.  If you update to the very latest trunk build of
Bio-Formats, you will hopefully see an improvement.

If you find that the amount of memory used is still prohibitively large,
please let us know and we will provide information on how to send a file
privately.

Regards,
-Melissa


Re: Flush/clear memory in ImageJ/Fiji

Yael Grossman
In reply to this post by ctrueden
I have been having a similar problem using the Foci 3D counter in ImageJ.  I try to run it on a TIFF virtual stack of images, but after about the fourth image, I start getting an error message about Java heap space.  Then I have to close Java and wait about 30-45 minutes before the heap space resets.  I am closing each image after I finish with the foci counter.  Since I have to run about two hundred of these images per set, ideally I would like a macro command that I can add to the batch macro I already wrote, to clean out the heap after every image.  Does anything like this exist as a command that can be added to a macro?

I am working on a Mac with 64-bit Java and 4GB of RAM (Mac OS X 10.8).  I just got 16GB of RAM to upgrade, but I don't think that will completely fix the problem, just delay the onset.

Thank you,
Yael

Re: Flush/clear memory in ImageJ/Fiji

dscho
Hi Yael,

You can force a garbage collection with this macro call:

        call("java.lang.System.gc");

However, my hunch is that the Foci 3D counter does not release memory
correctly. That would need a change in the code and is not as easy as
calling a macro.
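
For the batch part of your question, the usual pattern is to close each image
and request a collection inside the loop. A minimal sketch (the directory
handling and the processing step are placeholders for your own batch code):

        dir = getDirectory("Choose an image directory");
        list = getFileList(dir);
        for (i = 0; i < list.length; i++) {
                open(dir + list[i]);
                // ... run the foci counting here ...
                close();                      // close the image when done
                call("java.lang.System.gc");  // then request a garbage collection
        }

If memory still climbs from image to image, the leak is in the plugin
itself, as described above.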

Ciao,
Johannes


Re: Flush/clear memory in ImageJ/Fiji

kazuya
Hi,

I use the two commands below to reduce memory usage in ImageJ macros.
Sometimes they are effective, but I don't know why.

run("Reset...", "reset=[Undo Buffer]");
run("Reset...", "reset=[Locked Image]");