http://imagej.273.s1.nabble.com/Flush-clear-memory-in-ImageJ-Fiji-tp3682491p3682498.html
Thanks for the additional information. My guess is that the Bio-Formats
metadata caching is responsible (the metadata model has a rather
inefficient structure in some ways). Even with virtual stacks on, we still
cache the metadata for the entire dataset. This can become expensive when
the dataset is very large, and I am not sure how much we can do about it.
I am CCing the rest of the Bio-Formats team in case they have further ideas.

> settles to 57 MB after loading. This is still far higher than the size of
> the file, but not unreasonable given the typical load of the virtual
> machine.

The difference from version 4.2.2 does suggest a bug in the new version.
We will investigate and keep you posted.

We have some large LIF datasets of our own (up to ~6GB) so we can start
with those. We will let you know if we need a sample file from you after all.
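
For illustration, here is a minimal sketch of reading pixel planes while
skipping the rich metadata store, assuming the Bio-Formats 4.x Java API of
that era (class and method names are worth double-checking against the
version in use):

    // Minimal sketch (assumes Bio-Formats 4.x): read pixel planes while
    // installing a dummy metadata store, so rich metadata for the whole
    // dataset is not cached in memory.
    import loci.formats.ImageReader;
    import loci.formats.meta.DummyMetadata;

    public class MinimalMetadataDemo {
      public static void main(String[] args) throws Exception {
        ImageReader reader = new ImageReader();
        reader.setMetadataStore(new DummyMetadata()); // discard rich metadata
        reader.setId(args[0]);                        // e.g. a .lif file
        byte[] plane = reader.openBytes(0);           // read the first plane
        System.out.println("Read " + plane.length + " bytes");
        reader.close();
      }
    }
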
> Curtis,
>
> Sorry about the delay, but I finally got around to testing this. I tried
> out two LIF files: one with a total size of 1.8 GB and another with a
> total size of 6.8 GB.
>
> The 1.8 GB file worked well under all operating conditions, even on 32-bit
> Java (1024 MB allocated) when loading an individual image as big as 900 MB.
>
> The second 6.8 GB file would not load under 32-bit Java even with a
> virtual stack. The individual image I was loading was only 4 MB. I ran
> under 64-bit Java with 4 GB of RAM allocated and the file loaded fine,
> but an examination of the memory usage revealed that it spiked to 1500 MB
> during loading and stayed at 650 MB after the file loaded, even when I
> explicitly ran the garbage collector, either by clicking the status bar or
> by running my own GC plugin. The memory usage returned to almost zero after
> I closed the image. This suggests that it isn't a strict "memory leak" but
> perhaps a static object that isn't cleared after loading. Note that I am
> using the last stable build from the website.
>
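A "GC plugin" of the sort described above can be as small as the following
sketch (a hypothetical reconstruction, not necessarily the plugin used here;
note that System.gc() is only a request that the JVM is free to ignore):

    // Hypothetical minimal "force GC and report memory" ImageJ plugin.
    import ij.IJ;
    import ij.plugin.PlugIn;

    public class Force_GC implements PlugIn {
      public void run(String arg) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();
        System.gc(); // only a request; the JVM may ignore it
        long after = rt.totalMemory() - rt.freeMemory();
        IJ.log("Used memory: " + (before >> 20) + " MB -> " + (after >> 20) + " MB");
      }
    }
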
> When I load the loci_tools.jar file generated on June 10, 2011 (this is
> version 4.2.2), the memory only goes as high as 150 MB during loading and
> settles to 57 MB after loading. This is still far higher than the size of
> the file, but not unreasonable given the typical load of the virtual
> machine.
>
> I hope this helps. If you really need a large file to test, we will have
> to arrange some way to transfer a 7 GB file over the network. I can't
> share that exact file as it belongs to a user, but I could generate a file
> of similar size for testing if you need it.
>
> Below is the error message that was generated loading the 6.8 GB file with
> 32-bit java.
>
> Jay
>
> java.lang.OutOfMemoryError: Java heap space
>     at java.util.Arrays.copyOf(Unknown Source)
>     at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
>     at java.lang.AbstractStringBuilder.append(Unknown Source)
>     at java.lang.StringBuffer.append(Unknown Source)
>     at java.io.StringWriter.write(Unknown Source)
>     at com.sun.org.apache.xml.internal.serializer.ToStream.characters(Unknown Source)
>     at com.sun.org.apache.xml.internal.serializer.ToUnknownStream.characters(Unknown Source)
>     at com.sun.org.apache.xml.internal.serializer.ToUnknownStream.characters(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.DOM2TO.parse(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transformIdentity(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(Unknown Source)
>     at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(Unknown Source)
>     at loci.common.xml.XMLTools.getXML(XMLTools.java:161)
>     at loci.formats.services.OMEXMLServiceImpl.getOMEXML(OMEXMLServiceImpl.java:337)
>     at loci.plugins.in.ImagePlusReader.createFileInfo(ImagePlusReader.java:516)
>     at loci.plugins.in.ImagePlusReader.readImage(ImagePlusReader.java:304)
>     at loci.plugins.in.ImagePlusReader.readImages(ImagePlusReader.java:236)
>     at loci.plugins.in.ImagePlusReader.readImages(ImagePlusReader.java:214)
>     at loci.plugins.in.ImagePlusReader.openImagePlus(ImagePlusReader.java:112)
>     at loci.plugins.in.Importer.readPixels(Importer.java:134)
>     at loci.plugins.in.Importer.run(Importer.java:87)
>     at loci.plugins.LociImporter.run(LociImporter.java:79)
>     at ij.IJ.runUserPlugIn(IJ.java:183)
>     at ij.IJ.runPlugIn(IJ.java:150)
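
The trace shows where the allocation fails: the entire OME-XML document is
being serialized into a single in-memory string (note the StringWriter and
StringBuffer frames feeding loci.common.xml.XMLTools.getXML), which can
easily exhaust a 32-bit heap, typically capped well below 2 GB. A quick,
illustrative way to check how much heap the JVM actually has available:

    // Illustrative check of the JVM's maximum heap, useful when diagnosing
    // OutOfMemoryError under 32-bit Java.
    public class HeapInfo {
      public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() >> 20;
        System.out.println("Max heap: " + maxMb + " MB");
      }
    }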
>
> -----Original Message-----
> From: ImageJ Interest Group [mailto:[hidden email]] On Behalf Of
> Curtis Rueden
> Sent: Friday, November 04, 2011 11:02 AM
> To: [hidden email]
> Subject: Re: Flush/clear memory in ImageJ/Fiji
>
> Hi Bahram,
>
> > I have the exact same problem with Fiji and .lif files:
>
> Unfortunately, I am unable to replicate the problem with today's (fully
> updated) version of Fiji on a Windows 7 64-bit system with 2GB of RAM
> and 1.5GB allocated to Fiji.
>
> I tried repeatedly opening a ~1.1GB LIF file, both with and without the
> "Use virtual stack" option checked. After the image windows opened, I
> repeatedly browsed back and forth through time, then closed all
> windows. Each time, the memory use returned to the baseline level.
>
> Perhaps the problem occurs only with certain LIF files? Or only with LIF
> files beyond a certain size? Is it possible to reliably reproduce with a
> macro or script on your systems? If so, would you be willing to send such a
> macro or script and/or sample data that illustrates the issue?
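
A reproduction script of the requested kind might look like this sketch
(hypothetical: the path is a placeholder, "Close All" assumes a recent
ImageJ, and the importer options shown are standard Bio-Formats macro
options):

    // Hypothetical reproduction sketch: repeatedly import and close a LIF
    // file while logging used memory, to see whether usage returns to the
    // baseline after each cycle.
    import ij.IJ;

    public class LeakCheck {
      public static void main(String[] args) {
        String path = "/path/to/sample.lif"; // placeholder path
        for (int i = 0; i < 5; i++) {
          IJ.run("Bio-Formats Importer", "open=[" + path + "] " +
            "color_mode=Default view=Hyperstack stack_order=XYCZT");
          IJ.run("Close All"); // close every image window
          System.gc();
          Runtime rt = Runtime.getRuntime();
          IJ.log("Cycle " + i + ": " +
            ((rt.totalMemory() - rt.freeMemory()) >> 20) + " MB in use");
        }
      }
    }
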
>
> Thanks,
> Curtis
>
>
> On Fri, Nov 4, 2011 at 7:48 AM, Bahram <[hidden email]> wrote:
>
> > Hi Curtis
> > I have the exact same problem with Fiji and .lif files:
> >
> > My system:
> > Win 7 Ultimate, 64-bit, 24GB RAM, Intel Core i7 X980 3.33GHz
> > LOCI revision 281b1fa, 16.2.2011, release 4.3-DEV
> > ImageJA 1.45b, 18405 MB of RAM allocated to ImageJ
> >
> > This happens every single time I open a LIF file (typically a very large
> > time-lapse sequence), so now I just reopen Fiji after each movie.
> >
> > cheers
> > Bahram
> >
> > --
> > View this message in context:
> > http://imagej.588099.n2.nabble.com/Flush-clear-memory-in-ImageJ-Fiji-tp6907896p6962634.html
> > Sent from the ImageJ mailing list archive at Nabble.com.
> >
>