hi,
I want to restart the often-discussed "memory limit" discussion in the light of 64-bit processors.

We have tested ImageJ on a 64-bit Intel P4 running Windows XP 64-bit (Windows 2003 Server), and after an *extremely* painful search we found a 64-bit Java for Windows (search the web for 'jre-1_5_0_11-windows-amd64.exe'), which works perfectly on Intel's 64-bit P4 and should also work on the Core (2) Duo. We used the latest ImageJ 1.38n and modified ImageJ.cfg to run under the new 64-bit JVM.

So far we could allocate 3GB (-Xmx3000m) on a machine with 2GB of physical memory (which will be upgraded soon), but ImageJ file import crashed with an "out of memory" exception after 1.85GB (instead of 1.7GB), even though both 'About ImageJ' and the Task Manager showed an allocation of about 3GB. We will test further and report.

Does anybody know why ImageJ has a general 4GB limit? My guess is that memory offsets in ImageJ are good old 32-bit unsigned integers. If so, are there plans to upgrade ImageJ for the 64-bit era? (Browsing *large* datasets acquired by 5D microscopy is still one of the major limitations in the life sciences.)

Hope to start a discussion here...
cheers,
michael

--
Michael Held, Dipl.-Inf.
ETH Zurich
Institute of Biochemistry
HPM E17, Schafmattstrasse 18
8093 Zuerich, Switzerland
Phone: +41 44 632 3148, Fax: +41 44 632 1269
[hidden email]
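The ImageJ.cfg change described above amounts to pointing the Windows launcher at the 64-bit javaw.exe and raising -Xmx. A sketch of what the three-line file might look like (the JRE path is illustrative; adapt it to your install):

```
.
C:\Program Files\Java\jre1.5.0_11\bin\javaw.exe
-Xmx3000m -cp ij.jar ij.ImageJ
```

The first line is the working directory, the second the full path to the JVM, the third the options and main class.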
Hello,
I wasn't aware of this. I just tried it and you are right. I run ImageJ on 64-bit Windows XP with 8GB of memory and allocated 6GB for ImageJ. I get the out-of-memory error when 4261MB are used. I thought only the JVM would be concerned with memory access.
Volker
When I use the VM option -XX:+AggressiveHeap I get the out-of-memory error at 5718MB.
Volker
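The heap ceiling the JVM actually granted can be cross-checked from inside the running process, which is essentially what ImageJ's 'About' dialog does. A minimal sketch (the class and method names are hypothetical, not ImageJ code):

```java
// Query the heap the JVM actually granted -- useful to verify that
// -Xmx or -XX:+AggressiveHeap settings took effect.
public class HeapCheck {

    // Maximum heap in megabytes, roughly the figure 'About ImageJ' reports.
    static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024L * 1024L);
    }

    public static void main(String[] args) {
        System.out.println("Max heap:  " + maxHeapMb() + " MB");
        System.out.println("Total now: "
            + Runtime.getRuntime().totalMemory() / (1024L * 1024L) + " MB");
    }
}
```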
Michael,
> (browsing of *large* datasets acquired by 5D microscopy is still one of
> the major limitations in the life-sciences)

In my experience, having all images open at once doesn't help much. What does help is virtualization: either purely virtual stacks like those offered by the Virtual Stack Opener plugin, or a virtualization layer with a FIFO on-demand reloading cache like TrakEM2.

I don't have experience with 5D, but for 4D we regularly open 15,000 16-bit images (from a SPIM microscope) with the Virtual Stack Opener and then apply the 4D Hypervolume Browser to them without problems. Scrolling through the stack is fast, almost unnoticeable (even when the files physically live on a gigabit-networked remote file server). ImageJ doesn't need more than a few hundred megabytes to handle the above seamlessly.

As for internal ImageJ-related 64-bit limitations: I am not aware of any. Only if single images are bigger than the 32-bit limit, or more images than fit in a 32-bit index are open, might you see strange behavior.

The JVM crashing after being allocated -Xmx3000m and then just opening 'About ImageJ' sounds to me like a JVM problem, not an ImageJ problem.

Albert

--------------------------------------------------------------------
This message was sent using Webmail@INI: https://webmail.ini.ethz.ch
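The 32-bit limit Albert mentions follows from the language itself: Java arrays are indexed by int, so a single pixel array tops out at 2^31-1 elements regardless of heap size; a stack sidesteps this by keeping one array per slice. A small sketch of the arithmetic (class name hypothetical):

```java
// Java arrays are indexed by int, so one array holds at most
// Integer.MAX_VALUE (2^31 - 1) elements. ImageJ stores each stack slice
// in its own pixel array, so the cap applies per slice, not to the heap.
public class ArrayLimit {

    // Largest square image whose pixels fit in a single array
    // (e.g. a short[] for one 16-bit slice).
    static int maxSquareSide() {
        return (int) Math.sqrt((double) Integer.MAX_VALUE);
    }

    public static void main(String[] args) {
        System.out.println("Max elements per array: " + Integer.MAX_VALUE);
        System.out.println("Max square side: " + maxSquareSide());
    }
}
```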
hi albert,
thanks for your reply from the ETH :-)

Albert Cardona wrote:
> In my experience having all images opened at once doesn't help much. What does
> help is virtualization: either purely virtual stacks like those offered by the
> virtual stack opener plugin, or a virtualization layer with a fifo on-demand
> reloading cache like TrakEM2.

I will have a closer look at TrakEM2!

> I don't have experience with 5D, but for 4D we regularly open 15.000 16-bit
> images (from SPIM microscope) with the Virtual Stack Opener [...]

Manual browsing and annotation is the major reason why we have to import the *entire* stack. Because of the volume, most of our data is stored on a NAS, which makes image browsing slow.

> As for internal ImageJ-related 64-bit limitations: I am not aware of any.

That's very good to know! I was just concerned because the ImageJ documentation mentions a 4GB limit.

> The JVM crashing after allocating -Xmx3000 for it and then just opening the
> 'About ImageJ' sounds to me like a JVM problem, not an ImageJ problem.

I will test JVM 6.0 for this. thanks!
happy easter
michael
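The FIFO on-demand reloading cache discussed in this thread can be sketched in a few lines with a LinkedHashMap kept in insertion order: slices are read from disk only when viewed, and the oldest cached slice is evicted once the cache is full. Everything here (class name, loader, capacity) is a hypothetical illustration, not TrakEM2's actual code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a FIFO on-demand slice cache for browsing huge stacks:
// only `capacity` slices live in memory at once.
public class SliceCache {
    private final int capacity;
    private final Map<Integer, short[]> cache;

    SliceCache(int capacity) {
        this.capacity = capacity;
        // accessOrder=false keeps insertion order, i.e. FIFO eviction.
        this.cache = new LinkedHashMap<Integer, short[]>(16, 0.75f, false) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, short[]> eldest) {
                return size() > SliceCache.this.capacity;
            }
        };
    }

    // Return the pixels of one slice, loading from disk on a cache miss.
    short[] getSlice(int index) {
        return cache.computeIfAbsent(index, i -> loadFromDisk(i));
    }

    // Stand-in for reading one image plane; real code would use
    // ImageJ's file openers or an ImageReader.
    private short[] loadFromDisk(int index) {
        return new short[4]; // placeholder pixels
    }

    int cachedSlices() {
        return cache.size();
    }
}
```

With a capacity of a few hundred slices, scrolling stays fast while memory use remains bounded, which matches the "few hundred megabytes" figure mentioned above.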