
Re: Processing huge files

Posted by ctrueden on Mar 19, 2015; 7:19pm
URL: http://imagej.273.s1.nabble.com/Processing-huge-files-tp5012001p5012070.html

Hi Greg,

> what I want is to write a stack out slice by slice to one big file.

A couple of suggestions to try.

Firstly, have you tried the "bfconvert" command line tool [1]?

It would allow you to convert your ND2 files to (Big-)TIFF without needing
ImageJ at all. It supports BigTIFF via the "-bigtiff" flag.
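
For example, something along these lines should do the trick (the file names
here are just placeholders):

  bfconvert -bigtiff input.nd2 output.ome.tif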

Secondly, you can use the Bio-Formats API to write out the data plane by
plane [2] to a single large file (TIFF or otherwise). You would need to
write Java code, or use a non-macro scripting language that can access the
Java API (e.g., Jython or Groovy) [3].
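
If you go the Java route, the skeleton (adapted from the export example at [2])
looks roughly like this. It is an untested sketch that assumes a single series,
and the file names are just placeholders:

  import loci.formats.FormatException;
  import loci.formats.ImageReader;
  import loci.formats.ImageWriter;
  import loci.formats.MetadataTools;
  import loci.formats.meta.IMetadata;

  public class ConvertPlaneByPlane {
    public static void main(String[] args)
      throws FormatException, java.io.IOException
    {
      String inputFile = "input.nd2";       // placeholder
      String outputFile = "output.ome.tif"; // placeholder

      // Set up the reader and harvest the metadata.
      ImageReader reader = new ImageReader();
      IMetadata meta = MetadataTools.createOMEXMLMetadata();
      reader.setMetadataStore(meta);
      reader.setId(inputFile);

      // Set up the writer, feeding it the same metadata.
      ImageWriter writer = new ImageWriter();
      writer.setMetadataRetrieve(meta);
      writer.setId(outputFile);

      // Copy (and optionally process) one plane at a time, so only a
      // single plane is ever held in memory.
      for (int p = 0; p < reader.getImageCount(); p++) {
        byte[] plane = reader.openBytes(p);
        // ... process the plane here if desired ...
        writer.saveBytes(p, plane);
      }

      writer.close();
      reader.close();
    }
  }

The same logic translates directly to Jython or Groovy if you prefer scripting
over compiling a Java class.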

Regards,
Curtis

P.S. Some day ImageJ2 will be far enough along that the paradigm will
shift: "opening" the data will always be virtual, with ImageJ not needing
to read all planes in advance; processing the data will page planes in and
out of RAM as needed transparently; and saving/export will work as usual,
with very little RAM overhead since it is also done fully plane-by-plane.
We are getting close, but not quite there yet...

[1]
http://openmicroscopy.org/site/support/bio-formats/users/comlinetools/conversion.html
[2]
http://openmicroscopy.org/site/support/bio-formats/developers/export.html
[3] http://imagej.net/Scripting


On Wed, Mar 18, 2015 at 5:25 AM, Greg <[hidden email]> wrote:

> Hi Curtis,
>
> thanks for your answer, and sorry for the delay. Of course, writing out slice
> by slice works if I write every slice as a new file. But what I want is to
> write a stack out slice by slice to one big file. So, in principle, to save
> RAM I want to do the following:
>
> open one slice from the Virtual Stack
>
> do some processing
>
> append that slice to a disk-resident stack
>
> That way only one slice has to be in memory at a time, but in the end I have
> a processed, disk-resident copy of the huge input Virtual Stack.
>
> Best,
> Greg

--
ImageJ mailing list: http://imagej.nih.gov/ij/list.html