
Processing huge files

Posted by Greg on Mar 16, 2015; 7:30pm
URL: http://imagej.273.s1.nabble.com/Processing-huge-files-tp5012001.html

Hi,

I want to convert huge .nd2 files (around 60 GB each) to TIFF, since all my operations are faster on the TIFF format.
The conversion itself is quite painful: as described in this thread,

http://imagej.1557.x6.nabble.com/Virtual-Memory-in-Headless-Mode-td5011730.html#a5011733

that bug still does not seem to be fixed, as I get a Java heap space error whenever I try to do the conversion headless. So the conversion has to be done with the Fiji GUI open.

But even once headless mode hopefully works, the underlying problem stays the same: allocating something like 40 GB of virtual memory slows my whole PC down considerably. Right now I can open the huge file slice by slice via a VirtualStack, but I cannot write it out slice by slice. That means I first have to load all slices into memory (i.e. a regular Stack) before writing them to disk as a multi-page TIFF.
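
For reference, reading one plane at a time is not the problem. If I drop down from the GUI VirtualStack to Bio-Formats' ImageReader, it looks roughly like this (an untested sketch; the path is a placeholder):

import loci.formats.ImageReader;

public class ReadPlanes {
  public static void main(String[] args) throws Exception {
    ImageReader reader = new ImageReader();
    reader.setId("/data/huge.nd2");        // placeholder input path
    int planes = reader.getImageCount();   // total number of planes (Z*C*T)
    for (int i = 0; i < planes; i++) {
      byte[] plane = reader.openBytes(i);  // only this one plane is held in memory
      // ... process the plane, then let it be garbage collected ...
    }
    reader.close();
  }
}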

So my question is: is it somehow possible to append a single slice to a disk-resident file (a virtual stack on disk)?
A workaround might be to write out a single TIFF for every slice and then concatenate the TIFFs somehow, but I have no idea how exactly that could be done.
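
To make clearer what I am hoping for: Bio-Formats' TiffWriter seems to be able to write planes one at a time, which would avoid both the full in-memory Stack and the one-TIFF-per-slice workaround. Again only an untested sketch; the paths and the single-sample-per-pixel assumption are placeholders, and BigTIFF is presumably required since the output is well over the classic 4 GB TIFF limit:

import loci.common.services.ServiceFactory;
import loci.formats.FormatTools;
import loci.formats.ImageReader;
import loci.formats.MetadataTools;
import loci.formats.meta.IMetadata;
import loci.formats.out.TiffWriter;
import loci.formats.services.OMEXMLService;

public class ConvertPlaneByPlane {
  public static void main(String[] args) throws Exception {
    ImageReader reader = new ImageReader();
    reader.setId("/data/huge.nd2");                      // placeholder input path

    // Minimal OME metadata for the writer, copied from the reader;
    // the final "1" assumes non-RGB data (one sample per pixel).
    IMetadata meta = new ServiceFactory()
        .getInstance(OMEXMLService.class).createOMEXMLMetadata();
    MetadataTools.populateMetadata(meta, 0, "stack", reader.isLittleEndian(),
        reader.getDimensionOrder(),
        FormatTools.getPixelTypeString(reader.getPixelType()),
        reader.getSizeX(), reader.getSizeY(), reader.getSizeZ(),
        reader.getSizeC(), reader.getSizeT(), 1);

    TiffWriter writer = new TiffWriter();
    writer.setBigTiff(true);                 // plain TIFF is limited to 4 GB
    writer.setMetadataRetrieve(meta);
    writer.setId("/data/huge.tif");          // placeholder output path

    // Read one plane, write it to disk, discard it -- never the whole stack.
    for (int i = 0; i < reader.getImageCount(); i++) {
      writer.saveBytes(i, reader.openBytes(i));
    }
    writer.close();
    reader.close();
  }
}

This way the memory footprint should stay at a single slice the whole time, which is exactly what I need.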

Greets,
Greg