http://imagej.273.s1.nabble.com/WEKA-trainable-segmentation-tp5012651p5012653.html
That's very odd. Which classifier are you using? I've just made a small …
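
Just to narrow things down, something like the sketch below might help: it trains a deliberately small FastRandomForest from a script, reports the size of the saved .model file, and times applying it to a single slice. The paths, ROI rectangles and tree count are only placeholder values; WekaSegmentation and FastRandomForest are the plugin's published scripting classes, so the same thing can be pared down to a Fiji script if that is easier.

// Placeholder paths and ROIs -- swap in your own data and traces.
import ij.IJ;
import ij.ImagePlus;
import ij.gui.Roi;
import trainableSegmentation.WekaSegmentation;
import hr.irb.fastRandomForest.FastRandomForest;
import java.io.File;

public class ModelSizeCheck {
    public static void main(String[] args) {
        ImagePlus image = IJ.openImage("/path/to/thin_section.tif"); // placeholder path

        WekaSegmentation segmentator = new WekaSegmentation(image);

        // Configure the random forest explicitly; fewer trees generally means a
        // smaller serialized .model file (the GUI default uses more trees).
        FastRandomForest rf = new FastRandomForest();
        rf.setNumTrees(100);
        rf.setNumFeatures(2);
        segmentator.setClassifier(rf);

        // Add a few labelled traces: (class index, ROI, slice number).
        // These rectangles only stand in for real hand-drawn traces.
        segmentator.addExample(0, new Roi(10, 10, 50, 50), 1);
        segmentator.addExample(1, new Roi(200, 200, 50, 50), 1);

        if (segmentator.trainClassifier()) {
            String modelPath = "/path/to/classifier.model"; // placeholder path
            segmentator.saveClassifier(modelPath);
            File f = new File(modelPath);
            IJ.log("Model file size: " + (f.length() / (1024 * 1024)) + " MB");

            // Time the application to a single slice to see where the slowdown is.
            long start = System.currentTimeMillis();
            ImagePlus result = segmentator.applyClassifier(image);
            IJ.log("Segmentation took " + (System.currentTimeMillis() - start) + " ms");
            result.show();
        }
    }
}

If a model trained this way stays at a sensible size, comparing those settings with whatever the GUI is currently using should show where the extra gigabytes are coming from.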
> Dear All,
>
> Sorry to post here, but I was not sure how best to reach out to the correct
> community!
>
> I have used WEKA trainable segmentation a lot over the last few years,
> mainly for segmenting CT volumes of messy environmental materials, but
> recently, for single slice thin sections. It is an amazing piece of code,
> which has become central to my work.
>
> The problem has only emerged recently: the classifier model file has become
> huge, i.e. ~5 GB in size, even though the classifier is only fairly simple
> (just one or two filters and around 20 labelled traces in each of three or
> four classes). Previously, the model files were at most a few hundred MB,
> with far more complex mixtures of filters and much larger training sets
> (admittedly usually in only 2 or 3 classes).
>
> The effect has been to slow the segmentation process down considerably, even
> on single 8-bit TIFFs of only around 4 MB.
>
> Can anyone explain why the classifier model file sizes have become so
> enormous, and if there is anything I can do to improve things?
>
> Cheers,
>
> Simon
>
> Dr Simon Carr
> School of Geography
> Queen Mary University of London
> Mile End, London, E1 4NS
> United Kingdom
> --
> ImageJ mailing list: http://imagej.nih.gov/ij/list.html