[tex-live] TeX Live 2008: nasty pdf(la)tex bug on Solaris SPARC with ZFS

Jonathan Kew jonathan at jfkew.plus.com
Sat Sep 13 17:08:10 CEST 2008

On 13 Sep 2008, at 1:21 PM, Vladimir Volovich wrote:

> "KB" == Karl Berry writes:
> KB>         $ xetex -ini
> KB>         fatal: memory exhausted (xmalloc of 72367120 bytes).
> KB> Well, that at least should be debuggable, e.g., run it under the
> KB> debugger, trap the exit in xmalloc.c, and then run ps on the
> KB> process to see how much it's really using.
> i found the reason: i had this ulimit:
>   data seg size         (kbytes, -d) 131072
> when i set "ulimit -d unlimited", it was able to start.
> i.e. 128 MB is simply not enough even to start xetex -ini.
> KB> I don't think xetex is any more aggressive than anything else.
> KB> Everything allocates a ton of stuff these days.
> as i wrote in another email: just running "pdftex -ini \\relax" and
> looking at the process memory usage, i see it uses 6864 KB of resident
> memory (and 73 MB virtual) on my linux box.
> but for "xetex -ini \\relax", i see:
>   TL2008: 128 MB RES, 239 MB VIRT
>   TL2007:  14 MB RES,  95 MB VIRT
> why has xetex's memory usage increased so much compared to TL2007?
> (14 MB => 128 MB just for inixetex)

Mainly because of the much larger settings in texmf.cnf, I expect; things
like main_memory, font_mem_size, max_strings, trie_size, etc. have been
vastly increased. And remember that in many cases xetex uses larger
units (more bytes per array element), so these increases have a
proportionately larger effect. I haven't done the sums, but I suspect
that accounts for a lot of it.

I'd expect that most of the main_memory, etc., never actually gets
committed in normal use, but the operating system has to be able to
fulfill the allocation requests, at least in theory. If all of it really
were touched on a machine without enough real RAM, it could end up
paging badly and performance would be awful; in practice, though,
typical *TeX jobs use only a tiny fraction of these array sizes.
