will.adams at frycomm.com
Thu Mar 3 16:28:52 CET 2011
On Mar 3, 2011, at 10:13 AM, Peter Davis wrote:
> I'm testing some code I wrote to generate .tex files from XML files. These .tex files are then processed via XeLaTeX to produce PDF. The code seems to be working, in that I get a valid .tex file, which then compiles into a valid PDF.
> However, running one test of 34,500 pages took 10 hours(!) to compile with XeLaTeX. This was on a 3GHz/4GB Windows 7 Pro machine, using the XeTeX from MiKTeX 2.9.
Is the .tex file local to this machine?
> I expected this job to complete in a matter of minutes, but it basically took 100 times longer than I anticipated. The job uses perhaps a dozen PDFs via \includegraphics, but it uses the same ones over and over again. It also uses the textpos package to position blocks of text arbitrarily on the page.
What about the graphics (are they local to this machine)?
Have you tried processing it w/o including the graphics?
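One cheap way to test that, assuming you're loading graphics through the standard graphicx package: its draft option keeps the layout (boxes of the correct size with the filename printed inside) without actually reading or embedding the PDFs, so any speedup isolates the graphics as the culprit. Something like:

```latex
% Globally, in the preamble:
\usepackage[draft]{graphicx}

% ...or per image, leaving the rest of the document untouched
% ("logo.pdf" is just a placeholder name):
\includegraphics[draft]{logo.pdf}
```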
> Can anyone think of reasons why it might be so slow? The .tex file I have is 123MB, but I can create a smaller one, with just 2 pages, if anyone can help diagnose the performance issue.
How are you defining / calling the fonts? Are you defining instances and re-using them? Calling \fontspec takes a _lot_ of processing.
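For reference, the difference looks roughly like this (a sketch assuming the fontspec package; "Some Font" is a placeholder name):

```latex
% Slow: \fontspec re-selects the font at every call site
\newcommand{\callout}[1]{{\fontspec{Some Font}#1}}

% Faster: define a named instance once in the preamble and reuse it
\newfontfamily\calloutfont{Some Font}
\newcommand{\calloutfast}[1]{{\calloutfont #1}}
```

Over tens of thousands of pages, moving the font selection into a single \newfontfamily declaration can add up to a substantial saving.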
How long does your 2-page sample take to run? I suspect it has some extraneous code in it (~3,500 bytes per page seems like more than I'd expect).
How many fonts do you have installed on your machine? Have you tried halving the number?
I'd be glad to look at and time the 2-page sample on my machine.
senior graphic designer
Sphinx of black quartz, judge my vow.