[tex4ht] what is the fastest way to convert large document to HTML?

Nasser M. Abbasi nma at 12000.org
Sat Aug 18 00:57:43 CEST 2018


On 8/17/2018 5:26 PM, Karl Berry wrote:
> Martin: please see https://tug.org/pipermail/tex4ht/2018q3/001984.html
> and thread following about slow operation handling a huge document (not
> surprising in my view).
> 
> Is it perhaps possible to cache results so that dvisvgm does not have to
> recompute every math fragment every time? Any other possibilities come
> to mind?
> 
> Thanks,
> Karl
> 

I'll make a zip file of the example document I am currently working on,
so anyone can reproduce this.

I just need to remove all the \includegraphics commands from it first,
since they do not contribute to the issue, and removing them reduces
the size of the zip file.

I just need a few hours to do this and will then post a link to the zip file.

I bought another PC three days ago, dedicated to Linux and LaTeX use only,
hoping the compile would be much faster than on my Windows PC using
VBox/Linux, Cygwin, or the Windows 10 Linux subsystem.

(By the way, the Windows 10 Linux subsystem is about 20% faster than
VBox Linux on the same PC, so I have switched to it instead of VBox
for LaTeX compiles.)

So to compile the file, I copy it from the Windows PC to the Linux
PC on a USB stick and copy the result back to Windows.

Native Linux on that PC is faster than all of the above, but htlatex
is still far too slow compared to pdflatex or lualatex:

about 3 hours for htlatex vs. 2-3 minutes for lualatex on the same file.

About 90% of the htlatex compile time is spent in the dvisvgm phase,
processing the DVI file.
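Regarding Karl's caching question: one possible approach (a sketch only, not
part of tex4ht; the names CACHE_DIR and convert_cached are hypothetical)
would be to wrap the expensive per-page conversion in a content-hash cache,
so pages whose DVI bytes are unchanged between runs are not reconverted:

```shell
#!/bin/sh
# Sketch of content-hash caching for an expensive converter such as
# dvisvgm. Assumption: the input has been split into one file per
# fragment/page; hashing the bytes decides whether to reconvert.
CACHE_DIR="${CACHE_DIR:-.svg-cache}"
mkdir -p "$CACHE_DIR"

# convert_cached INPUT OUTPUT CMD...
# Run CMD INPUT OUTPUT only when INPUT's hash is new; otherwise
# reuse the cached OUTPUT from a previous run.
convert_cached() {
    in=$1; out=$2; shift 2
    hash=$(sha256sum "$in" | cut -d' ' -f1)
    cached="$CACHE_DIR/$hash"
    if [ -f "$cached" ]; then
        cp "$cached" "$out"     # cache hit: skip the converter entirely
    else
        "$@" "$in" "$out"       # cache miss: run the real converter
        cp "$out" "$cached"     # remember the result for next time
    fi
}
```

In a real workflow the converter argument would be a dvisvgm invocation
rather than a plain copy; the point is only that unchanged math fragments
would then cost one hash computation instead of a full reconversion.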

Thank you,
--Nasser
