Graphics with many points are always a challenge for TeX-based processors.
However, I am convinced that both the memory and the time limitations can be tackled to a "reasonable" degree (i.e. enough to reduce the pain).
There are two solutions, both of which should be considered:
1. increase TeX's memory (or circumvent the limitations of pdflatex);
2. reduce the number of times the graphic is processed by TeX (compile once, use often).
While the comments on your question already indicate some solutions for (2.), you may need more input for (1.). In fact, I believe that (1.) is the more pressing issue, and it cannot easily be solved by (2.).
Concerning (1.), I know one solution that works pretty well: increase the limits. The pgfplots manual contains detailed instructions for both Windows and Linux on how to enlarge the memory limits. I consider that a mandatory step for you - and invite you to follow the link above and read chapter "6 Memory and Speed Considerations" in the pgfplots manual. The chapter contains readily deployable configuration examples. Switching to lualatex instead of the conventional tools (pdflatex or latex/dvips) might also solve the memory problem (I do not know).
Concerning (2.), you can use the standalone package (this site contains a lot of examples). This should work with any of your packages. However, if you use matlab2tikz, I find the TikZ library external very useful here - it is tailored to convert each figure to a separate PDF without changing the original document. Note that matlab2tikz uses pgfplots, so the link mentioned above might be very useful (it also contains a brief description of this automatic image externalization).
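The externalization step can be sketched as follows; the figure content and the prefix directory are only illustrative, and the document must be compiled with shell escape enabled (e.g. pdflatex -shell-escape):

```latex
% Minimal externalization sketch: every tikzpicture is compiled to its own
% PDF on the first run and simply included on subsequent runs.
\documentclass{article}
\usepackage{pgfplots}
\usetikzlibrary{external}
\tikzexternalize[prefix=figures/] % the figures/ directory must already exist
\begin{document}
\begin{tikzpicture}
  \begin{axis}
    \addplot {x^2}; % placeholder plot
  \end{axis}
\end{tikzpicture}
\end{document}
```

After the first compilation, only pictures that actually changed are reprocessed - which is exactly the "compile once, use often" idea from (2.).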
I believe that the steps above should help.
But there are always cases where one might also want to know about alternatives.
Here are some of them. I did not post them right away because I have the impression that you may already have an existing workflow which they may not fit - but perhaps you are interested in my experiences anyway:
a) You could try to implement (selected) figures directly in TeX. I did so by means of pgfplots, which is quite powerful. I like the fact that I could define document-wide consistent styles and that the individual documents are, well, often easier to read than autogenerated code. In fact, once I started using pgfplots instead of MATLAB, I found it both simpler to maintain (.tex files instead of .m files) and prettier. Eventually I dropped all of my MATLAB scripts and used only pgfplots.
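As an illustration of that workflow, a small standalone pgfplots figure might look like this (the data file name and the style contents are assumptions, not part of the answer above):

```latex
% A self-contained pgfplots figure with a document-wide reusable style.
\documentclass{standalone}
\usepackage{pgfplots}
\pgfplotsset{compat=newest,
  my measurement/.style={grid=major, xlabel=$t$, ylabel=$u$}}
\begin{document}
\begin{tikzpicture}
  \begin{axis}[my measurement]
    % measurement.dat: whitespace-separated columns, first = x, second = y
    \addplot table {measurement.dat};
  \end{axis}
\end{tikzpicture}
\end{document}
```

Defining the style once in \pgfplotsset gives every figure in the document the same appearance.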
b) If your vector graphics are too large, you may want to consider using bitmap graphics and letting TeX overlay the axis descriptions on the bitmap. pgfplots comes with its \addplot graphics and \addplot3 graphics commands to streamline the process. You can also post feature requests to Nico Schloemer (the author of matlab2tikz) - perhaps he is willing to add automatic bitmap conversion with overlaid axes. Details of such an approach can be found in the aforementioned pgfplots manual (including application examples). Bitmap graphics have the advantage that they render much faster in all viewers - and for surface plots, the loss of vector quality hardly matters anyway.
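A sketch of the bitmap-with-overlaid-axes approach (the file name and the coordinate range are illustrative):

```latex
% Overlay real pgfplots axes on a pre-rendered bitmap. The xmin/xmax/
% ymin/ymax options tell pgfplots which data range the image covers.
\begin{tikzpicture}
  \begin{axis}[enlargelimits=false, axis on top, xlabel=$x$, ylabel=$y$]
    \addplot graphics [xmin=0, xmax=1, ymin=-1, ymax=1] {surface.png};
  \end{axis}
\end{tikzpicture}
```

The heavy point data then lives in the bitmap, while the axes, labels, and fonts stay consistent with the rest of the document.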
From texmf.cnf:
% Memory. Must be less than 8,000,000 total.
%
% main_memory is relevant only to initex, extra_mem_* only to non-ini.
% Thus, have to redump the .fmt file after changing main_memory; to add
% to existing fmt files, increase extra_mem_*. (To get an idea of how
% much, try \tracingstats=2 in your TeX source file;
% web2c/tests/memtest.tex might also be interesting.)
Thus you have to regenerate the format files (fmtutil) to see an effect of changing main_memory.
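As a sketch, the relevant texmf.cnf changes on TeX Live could look like this (the numbers are illustrative only, not recommendations):

```
% texmf.cnf - illustrative values:
extra_mem_top = 10000000  % extra words for large data structures; picked up
                          % by existing formats, no redump needed
extra_mem_bot = 10000000  % extra words for boxes, glue, and the like
% main_memory = 5000000   % changing this requires redumping the formats,
%                         % e.g. with "fmtutil-sys --all" on TeX Live
```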
Best Answer
Since I didn't have any luck with altering the TeX memory capacity, I had to find another way to solve my problem.
It was matlab2tikz that actually did the job. It has an option for downsampling the figures during the conversion to TikZ/pgf; it is called minimumPointsDistance. The final (downsampled) figures were absolutely indistinguishable from the originals and MUCH, MUCH lighter. Also note that you don't have to downsample the figures in MATLAB yourself; just set minimumPointsDistance, which only affects matlab2tikz's output code.

EDIT: This is an expansion of my answer. I apologise for stepping outside the limits of TeX proper, but this extension offers a better solution, so I believe it should be posted here. Now on to the problem...
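A sketch of the conversion call from within MATLAB (the file name and the threshold value are illustrative; the option name is as described above):

```matlab
% Convert the current MATLAB figure to TikZ/pgfplots code, dropping points
% that lie closer together than the given threshold.
plot(t, y);
matlab2tikz('myfigure.tex', 'minimumPointsDistance', 1e-4);
```

A larger threshold removes more points; since the right value depends on the data ranges, it usually has to be tuned per figure.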
matlab2tikz's point reduction algorithm, although fast and simple, has a main disadvantage: it can distort certain "stiff" areas of the curve when the actual number of points kept drops very low. This is illustrated in the following picture:
The blue curve (1184 points) is the original, while the red one (117 points) is the reduced version. The distortion is obvious.
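For intuition, the simple distance-based reduction can be sketched like this (a generic reimplementation, not the actual matlab2tikz code): a point is dropped whenever it lies closer than the threshold to the last point that was kept, which is exactly why densely sampled sharp bends lose their shape.

```matlab
% Generic sketch of naive distance-based point reduction.
function [xr, yr] = reducePoints(x, y, minDist)
    keep = true(size(x));      % start by keeping every point
    last = 1;                  % index of the last kept point
    for k = 2:numel(x)
        if hypot(x(k) - x(last), y(k) - y(last)) < minDist
            keep(k) = false;   % too close to the last kept point: drop it
        else
            last = k;          % far enough: keep it and advance
        end
    end
    xr = x(keep);
    yr = y(keep);
end
```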
To deal with this while still keeping the number of points low, I wrote a new point reduction algorithm which is much more effective (but can sometimes be slower). The results (the red curve now has 116 points):

It is clear that the distortion is gone.
But there is still one more disadvantage that exists in both algorithms. When one converts a batch of figures to TikZ, one usually wants uniform quality across all reduced diagrams. Sometimes this is not possible (at least not without extra effort), because minimumPointsDistance typically has to be set differently for each figure: the result depends on the size of the intervals [min_y, max_y] and [min_x, max_x] as well as on the number of points in the figure.

So, to overcome this problem and allow minimumPointsDistance to be set to one specific value uniformly for all figures, producing reduced versions of uniform quality, normalisation is used: minimumPointsDistance now "internally" refers to the actual size of the printed-on-paper figure. You can download the modified matlab2tikz.m file from here, along with instructions.
NOTE that exhaustive tests were NOT made, so be careful!