So far I have used TikZ/PGFplots to plot small portions of (simulated) signals, draw schemes, etc., but when it comes to actually measured signals (e.g. sampled at 10 kHz), you easily end up with millions of points. Very quickly you run into the TeX capacity exceeded, sorry [main memory size=54000000] error. The obvious workarounds are either to increase the memory limit (not really recommended) or to use externalization (only useful if you render multiple images), but neither is a practical solution for raw data.
It works fairly well with 100,000 (1e5) rows (each containing an x and y value) but breaks if you go much larger. However, assuming 10 kHz, this translates to a signal of 10 s maximum...
Q: Is there an efficient way to plot large signals in TeX?
MWE: (assuming the data in a file data.csv with two columns x and y)
\documentclass{article}
\usepackage{pgfplots}
\begin{document}
\begin{tikzpicture}
\begin{axis}
\addplot [no marks] table [x=x, y=y, col sep=comma] {data.csv};
\end{axis}
\end{tikzpicture}
\end{document}
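For completeness: pgfplots itself ships a coordinate filter, each nth point, that discards rows while reading the table. This keeps memory bounded, but it decimates uniformly, so a narrow spike that falls between kept samples can vanish. A minimal variant of the MWE above (the factor 100 is an arbitrary choice):

\begin{tikzpicture}
\begin{axis}
% keep only every 100th row of data.csv; uniform decimation,
% so sharp peaks between kept samples may disappear
\addplot [no marks, each nth point=100, filter discard warning=false]
  table [x=x, y=y, col sep=comma] {data.csv};
\end{axis}
\end{tikzpicture}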
For the file data.csv: take this file here, or use the following snippet of MATLAB code to reproduce the file yourself:
x = (1:1e6).';                                                % 1e6 sample indices
y = awgn( ((x-1/3*length(x))/length(x)*10).^2, 20,'measured');% parabola + noise at 20 dB SNR
writetable(table(x,y),'data.csv');
I was wondering if there is a script to downsample the data while automatically keeping the most significant points. In particular, if I want to zoom in on a part of the data, it needs the original resolution in that area...
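As a starting point for such a script, here is a minimal sketch of min/max decimation in Python: every bucket of samples contributes its extrema, so spikes survive that a uniform "every nth point" filter would average away. The function name, signature, and bucket count are my own invention, not an established API; the output can be written back to a much smaller CSV for pgfplots to consume.

```python
def minmax_downsample(xs, ys, n_buckets=2000):
    """Reduce (xs, ys) to at most 2 * n_buckets points by keeping the
    minimum and maximum of each bucket, so peaks are not lost."""
    n = len(xs)
    if n <= 2 * n_buckets:                 # already small enough
        return list(zip(xs, ys))
    out = []
    size = n / n_buckets                   # samples per bucket (float)
    for b in range(n_buckets):
        lo = int(b * size)
        hi = min(int((b + 1) * size), n)
        if lo >= hi:
            continue
        i_min = min(range(lo, hi), key=lambda i: ys[i])
        i_max = max(range(lo, hi), key=lambda i: ys[i])
        # emit extrema in x order; a bucket may contribute 1 or 2 points
        for i in sorted({i_min, i_max}):
            out.append((xs[i], ys[i]))
    return out
```

This does not preserve original resolution when zooming, of course; for that, one would re-run the script on the zoomed x range and recompile.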
Comments:
- With knitr they could be really tikz plots, like here, here or here. – Fran Mar 27 '20 at 17:43
- pgfplots has a whole subsection on this: 6.1 Memory Limits of TEX, but it is also true that there are limitations and LaTeX is not necessarily made to plot so large numbers of samples. – Mar 27 '20 at 18:58
- pgfplots =) – Max Mar 28 '20 at 13:55
- ...the knitr library in RStudio (compiling is done in RStudio, because the file needs to be saved as some kind of markdown file, right?). Is it advisable to compile (it's called "knit" there) the whole document in RStudio, or would I rather compile the figures separately and include them in the original LaTeX document (similar to the way I would call tikzexternalize)? – Max Apr 01 '20 at 16:36
- ...the cache=TRUE option in R chunks. – Fran Apr 01 '20 at 18:40
- ...pgfplot in the first place). Writing and compiling an .Rnw document with RStudio is indeed easy once one gets along with R. However, I am still encountering a fatal out of memory error when RStudio calls pdfLaTeX: *54000001 words of memory out of 54000000*. I guess there is no way around generating figures separately. – Max Apr 02 '20 at 14:13
- matlab2tikz fails with too much data, so you just print or export the figure and plot axes on top of the image. – Max May 10 '20 at 06:12
- It is matlab2tikz rather than the LaTeX compiler that fails because of memory limitations. And if it doesn't fail, you will get a somewhat useless, incredibly heavy PDF, which is in contradiction with the principle of vector graphics, which is to get compact yet precise drawings. I really believe this problem is not specific to TeX and friends and just stems from a misuse of vector graphics. – BambOo May 10 '20 at 12:17