I have got a problem with my RAM here: it is not able to hold the data I want to plot. Is there any solution to avoid that "shadowing" of my data-set?

Concretely, I deal with Digital Signal Processing and have to use a high sample rate. My framework (GNU Radio) saves the values in binary to avoid using too much disk space. I need the plot to be zoomable and interactive.

Is there any optimization potential to this, or other software or a programming language (like R or so) which can handle larger data-sets? Actually I want much more data in my plots, but I have no experience with other software. GNUplot fails with a similar approach to the following.

```python
import struct

import matplotlib.pyplot as plt

"""
Plots a cfile.

cfile - IEEE single-precision (4-byte) floats, IQ pairs, binary
txt   - index,in-phase,quadrature in plaintext

Note: directly plotting with numpy results in shadowed functions.
"""

def unpack_set(input_filename, output_filename):
    index = 0
    out = open(output_filename, 'w')
    with open(input_filename, 'rb') as f:
        byte = f.read(4)                        # read 1. column of the vector
        while byte:
            floati = struct.unpack('f', byte)   # write value of 1. column to a variable
            byte = f.read(4)                    # read 2. column of the vector
            floatq = struct.unpack('f', byte)   # write value of 2. column to a variable
            byte = f.read(4)                    # next row of the vector and read 1. column
            out.write("%d,%f,%f\n" % (index, floati[0], floatq[0]))
            index += 1
    out.close()
    return output_filename

def plot_set(fname):
    plt.plotfile(fname, cols=(0, 1))            # index vs. in-phase
    plt.show()

unpacked_file = unpack_set("test01.cfile", "test01.txt")  # reformats output (precision configuration here)
plot_set(unpacked_file)
```

Something like `plt.swap_on_disk()` could cache the stuff on my SSD ;)

So your data isn't that big, and the fact that you're having trouble plotting it points to issues with the tools. Matplotlib has lots of options and the output is fine, but it's a huge memory hog, and it fundamentally assumes your data is small. As an example of the scale involved: a test set written out with `scipy.io.numpyio.fwrite(fd, data.size, data)` generates a file of size ~229MB, which isn't all that big, but you've expressed that you'd like to go to even larger files, so you'll hit memory limits eventually. Let's concentrate on non-interactive plots first.
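Since `scipy.io.numpyio.fwrite` has long since been removed from SciPy, here is a minimal sketch of producing a test file of that size with plain NumPy; `ndarray.tofile()` writes the same raw bytes. The 20-million-row, three-column float32 shape and the `bigdata.bin` filename are assumptions, picked only because 20e6 × 3 × 4 bytes matches the quoted ~229MB.

```python
# Minimal sketch: write a ~229MB raw float32 test file with plain NumPy.
# Assumed shape: 20e6 rows x 3 columns of float32 (20e6 * 3 * 4 B ~ 229 MiB).
import numpy as np

npts = 20_000_000
data = np.random.uniform(0, 1, (npts, 3)).astype(np.float32)

with open("bigdata.bin", "wb") as fd:
    data.tofile(fd)   # headerless native-endian float32 stream

# reading it back is a one-liner:
# np.fromfile("bigdata.bin", dtype=np.float32).reshape(-1, 3)
```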
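Back on the question's side, the byte-by-byte `struct` loop and the plaintext round-trip are the main speed and memory sinks. Below is a sketch of a leaner path, assuming the `.cfile` follows GNU Radio's usual convention of interleaved single-precision I/Q pairs, which NumPy reads directly as `complex64`; with `np.memmap` the OS pages samples in on demand, so the capture never has to fit in RAM.

```python
# Sketch: read the .cfile without struct loops or a plaintext intermediate.
# Assumption: interleaved float32 I/Q pairs, i.e. NumPy's complex64.
import numpy as np

# load everything at once (fine while the file fits in RAM) ...
iq = np.fromfile("test01.cfile", dtype=np.complex64)

# ... or memory-map it: slices are only read from disk when touched
iq = np.memmap("test01.cfile", dtype=np.complex64, mode="r")

in_phase   = iq.real   # "1. column" of the original loop
quadrature = iq.imag   # "2. column"
print(iq.size, in_phase[:10])
```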
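For the non-interactive plots the answer turns to, one standard workaround, my addition rather than anything from the original text, is to decimate before matplotlib ever sees the data: a per-bucket min/max envelope keeps short spikes visible while plotting only a few thousand vertices.

```python
# Sketch: overview-plot millions of samples as a min/max envelope.
# The 2000-bucket count is an arbitrary choice, not from the source.
import numpy as np
import matplotlib.pyplot as plt

def envelope(y, nbins=2000):
    """Per-bucket (min, max) of y for a cheap zoomed-out plot."""
    n = (len(y) // nbins) * nbins          # drop the ragged tail
    buckets = np.asarray(y[:n]).reshape(nbins, -1)
    return buckets.min(axis=1), buckets.max(axis=1)

iq = np.memmap("test01.cfile", dtype=np.complex64, mode="r")
lo, hi = envelope(iq.real)
plt.fill_between(np.arange(len(lo)), lo, hi)
plt.xlabel("bucket")
plt.ylabel("in-phase")
plt.show()
```

Zooming then means recomputing the envelope over the selected sample range, which is essentially what interactive tools built for large data sets do under the hood.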