I'm currently writing a script that reads in a CSV file of data generated by a digital oscilloscope and then manipulates the data (FFT) prior to plotting it. The files I'm reading in are usually around 100 MB. I also need the flexibility to let the user specify whether or not the file has a header. Historically, I've had good results with the importdata function: it seems to handle large data files well, and it lets me specify a number of header lines to skip over.
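For reference, the read path currently looks roughly like the sketch below (variable names such as `pathname`, `filename`, and `num_headers` match the call shown in the error later in this post; the `uigetfile` prompt is my assumption about how the file gets selected):

```matlab
% Hypothetical sketch of the current read path: the user picks a file and
% supplies how many header lines to skip before the numeric data starts.
[filename, pathname] = uigetfile('*.csv', 'Select scope capture');
num_headers = 1;  % user-supplied; 0 if the file has no header

% importdata returns a struct (with .data and .textdata fields) when header
% lines are skipped, or a plain numeric matrix when there are none.
inData = importdata([pathname filename], ',', num_headers);
if isstruct(inData)
    raw = inData.data;
else
    raw = inData;
end
```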
Unfortunately, I am currently plagued by an odd issue where some of my coworkers using the script get memory errors of the ilk below when running the script:
Error using importdata (line 213)
Unable to load file. Use TEXTSCAN or FREAD for more complex formats.

Error in FFT_current_harmonics_V17 (line 181)
inData = importdata([pathname filename], ',', num_headers);

Caused by:
    Error using fileread (line 36)
    Out of memory. Type HELP MEMORY for your options.
They're running on computers similar to mine (actually with more memory than my own), so I'm not certain why they run out of memory when I don't. Other coworkers, including myself, have no problem running the script.
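One diagnostic I could try, since the error itself points at HELP MEMORY: compare what MATLAB's memory function reports on a failing machine versus a working one. This is just a comparison sketch (memory is Windows-only), but a smaller largest-possible-array on the failing machine would suggest address-space fragmentation rather than a lack of installed RAM:

```matlab
% On Windows, MEMORY reports how much memory MATLAB can actually allocate.
% Comparing these numbers on a failing vs. a working machine can show
% whether the difference is available address space rather than total RAM.
userview = memory;
fprintf('Max possible array: %.0f MB\n', userview.MaxPossibleArrayBytes / 1e6);
fprintf('Available memory:   %.0f MB\n', userview.MemAvailableAllArrays / 1e6);
```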
Has anyone run into an issue like this before, or does anyone know what might be happening? Everyone likes using this script when it works, but it's a real pain for me to maintain less flexible workarounds (e.g., the load function) for a few people rather than having everyone use the same script version.
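For what it's worth, the error message itself suggests TEXTSCAN, which streams the file through a handle instead of slurping the whole thing into memory the way fileread does, while still honoring a user-specified header count. A rough sketch of what that could look like (the two-column '%f%f' format is purely an assumption, e.g. time and voltage; the real scope files may have more columns):

```matlab
% Possible lower-memory alternative: textscan reads through a file handle
% rather than loading the entire file as one string, and still supports
% skipping a user-specified number of header lines.
fid = fopen(fullfile(pathname, filename), 'r');
C = textscan(fid, '%f%f', 'Delimiter', ',', 'HeaderLines', num_headers);
fclose(fid);
raw = [C{:}];  % assemble the cell array of columns into one numeric matrix
```

I'd still rather understand why importdata fails only on some machines before switching everyone over, though.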