Hello,
I am running MATLAB R2013b on Windows 7. I have 8 GB of RAM, and I set the swap file to 20 GB.
I am trying to read a relatively large tab-separated txt file. The size of the file on disk is a little over 2 GB. There are 6 columns and approx. 64 million rows. The entries are mixed (strings and numbers, with some missing values).
At this point I am using:
textscan(fid,repmat('%s',1,6),'delimiter','\t');
It has been running for about 4 hours now, using about 6.5 GB of RAM.
1. I would like to know how I can estimate the time it takes to read the file and the size of the output.
2. After it is done I would like to extract the numerical values from the resulting cell matrix and save that to a .mat file. Any idea how long that would take?
3. Is there a better way of doing this? It would be great if I could extract from the file a matrix containing only the numerical values, with everything else set to NaN.
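For context, here is a rough sketch of what I have in mind for point 3: reading the file in fixed-size chunks and using str2double, which returns NaN for non-numeric strings. The filename and chunk size are placeholders.

```matlab
% Sketch (assumed filename 'data.txt'): chunked read, non-numeric -> NaN.
fid = fopen('data.txt');
chunkSize = 1e6;                 % rows per chunk; tune to available memory
data = [];
while ~feof(fid)
    C = textscan(fid, repmat('%s',1,6), chunkSize, 'Delimiter', '\t');
    block = nan(numel(C{1}), 6);
    for k = 1:6
        block(:,k) = str2double(C{k});   % strings/missing become NaN
    end
    data = [data; block]; %#ok<AGROW>
end
fclose(fid);
save('data.mat', 'data', '-v7.3');       % -v7.3 needed for variables > 2 GB
```

I am not sure this is the fastest approach, so any suggestions are welcome.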
Thanks!