I have large files that contain interleaved complex data. I read chunks of them at a time for processing. It seems that there is no way to read this efficiently: I can't find a way to do it without a copy that shouldn't really be necessary. Here are some things that I tried (y.fid points to a file opened with fopen(), y.type is a valid precision string):
% method 1: read into a vector, deinterleave with complex()
tic
cData = fread(y.fid, numSamp.*2, y.type);
t1 = toc;
tic
cData = complex(cData(1:2:end), cData(2:2:end));
t2 = toc;
The above method is slower than it has to be. I think the strided indexing really slows down the copy into the new cData.
% method 2: read into a 2xN array, use complex() and row indexing
tic
cData = fread(y.fid, [2 numSamp], y.type);
t3 = toc;
tic
cData = complex(cData(1,:), cData(2,:)).';
t4 = toc;
The above method is indeed faster. It still requires a copy.
% method 3: read into a 2xN array, combine with a complex multiply
tic
cData = fread(y.fid, [2 numSamp], y.type);
t5 = toc;
tic
cData = cData.' * [1; 1j];
t6 = toc;
The above is the slowest of them all, presumably because the matrix multiply does real floating-point arithmetic on top of the copy.
I tried memmapfile, but it's inappropriate for very large files. I also looked into MEX files. A MEX routine has to create a new complex array for every read, though it could probably fill that array efficiently. The problem is that allocating new memory on every call makes this worse than an fread into an existing array of the correct size.
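For what it's worth, since R2018a MATLAB stores complex arrays interleaved in memory (re,im,re,im), which matches the on-disk layout here, so a MEX routine can fread the file bytes straight into the output array with no separate deinterleave pass (the per-call allocation the question mentions still remains). A minimal, untested sketch assuming single-precision samples; the name read_complex and its calling convention are made up for illustration, and it must be compiled with the interleaved-complex API enabled (mex -R2018a read_complex.c):

```c
/* read_complex.c - hypothetical sketch, not the accepted answer.
 * usage from MATLAB: c = read_complex(filename, numSamp)
 */
#include <stdio.h>
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    char fname[1024];
    if (nrhs != 2)
        mexErrMsgIdAndTxt("read_complex:nrhs", "expected filename and numSamp");
    mxGetString(prhs[0], fname, sizeof(fname));
    size_t numSamp = (size_t)mxGetScalar(prhs[1]);

    /* Allocate the complex result once per call. Its interleaved storage
       layout already matches the file layout, so one fread fills it
       directly with no strided copy. */
    plhs[0] = mxCreateNumericMatrix(numSamp, 1, mxSINGLE_CLASS, mxCOMPLEX);
    mxComplexSingle *buf = mxGetComplexSingles(plhs[0]);

    FILE *fp = fopen(fname, "rb");
    if (fp == NULL)
        mexErrMsgIdAndTxt("read_complex:open", "cannot open %s", fname);
    size_t got = fread(buf, sizeof(mxComplexSingle), numSamp, fp);
    fclose(fp);
    if (got != numSamp)
        mexErrMsgIdAndTxt("read_complex:short", "short read");
}
```

Seeking to the current chunk (fseek on the FILE*, or passing an offset argument) would be needed to reproduce the chunked reads in the question.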
Thanks in advance.
Best Answer