Hey,
Working on processing a large amount of data (1779945×13). The code I have now grabs the data from text files I've created to compile the data from multiple spreadsheets, then the second loop (the one with the problem) averages the columns for every row of data. I know you can sometimes use vectorization to replace a for loop, but I'm not sure how to apply that here. It takes over an hour to run now; any thoughts on improving speed?
clc; clear;
names = ["TM45AA.txt"; "TM45AB.txt"; "TM45AC.txt"; "TM45AC.txt"; ...
         "TM45AD.txt"; "TM45AE.txt"; "TM45AF.txt"; "TM45TA.txt"; "TM45TB.txt"; ...
         "TM45TC.txt"; "TM45TD.txt"; "TM45TE.txt"; "TM45TF.txt"];
TM45Data = zeros(1779945, 13);

fid = fopen('cTime.txt', 'rt');
cTime = fscanf(fid, '%f');
fclose(fid);

for kk = 1:13
    fid = fopen(names(kk), 'rt');
    TM45Data(:,kk) = fscanf(fid, '%f');
    fclose(fid);
end

TM45Avg = [];
for ii = 1:1779945
    grab = TM45Data(ii,:);
    avg = mean(grab);
    TM45Avg = [TM45Avg; avg];
end
Best Answer
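The row-averaging loop is slow for two reasons: it calls `mean` 1.78 million times, and it grows `TM45Avg` by one element every iteration, which forces MATLAB to reallocate and copy the array each time. Both go away if you let `mean` operate along the second dimension in a single call. A sketch of the replacement (assuming `TM45Data` has been filled as in your code):

```matlab
% Vectorized replacement for the entire second loop.
% mean(X, 2) averages across the columns of every row at once,
% returning a 1779945x1 column vector -- no loop, no growing array.
TM45Avg = mean(TM45Data, 2);
```

This single line should run in a fraction of a second, since the hour-long runtime comes almost entirely from repeatedly reallocating `TM45Avg` rather than from the averaging itself. If file reading is also a bottleneck, `readmatrix` (R2019a and later) is often faster and simpler than `fopen`/`fscanf` for plain numeric text files, e.g. `TM45Data(:,kk) = readmatrix(names(kk));`, though that depends on your file layout.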