MATLAB: Memory errors with large numbers of CSV files

Tags: error, memory, save

I am running a model which produces thousands of CSV files that I need to read into MATLAB. This particular run generated 27,178 files.
After 18,839 files, MATLAB gave me an 'out of memory' error. Could anybody suggest a more effective way of coding this so that all of the files can be included?
Error using readtable (line 216)
Out of memory. Type "help memory" for your options.
filelist = dir('*.csv'); %read all the files in the selected folder
num_files = length(filelist); %record how many files have been found
[~, index] = natsort({filelist.name}); %sort the files into proper numerical order (1,2,3)
filelist = filelist(index);
particledata = cell(length(filelist), 1); %create a cell array the same length as the number of files in one column
%for all the files found in the specified folder, read the tables of data and fill the cell array 'particledata' with the data from each .csv file
for a = 1:num_files
    particledata{a} = readtable(filelist(a).name);
end
%% calculate how many particles leave the rice pile
%for each .csv file, calculate the number of particles after a certain y coordinate
ymax = -0.13;
for b = 1:length(particledata)
    %save all the rows in the 6th column (y-coordinates) of each cell as a new variable y
    y = particledata{b}(:,6);
    %use the function table2array to turn the format of the data from a table to an array
    y_array = table2array(y);
    %sum the total number of grains leaving the rice pile in each cell, and save into a new variable 'grains'
    grains(b) = sum(y_array < ymax);
end

Best Answer

Extract the wanted value while reading each file; there is no need to store the complete tables in memory.
grains = zeros(1, num_files); % Pre-allocation
for a = 1:num_files
    T = readtable(filelist(a).name);  % read one file at a time
    y_array = table2array(T(:, 6));   % y coordinates are in the 6th column
    grains(a) = sum(y_array < ymax);  % count grains past the y threshold
end
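
If the files are plain numeric CSVs and you are on R2019a or newer, a further option is readmatrix, which skips building a table entirely and so has less per-file overhead. This is only a minimal sketch, assuming the y coordinates are still in column 6 and that ymax is defined as in your script:

grains = zeros(1, num_files);  % Pre-allocation
for a = 1:num_files
    M = readmatrix(filelist(a).name);  % numeric matrix, no table overhead
    grains(a) = sum(M(:, 6) < ymax);   % count grains past the y threshold
end

In either version only one file's worth of data is held in memory at a time, so the full 27,178-file run should no longer exhaust memory.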