MATLAB: How to prevent MATLAB from running out of memory when running a large number of functions with EVAL that generate a large amount of data in MATLAB 7.8 (R2009a)

clear, eval, functions, leak, MATLAB, memory

I was trying to prevent a memory leak in a MATLAB program that loads a large amount of data from a group of sequential files structured as MATLAB script files. The variables read in overwrote the values previously in memory, but a memory leak still occurred. I cannot use 'clear all', since I need to retain some information in memory, and clearing each variable used by my MATLAB code individually did not work.
For example, use the code below to process the data file attached to this solution:
%% test_loading.m
num_files = 900;
for loops = 1:num_files
    if loops >= 2
        % Copy the previous data file to the next name in the sequence
        old_file = ['test_data', num2str(loops-1), '.m'];
        new_file = ['test_data', num2str(loops), '.m'];
        copyfile(old_file, new_file);
    end
    file_name = ['test_data', num2str(loops), '.m'];
    name_length = length(file_name);
    % Strip the '.m' extension and execute the script with EVAL
    eval(file_name(1:name_length-2));
    disp(['Analyzing ', file_name]);
    if loops > 3
        delete(old_file);
    end
end

Best Answer

To prevent a memory leak in such a situation, call 'clear functions' periodically in your code. This is necessary because MATLAB caches information about each program file it executes, and calling 'clear functions' releases this cached memory.
For example, you can modify the previous code example as follows:
%% test_loading.m
num_files = 900;
for loops = 1:num_files
    if loops >= 2
        % Copy the previous data file to the next name in the sequence
        old_file = ['test_data', num2str(loops-1), '.m'];
        new_file = ['test_data', num2str(loops), '.m'];
        copyfile(old_file, new_file);
    end
    file_name = ['test_data', num2str(loops), '.m'];
    name_length = length(file_name);
    % Strip the '.m' extension and execute the script with EVAL
    eval(file_name(1:name_length-2));
    disp(['Analyzing ', file_name]);
    if loops > 3
        delete(old_file);
    end
    % Release MATLAB's cached information about executed files
    clear functions;
end
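
Note that 'clear functions' is broad: it also clears all other compiled functions and their persistent variables. If that is undesirable, a narrower variation is to clear only the one script that was just executed, using the functional form of 'clear' with the script's name. The snippet below is a sketch of that idea, assuming the same test_dataN.m naming pattern as the loop above:

% Sketch (assumes file_name holds 'test_dataN.m' as in the loop above)
script_name = file_name(1:end-2);   % strip the '.m' extension
eval(script_name);                  % run the data script
clear(script_name);                 % clear only this script's cached info

This leaves other compiled functions, and any persistent state they hold, untouched, at the cost of having to know which name to clear on each iteration.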