MATLAB: Using Parallel computing with external software

Tags: external software, genetic algorithm, MATLAB, objective function, parallel computing

Hi. I'm trying to use parallel computing with a function that calls external software (Ansys). I'm running an optimization with the genetic algorithm, but I keep receiving an error. I start the pool with 3 cores.
main code:
clc; clear;
termino=1; % just a parameter
funcao = @(x) Eval_v1_3(x,termino); % objective function with 3 variables
LB = [0.7 1.0 0.5]; % lower bounds
UB = [3.0 3.0 0.7]; % upper bounds
options = optimoptions('ga','MaxGenerations',20,'PopulationSize',10,'UseParallel',true); % sets PARALLEL
x = ga(funcao,3,[],[],[],[],LB,UB,[],options);
objective function:
function val = Eval_v1_3(x,termino)
dlmwrite('param_otim.txt',[x(1) x(2) x(3) termino],'precision','%.5f'); % writes the individual to be read in the middle of Ansys execution
!C:\"Program Files"\"ANSYS Inc"\v150\ansys\bin\winx64\ANSYS150 -b -i "code.txt" -o "output.txt" -np 1 % call Ansys and execute an algorithm that reads param_otim.txt
T_A=load('temps_monit_A.txt','-ascii'); T_B=load('temps_monit_B.txt','-ascii'); % reads the values generated by the simulation
rp = 1; t_sol1 = 1467; t_sol2 = 840; % just parameters
val = rp*sqrt(((t_sol1 -T_A(1))^2 + (t_sol1 -T_A(2))^2 + (t_sol2-T_B(1))^2 + (t_sol2-T_B(2))^2 )); % objective function
dlmwrite('monitoramento.txt',[x T_A T_B -val],'delimiter','\t','-append'); % writes every individual and result for monitoring purpose
end
error that appears:
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
Can I discover which file could not be accessed? Is the problem in Ansys or MATLAB? Is there a better way to code the problem (still using GA)?
It works without parallel, but I would like to speed it up with parallel computing.
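So far I only see the generic message above; one way I could try to see which call is failing is to run the ANSYS command through system() instead of the ! escape, so the exit status and console output of each call can be inspected. A minimal sketch, assuming the same command line as in the function above:
% Debugging sketch: system() returns the exit status and captured output,
% which the ! escape does not.
cmd = '"C:\Program Files\ANSYS Inc\v150\ansys\bin\winx64\ANSYS150" -b -i "code.txt" -o "output.txt" -np 1';
[status, cmdout] = system(cmd);
if status ~= 0
    warning('ANSYS exited with status %d: %s', status, cmdout); % locked-file messages appear in cmdout
end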
Thanks!

Best Answer

This week I came back to work on this again, and after some research and tests, this is the answer for the main file:
clc; clear;
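% A parallel pool has to be open before the spmd block; if none is open yet,
% it can be started here (3 workers, matching the test described above):
if isempty(gcp('nocreate'))
    parpool(3);
end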
spmd
mkdir(sprintf('worker%d', labindex));
copyfile('file1.db',sprintf('worker%d/',labindex));
copyfile('file2.mp',sprintf('worker%d/',labindex));
copyfile('file3.txt',sprintf('worker%d/',labindex));
cd(sprintf('worker%d', labindex));
end
termino = 1; % just a parameter (clear above removes it, so it has to be set again)
funcao = @(x) ZF_Eval_v1_3_new(x,termino); % objective function with 3 variables
LB = [0.7 1.0 0.5];
UB = [3.0 3.0 0.7];
options = optimoptions('ga','MaxGenerations',20,'PopulationSize',10,'UseParallel',true); % sets PARALLEL
x = ga(funcao,3,[],[],[],[],LB,UB,[],options);
and the function file follows the same pattern as before:
function val = ZF_Eval_v1_3_new(x,termino)
dlmwrite('param_otim.txt',[x(1) x(2) x(3) termino],'precision','%.5f'); % writes the individual to be read in the middle of Ansys execution
!C:\"Program Files"\"ANSYS Inc"\v150\ansys\bin\winx64\ANSYS150 -b -i "file3.txt" -o "output.txt" -np 1
T_A=load('temps_monit_ZF.txt','-ascii');
rp = 1; t_sol1 = 1467; t_sol2 = 840; % just parameters
val = rp*sqrt(((t_sol1 -T_A(1))^2 + (t_sol1 -T_A(2))^2 + (t_sol2-T_A(3))^2 + (t_sol2-T_A(4))^2 )); % objective function
end
The execution time was reduced by 40%. As Walter said, the key was to mkdir and copy the files to the workers' folders. I used 3 workers for testing, so it created 3 folders, copying all the files I needed into each folder.
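One thing to keep in mind: after ga finishes, the workers are still cd'd into their worker folders. A possible cleanup, assuming the 3 workers used here:
spmd
    cd('..'); % move each worker back out of its folder
end
for k = 1:3
    rmdir(sprintf('worker%d', k), 's'); % 's' also deletes the folder contents
end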
Hope it's useful for someone!