MATLAB: Bug using LinearModel.fit in a loop

for loop, linearmodel.fit, regression

Every time I try using my data with LinearModel.fit it works perfectly, but when I try to use it in a loop I always end up with a matrix (of the correct size) of empty cells… Somebody with MATLAB R2013a on a Mac said my script was running perfectly, but it doesn't seem to run on a PC with R2012a. Thanks in advance!
V = nchoosek(6,4)
for i = 1:15
A{i} = V(i,:)
end
for i = 1:15
C{i} = horzcat(A{1,i}{1,1},A{1,i}{1,2},A{1,i}{1,3},A{1,i}{1,4})
end
for i = 1:15
DATA{i} = [C{1,i} Y] %Y is my dependent (response) variable
end
for i = 1:15
REG{i} = LinearModel.fit(DATA{1,i})
end

Best Answer

Well this part doesn't work:
V = nchoosek(6,4)
for i = 1:15
A{i} = V(i,:)
end
because V is a scalar: nchoosek(6,4) returns the binomial coefficient, 15, not the combinations themselves, so V(i,:) fails as soon as i > 1. Start using the debugger; run the following:
dbstop if error
Then run your script. It will stop and you'll be able to inspect what's causing the error.
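
For completeness, here is a minimal sketch of what the loop was presumably meant to do, assuming the six candidate predictors are stored as columns of an n-by-6 matrix X and Y is the n-by-1 response. X is an assumed name; the question never shows how the predictors are actually stored.

% Sketch only: X (n-by-6 predictors) and Y (n-by-1 response) are assumed names
combos = nchoosek(1:6, 4);   % 15-by-4 matrix of column-index combinations;
                             % nchoosek(6,4) is just the scalar 15
nModels = size(combos, 1);
REG = cell(nModels, 1);      % preallocate the result cell array
for i = 1:nModels
    % two-argument form: predictor matrix first, response vector second
    REG{i} = LinearModel.fit(X(:, combos(i,:)), Y);
end

Passing the vector 1:6 to nchoosek returns the combinations themselves rather than their count, which is the distinction the original script tripped over.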