So I'm doing my homework, and I stumbled across a problem…
my assignment is to write a function that takes a matrix A as input and returns an orthogonal basis for Col(A). The professor wants me to use rref and the jb row vector it returns to build a basis for Col(A). From there we have to use Gram-Schmidt to make the basis orthogonal.
Here is my function:
function [B] = GramSchmidt(A)
[m, n] = size(A);
[U, jb] = rref(A);
x = length(jb);
B = zeros(m, x);
for i = 1:x
    C(:,i) = A(:, jb(i));
end
B = C;
for i = 2:x
    for j = 1:i-1
        B(:,i) = C(:,i) - dot(C(:,i), B(:,j)) / dot(B(:,j), B(:,j)) * B(:,j);
    end
end
end
It works up until the nested for loops, specifically the inner one. Tracing it the way a computer would (for i=2, j=1; for i=3, j=1; for i=3, j=2), the problem shows up when j actually loops: I think the assignment redefines B(:,i) from scratch on each pass instead of subtracting from the result of the previous pass.
How do I fix this?
Here is an example I tried with the current function; the result is obviously wrong at B(:,3):
[B] = GramSchmidt([1 1 0 2; 1 1 1 2; 2 9 4 4])

B =

    1.0000   -2.3333    1.0000
    1.0000   -2.3333    2.0000
    2.0000    2.3333    3.0000
Should I make another matrix to hold the values, then sum it all up at the end and add it to B?
Best Answer
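The inner loop overwrites B(:,i) starting from C(:,i) on every pass of j, so only the last projection is ever subtracted. One way to fix it is to subtract from the running value B(:,i) instead; since B(:,1) through B(:,i-1) are already mutually orthogonal by the time column i is processed, subtracting the projections one at a time like this accumulates to the full projection onto their span. A sketch of the corrected function (same structure as yours, only the inner assignment changed):

function B = GramSchmidt(A)
% Orthogonal basis for Col(A) via rref pivot columns + Gram-Schmidt
[~, jb] = rref(A);      % jb holds the pivot-column indices
x = length(jb);
C = A(:, jb);           % pivot columns of A form a basis for Col(A)
B = C;
for i = 2:x
    for j = 1:i-1
        % subtract from the running B(:,i), not from C(:,i),
        % so each projection removal accumulates
        B(:,i) = B(:,i) - dot(C(:,i), B(:,j)) / dot(B(:,j), B(:,j)) * B(:,j);
    end
end
end

On the example above this gives B(:,3) = [-0.5; 0.5; 0], which is orthogonal to the first two columns. No extra accumulator matrix is needed; keeping C as the untouched copy of the pivot columns and updating B in place is enough.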