MATLAB: How to pass DICOM images to a k-means clustering algorithm

Tags: Image Processing Toolbox, image segmentation, Statistics and Machine Learning Toolbox, Wavelet Toolbox, xyz

My code is given below. I need to do k-means segmentation on a .dcm image. When I run the code I don't get any errors, but MATLAB gets stuck. Is there any mistake in my code that would make MATLAB hang?
img = dicomread('C:\Users\Desktop\test\hii.dcm');
imshow(img, []);
info = dicominfo('C:\Users\Desktop\test\hii.dcm');
disp(info)

% Add salt & pepper noise, then remove it with a median filter
noise_img = imnoise(img, 'salt & pepper', 0.02);
imshow(noise_img, []);
denoise_img = medfilt2(noise_img);
figure; imshow(denoise_img, []); title('Denoised Image');

% Single-level Haar wavelet decomposition and reconstruction
[out1, out2, out3, out4] = dwt2(denoise_img, 'haar');
trans_img = [out1 out2; out3 out4];
figure; imshow(trans_img, []); title('Trans\_Image');
inv_trans_img = idwt2(out1, out2, out3, out4, 'haar');
figure; imshow(inv_trans_img, []); title('inv\_trans\_img');

% Cluster the reconstructed image into k intensity groups and
% display each cluster in its own figure
k = 5;
[Centroid, new_cluster] = kmeans_algorithm(inv_trans_img, k);
for i_loop = 1:k
    cluster = zeros(size(inv_trans_img));
    pos = find(new_cluster == i_loop);
    cluster(pos) = new_cluster(pos);
    figure; imshow(cluster, []); title('K-means');
    data = cluster;
end
K-means implementation code:
function [Centroid, new_cluster] = kmeans_algorithm(input_image, k)
input_image = double(input_image);
new_image = input_image;
input_image = input_image(:);

% Shift intensities so the minimum maps to 1, then build a histogram
min_val = min(input_image);
input_image = round(input_image - min_val + 1);
length_input_image = length(input_image);
max_val = max(input_image) + 1;
hist_gram = zeros(1, max_val);
hist_gram_count = zeros(1, max_val);
for i = 1:length_input_image
    if input_image(i) > 0
        hist_gram(input_image(i)) = hist_gram(input_image(i)) + 1;
    end
end
IDX = find(hist_gram);
hist_length = length(IDX);

% Initialise centroids evenly across the intensity range, then iterate:
% assign each occupied histogram bin to its nearest centroid, and
% recompute each centroid as the weighted mean of its assigned bins
Centroid = (1:k) * max_val / (k + 1);
while true
    old_Centroid = Centroid;
    for i = 1:hist_length
        new_val = abs(IDX(i) - Centroid);
        hist_val = find(new_val == min(new_val));
        hist_gram_count(IDX(i)) = hist_val(1);
    end
    for i = 1:k
        loop_count = find(hist_gram_count == i);
        Centroid(i) = sum(loop_count .* hist_gram(loop_count)) / sum(hist_gram(loop_count));
    end
    if Centroid == old_Centroid, break; end
end

% Label every pixel with the index of its nearest centroid
length_input_image = size(new_image);
new_cluster = zeros(length_input_image);
for i = 1:length_input_image(1)
    for j = 1:length_input_image(2)
        new_val = abs(new_image(i, j) - Centroid);
        loop_count = find(new_val == min(new_val));
        new_cluster(i, j) = loop_count(1);
    end
end
Centroid = Centroid + min_val - 1;

Best Answer

You have
while true
    old_Centroid = Centroid;
    for i = 1:hist_length
        new_val = abs(IDX(i) - Centroid);
        hist_val = find(new_val == min(new_val));
        hist_gram_count(IDX(i)) = hist_val(1);
    end
    for i = 1:k
        loop_count = find(hist_gram_count == i);
        Centroid(i) = sum(loop_count .* hist_gram(loop_count)) / sum(hist_gram(loop_count));
    end
    if Centroid == old_Centroid, break; end
end
This is an infinite loop that terminates only if every newly computed centroid value is bit-for-bit identical to the corresponding old centroid value. That kind of programming is vulnerable to numeric round-off (see http://matlab.wikia.com/wiki/FAQ#Why_is_0.3_-_0.2_-_0.1_.28or_similar.29_not_equal_to_zero.3F) and to overshoot/undershoot problems, where the computed values keep cycling around the "true" solution, landing first on one side of it and then on the other.
You should be testing for equality within a tolerance, and you should have a "fail-safe" maximum iteration check.
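For example, a minimal sketch of such a guarded loop, with illustrative values for max_iterations and tol (neither appears in the original code), might look like this:
max_iterations = 100;   % fail-safe cap (illustrative value)
tol = 1e-6;             % convergence tolerance (illustrative value)
for iteration = 1:max_iterations
    old_Centroid = Centroid;
    for i = 1:hist_length
        new_val = abs(IDX(i) - Centroid);
        hist_val = find(new_val == min(new_val));
        hist_gram_count(IDX(i)) = hist_val(1);
    end
    for i = 1:k
        loop_count = find(hist_gram_count == i);
        bin_total = sum(hist_gram(loop_count));
        if bin_total > 0   % skip empty clusters rather than divide by zero
            Centroid(i) = sum(loop_count .* hist_gram(loop_count)) / bin_total;
        end
    end
    % Converged: every centroid moved by less than the tolerance
    if all(abs(Centroid - old_Centroid) < tol)
        break;
    end
end
The empty-cluster guard is an extra precaution: if a cluster ever loses all of its histogram bins, the original update divides by zero and produces NaN centroids, and since NaN never compares equal to anything, the original equality test could then never succeed.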
(I did not check whether that is generally correct code for k-means clustering: the above are reasons why the implementation could fail even if the code is otherwise correct.)
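As a side note: since the Statistics and Machine Learning Toolbox appears in the question's tags, its built-in kmeans function, which handles convergence tolerances and iteration limits internally, is one way to sidestep the hand-written loop entirely. A sketch, reusing the inv_trans_img and k variables from the question:
% One observation per pixel; kmeans expects an n-by-p data matrix
X = double(inv_trans_img(:));
[idx, C] = kmeans(X, k, 'MaxIter', 200, 'Replicates', 3);
new_cluster = reshape(idx, size(inv_trans_img));   % per-pixel cluster labels
Centroid = C.';                                    % centroids as a row vector
'Replicates' reruns the clustering from several random starts and keeps the best result, which matters because k-means is sensitive to initialisation.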