MATLAB: Finding row-averaged magnitude spectrum

digital signal processing, fft

Hi, I have a 512 x 512 image matrix and I'm trying to plot its row-averaged magnitude spectrum. The following code is my first attempt at solving this problem. Can anybody tell me whether it is correct? Any help would be appreciated.
load blur.mat
for i = 1:512
    rows = blur(i,:);
    rowsFFT = fft(rows,512);
end
plot(abs(rowsFFT))

Best Answer

Assuming that I understand what you want to do, I would do something like this:
blur = rand(512); % Random Matrix As A Stand-In For Your Image
L = size(blur,1); % Row Size
rowsFFT = fft(blur)/L; % Compute ‘fft’ Of Columns, Normalise By Row Size
Fs = 1; % Sampling Frequency (1/Pixel)
Fn = Fs/2; % Nyquist Frequency
Fv = linspace(0, 1, fix(L/2)+1)*Fn; % Frequency Vector
Iv = 1:numel(Fv); % Index Vector
figure
plot(Fv, abs(rowsFFT(Iv,:))*2)
grid
xlabel('Frequency (pixel^{-1})')
ylabel('Amplitude (Units)')
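Note that fft operates down the columns of a matrix. If you want the spectrum of each row instead, as your loop computes, one option (my suggestion, using the dimension argument of fft) is:
rowsFFT = fft(blur,[],2)/size(blur,2); % Compute ‘fft’ Along Each Row, Normalise By Row Length
figure
plot(Fv, abs(rowsFFT(:,Iv)).'*2) % One Curve Per Row (Square Matrix, So ‘Fv’ Is Unchanged)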
Alternatively, to remove the constant (DC) offset first:
rowsFFT = fft(blur-mean(blur))/L; % Remove Constant Offset (Column Means), Compute ‘fft’ Of Columns, Normalise By Row Size
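And if by 'row-averaged magnitude spectrum' you mean the magnitude spectra of all the rows averaged into a single curve (my reading of your title), a minimal sketch:
rowsFFT = fft(blur,[],2)/size(blur,2); % ‘fft’ Of Each Row, Normalised By Row Length
avgMag = mean(abs(rowsFFT),1); % Mean Magnitude Across All 512 Rows
figure
plot(Fv, avgMag(Iv)*2)
grid
xlabel('Frequency (pixel^{-1})')
ylabel('Average Magnitude')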
If this is not what you want, please provide more information.