p_avg=1; % average optical power
R=1; % photodetector sensitivity
Rb=1; % normalized bit rate
Tb=1/Rb; % bit duration
df=Rb/100; % spectral resolution
f=0:df:5*Rb; % frequency vector
x=f*Tb; % normalized frequency
temp1=(sinc(x)).^2;
a=R*p_avg;
p=(a^2*Tb).*temp1;
%p(1)=p(1)+((a^2)*Tb)*(sinc(0)^2)*(1/Tb); % delta function at DC
p=p/(((p_avg*R)^2)*Tb); % power normalization by energy per bit
plot(p)
The graph I get is:
But the graph I should get is:
So, which parameter should I change to get the correct scale on the x-axis?
Best Answer
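In MATLAB, `plot(p)` plots `p` against its sample index `1..numel(p)`, so the x-axis runs to about 500 instead of `5*Rb`. No parameter in the spectral computation needs to change; plot against the frequency vector instead: `plot(f, p)`, or `plot(x, p)` for the normalized axis `f*Tb`. As a sketch, the script re-created in NumPy (`np.sinc` is the normalized sinc, `sin(pi*x)/(pi*x)`, the same convention as MATLAB's `sinc`) shows the y-data are already correct and only the x-axis choice is at fault:

```python
import numpy as np

# Re-creation of the MATLAB script with NumPy.
p_avg = 1.0        # average optical power
R = 1.0            # photodetector sensitivity
Rb = 1.0           # normalized bit rate
Tb = 1.0 / Rb      # bit duration
df = Rb / 100      # spectral resolution
f = np.arange(0.0, 5 * Rb + df / 2, df)  # frequency vector, 0..5*Rb
x = f * Tb                                # normalized frequency f*Tb

a = R * p_avg
p = (a**2 * Tb) * np.sinc(x)**2           # continuous part of the NRZ PSD
p = p / ((p_avg * R)**2 * Tb)             # normalize by energy per bit

# The PSD equals 1 at f = 0 and has its first null at f = Rb; plotting
# p against f (rather than the bare sample index) puts that null at 1.
print(round(p[0], 6))                            # 1.0
print(round(f[np.argmin(p[: len(f) // 3])], 6))  # 1.0
```

So in the original script, replacing `plot(p)` with `plot(f,p)` (optionally adding `xlabel('f \cdot T_b')`) gives the expected 0-to-`5*Rb` x-axis.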