Hello, I have a simple problem, but I need to improve execution efficiency when a probability density function has thousands of data points. A small example:
close all
clear

% example
x  = [1 3 5 9 15];   y  = [4 7 8 12 6];
xn = [3 9 16];       yn = [10 3 4];
[xnew, ynew] = add2pdf(x, y, xn, yn);
xnew
ynew

% function
function [xnew, ynew] = add2pdf(x, y, xn, yn)
% Sum two probability density functions defined on (possibly different) supports.
[nf, nc] = size(xn);
xold = x; yold = y;
xx = [x xn];
[xv, xp] = sort(xx, 'ascend');
dxv = diff(xv);
xnew = [xv(1) xv(find(dxv > 0) + 1)];   % unique sorted support
ynew = zeros(1, length(xnew));
for i = 1:size(y, 2)
    pynew = find(xnew == xold(i));   % <==== this instruction (find) is very slow, but I need the indices
    ynew(pynew) = y(i);
end
for i = 1:length(xn)
    px = find(xnew == xn(i));
    ynew(px) = ynew(px) + yn(i);
end
return
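One way to avoid the per-element `find` entirely: `unique` already returns, for every abscissa in the merged input, its index within the sorted unique support, and `accumarray` then sums the y-values that land on the same support point. A sketch of a drop-in replacement for `add2pdf` (same signature as above; assumes the stock MATLAB `unique` and `accumarray` functions, not tested on large data):

% Vectorized alternative: no loops, no find
function [xnew, ynew] = add2pdf(x, y, xn, yn)
    % ic is such that xnew(ic) == [x xn], i.e. it maps every original
    % abscissa to its position in the sorted unique support
    [xnew, ~, ic] = unique([x xn]);
    yy = [y yn];
    % accumarray sums all yy entries that share the same support index
    ynew = accumarray(ic(:), yy(:)).';
end

With the example data above this should give xnew = [1 3 5 9 15 16] and ynew = [4 17 8 15 6 4], matching the loop version, and its cost is dominated by one sort instead of a `find` per data point.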
Best Answer