Hi,
I'm a student, here's my problem:
I have to create a matrix "P" with N = 100 rows and T = 25 columns that describes the value of a portfolio over 25 years. Each year the portfolio holder's spending should be based on the arithmetic mean of the portfolio's value over the last three years. This runs inside a for loop. Below I show my code.
mu = 0.07;
sigma = 0.15;
N = 100;   % number of simulations
T = 25;    % years

% Simulation of expected returns
randn('state',10);
r = [ones(N,1)*NaN, mu + randn(N,T)*sigma];

% Matrices containing portfolio values and spending values
P = NaN*ones(N,T+1);   % portfolio matrix
P(1:N,1) = 2000;       % initial value of portfolio
E = NaN*ones(N,T+1);   % expenditure matrix
spendingrate = 0.05;
The problem is here:
for t = 1:T
    if t < 3
        % First two years: spend a fraction of last year's value
        P(1:N,t+1) = max(0, P(1:N,t).*exp(r(1:N,t+1)) - spendingrate*P(1:N,t));
        E(1:N,t+1) = min(P(1:N,t).*exp(r(1:N,t+1)), spendingrate*P(1:N,t));
    else
        % From year 3 on: spend a fraction of the mean of the last three years.
        % Note the index range t-2:t (the original t:t-2 is empty in MATLAB),
        % and "else t > 3" is replaced by a plain "else".
        P(1:N,t+1) = max(0, P(1:N,t).*exp(r(1:N,t+1)) - spendingrate*mean(P(1:N,t-2:t),2));
        E(1:N,t+1) = min(P(1:N,t).*exp(r(1:N,t+1)), spendingrate*mean(P(1:N,t-2:t),2));
    end
end
I'm not able to make it run: for each year it should compute the mean of the portfolio values of the past three years, and without that I can't compute the value of the portfolio. Maybe it's because the values I need to average are generated in the same loop. Can you help me fix this?
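For comparison, here is a minimal sketch of the same rolling-mean spending logic in Python/NumPy (the variable names mirror the MATLAB question; the clipping of the index range in the first two years is done with `max(0, t-2)` instead of an if/else branch). This is only an illustration of the indexing, not a definitive implementation:

```python
import numpy as np

rng = np.random.default_rng(10)
mu, sigma = 0.07, 0.15
N, T = 100, 25              # number of simulations, years
spendingrate = 0.05

r = mu + rng.standard_normal((N, T)) * sigma   # simulated yearly returns
P = np.zeros((N, T + 1))                       # portfolio values
E = np.zeros((N, T + 1))                       # expenditures
P[:, 0] = 2000.0                               # initial portfolio value

for t in range(T):
    grown = P[:, t] * np.exp(r[:, t])          # value after this year's return
    # Mean over the last (up to) three known years; in the first two years
    # fewer columns are available, so the slice is clipped at column 0.
    base = P[:, max(0, t - 2):t + 1].mean(axis=1)
    E[:, t + 1] = np.minimum(grown, spendingrate * base)
    P[:, t + 1] = np.maximum(0.0, grown - E[:, t + 1])
```

The key point carried over to MATLAB is that the slice must run from the older year to the newer one (`t-2:t`), since a range written the other way around is empty.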