Hi,
I'm trying to find the cumulative RMS of a vector. (The data happens to be model error – labelled devCoM for deviation of centre of mass. The calculation should get less accurate as time proceeds, so cumulative error is important.)
I can find the cumulative sum by taking
cumsum(devCoM)
…but this fluctuates about zero.
I can find the RMS of the entire dataset by taking
sqrt(mean(devCoM.^2))
…but this doesn't reveal the expected increase in error over time.
Is there some way I could combine these methods?
Thanks,
Frederick
Best Answer
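(A sketch of the usual approach, not necessarily the accepted answer: combine the two by taking the cumulative sum of squares, dividing by the number of samples so far, and taking the square root. In MATLAB that would be one line, e.g. `sqrt(cumsum(devCoM.^2) ./ (1:numel(devCoM)).')`. The same idea in plain Python, with a made-up error series for illustration:)

```python
import math

def cumulative_rms(values):
    """Running RMS: sqrt of the mean of squares over the first k samples."""
    out = []
    total = 0.0
    for k, v in enumerate(values, start=1):
        total += v * v              # cumulative sum of squares
        out.append(math.sqrt(total / k))  # divide by sample count, then root
    return out

# Hypothetical deviation series; the running RMS tracks how the
# typical error magnitude evolves as more samples accumulate.
dev_com = [0.1, -0.2, 0.3, -0.4]
print(cumulative_rms(dev_com))
```

Each element of the result is the RMS of the data up to that point, so a model whose error grows over time shows up as an increasing curve rather than a sum that fluctuates about zero.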