Hi,
How do I average 1 s long samples of a data set? I have data in the form t = [0 1 2 3…5000…10000 …10000], d = [5 5 5 4… 1… 6…8],
where t represents milliseconds and d holds a specific value for each millisecond. I'd like to have t in seconds, so the values of d should be averaged over each second. I mean, e.g., for t = 0 s, d = 5; for t = 1 s, d should be the average of the values from 1 to 999 ms, and so on.
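To make it concrete, here is a rough sketch of what I think I need (assuming t runs from 0 to 9999 ms with exactly one d value per millisecond; the names binIdx, dAvg and tSec are just placeholders I made up). I'm not sure this is the right way to do it:

% group the millisecond samples into 1 s bins and average each bin
t = 0:9999;                                     % time stamps in milliseconds
d = randi(10, 1, numel(t));                     % placeholder data, one value per millisecond
binIdx = floor(t/1000) + 1;                     % 1-based index of the 1 s bin each sample falls into
dAvg = accumarray(binIdx(:), d(:), [], @mean);  % mean of d within each bin
tSec = (0:numel(dAvg)-1).';                     % time axis in seconds

Is accumarray the right tool here, or is there a simpler way, e.g. reshaping d to a 1000-by-N matrix and taking the column means?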
Thanks for the help.
Best Answer