[Physics] Noise amplitude increases as sample rate increases

noise, signal processing

I am testing the material properties of some very low stiffness materials.

I'm using a force probe connected to software, with a sensitivity of about a hundredth of a gram of force.

Now, what's interesting is that when my sample rate is 1/sec I get a smooth line, as expected. If I increase the sample rate, the line gets more jagged, with higher-frequency content – still as expected.

BUT the increased sample rate also increases the amplitude of the noise/signal, and this I don't understand.

I hope I explained it well enough…

But basically, why does an increased sample rate increase the amplitude of the signal rather than just its frequency?

Best Answer

From your description of the experiment (please correct me if my assumptions are wrong), it sounds like your apparatus applies a controlled stress to the sample (and the sensor), and the resulting strain in the sensor is measured. Whenever the stress applied by your apparatus changes, it will take some time for the system to settle to its new equilibrium. It could be that sampling at $1\ \mathrm{Hz}$ allows plenty of time for equilibration, but when sampling at higher frequencies you are recording the oscillations of the system before it has settled.
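To illustrate the idea, here is a minimal sketch assuming the probe behaves like an under-damped second-order (mass-spring-damper) system; the natural frequency and damping ratio below are made-up values, not measurements from your setup. Sampling the same step response at 1 Hz lands mostly after the ringing has died away, while faster sampling resolves the transient oscillation, which shows up as extra apparent noise amplitude:

```python
import numpy as np

# Hypothetical second-order sensor model: a step in applied stress excites
# a damped oscillation before the reading settles (assumed parameters).
f_n = 5.0      # assumed natural frequency of the probe/sample system [Hz]
zeta = 0.1     # assumed damping ratio (under-damped)
w_n = 2 * np.pi * f_n
w_d = w_n * np.sqrt(1 - zeta**2)

def step_response(t):
    """Normalized strain reading after a unit step in applied stress."""
    return 1 - np.exp(-zeta * w_n * t) * (
        np.cos(w_d * t) + zeta / np.sqrt(1 - zeta**2) * np.sin(w_d * t)
    )

# Sample the same physical response at 1 Hz and at 50 Hz.
t_slow = np.arange(1.0, 6.0, 1.0)     # 1 sample/s, first sample at t = 1 s
t_fast = np.arange(0.02, 6.0, 0.02)   # 50 samples/s

slow = step_response(t_slow)
fast = step_response(t_fast)

# The 1 Hz samples land after the ringing has largely decayed, so they trace
# a smooth curve; the 50 Hz samples resolve the oscillation itself, which
# appears as extra "noise" amplitude around the settled value.
print("1 Hz  peak deviation from final value:", np.max(np.abs(slow - 1)))
print("50 Hz peak deviation from final value:", np.max(np.abs(fast - 1)))
```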

One way to test this would be to run the experiment without changing the applied force, recording the strain at various sampling rates, and checking whether the noise spectrum still depends on the sample rate in the way you describe. If it does, the noise is a result of the frequency dependence of the electronics. If it does not, the noise comes from the physical behavior of the sample.
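As a sketch of that analysis, the snippet below computes the RMS deviation and a Welch noise spectrum for a constant-force recording; `noise_summary` is a hypothetical helper, and the synthetic white-noise data stands in for your real probe recordings. If the per-sample noise is broadband electronic noise, the PSD level stays roughly flat while the RMS grows with the recorded bandwidth; a mechanical resonance of the sample would instead appear as a peak at a fixed frequency regardless of sample rate:

```python
import numpy as np
from scipy.signal import welch

def noise_summary(readings, fs):
    """Summarize a constant-force recording: RMS deviation and noise PSD.

    readings : 1-D array of force/strain samples taken with no change in
               the applied force (units are whatever the probe reports).
    fs       : sample rate in Hz.
    """
    detrended = readings - np.mean(readings)
    rms = np.sqrt(np.mean(detrended**2))
    # Welch periodogram of the fluctuations about the mean reading.
    freqs, psd = welch(detrended, fs=fs, nperseg=min(256, len(detrended)))
    return rms, freqs, psd

# Example with synthetic data in place of real probe recordings:
rng = np.random.default_rng(0)
for fs in (1, 10, 100):                          # sample rates to compare [Hz]
    fake = 0.01 * rng.standard_normal(fs * 60)   # 60 s of simulated white noise
    rms, freqs, psd = noise_summary(fake, fs)
    print(f"fs = {fs:3d} Hz  RMS = {rms:.4f}  median PSD = {np.median(psd):.2e}")
```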
