Solved – How to simulate a specific signal-to-noise ratio

Tags: r, signal-processing, simulation

I have a vector of simulated data:

data = c(0.47, 0.45, 0.30, 1.15, 0.82, 0.38, 0.51, 1.36, 1.72, 0.36)

I've been adding noise to this by generating random numbers centered at 0 with different standard deviations:

noise = rnorm(10, mean = 0, sd = 0.1)
data_wNoise = data + noise

I've been setting the standard deviation arbitrarily (between 0.001 and 1.5). Is there a better way to do this by specifying a target signal-to-noise ratio? I don't know anything about the power of the signal in the data.

Best Answer

Given a model $$ Y = f(X) + \varepsilon $$

the signal-to-noise ratio can be defined as (ref. ESL10):

$$ \frac{Var(f(X))}{Var(\varepsilon)} $$

To generate data with a specific signal-to-noise ratio, draw standard normal errors and rescale them so their variance is var(data) divided by the target ratio:

signal_to_noise_ratio = 4
data = c(0.47, 0.45, 0.30, 1.15, 0.82, 0.38, 0.51, 1.36, 1.72, 0.36)
noise = rnorm(length(data))  # generate standard normal errors
# scale factor k chosen so that var(data) / var(k * noise) equals the target ratio
k = sqrt(var(data) / (signal_to_noise_ratio * var(noise)))
data_wNoise = data + k * noise
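As a quick sanity check, the empirical ratio var(data) / var(k * noise) recovers the target exactly by construction, whatever the random draw was. A minimal sketch (the `set.seed(1)` call is my addition, only for reproducibility):

```r
signal_to_noise_ratio = 4
data = c(0.47, 0.45, 0.30, 1.15, 0.82, 0.38, 0.51, 1.36, 1.72, 0.36)
set.seed(1)  # assumption: fixed seed so the draw is reproducible
noise = rnorm(length(data))
k = sqrt(var(data) / (signal_to_noise_ratio * var(noise)))
# k^2 * var(noise) = var(data) / signal_to_noise_ratio, so the ratio below is 4
var(data) / var(k * noise)
```

Note that this matches the sample variances by construction; the *population* SNR of the noise generator still fluctuates from draw to draw because var(noise) is itself random.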