MATLAB: How do I do random sampling of a linear equation using Monte Carlo simulation

linear model, monte carlo

Here, I will explain the problem briefly. I have a linear equation with slope and intercept parameters. I want to generate 1000 random samples of 'X' from an equation, say Y = m*X + c, using Monte Carlo simulation.

Best Answer

I don't see why this is difficult. (Actually, I do, but there are ways to resolve the problem.) I think your difficulty is one of definition.
Fact: A line has infinite extent. (Agreed? I hope so.) It extends to -inf at one end and +inf at the other.
Fact: You cannot randomly sample from any set without specifying the distribution for that sampling. (This is something that many people don't seem to appreciate. Sorry, but accept it.) Typically, when people ask this question, they think they can use some default distribution. Is uniform good for you?
Oops. The problem here is that it makes no sense to define a uniform distribution over an infinite interval. So you cannot uniformly sample X over the interval (-inf, inf). Sorry, but that is undefined.
One solution is to choose some distribution that is defined over an infinite domain: Gaussian, exponential, double exponential, etc. So generate X randomly from that distribution, then compute Y = m*X + c.
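Here is a minimal sketch of that first approach, assuming a standard normal distribution for X and arbitrary example values for the slope and intercept:

m = 2;          % slope (arbitrary example value)
c = 5;          % intercept (arbitrary example value)
N = 1000;       % number of random samples
X = randn(N,1); % N draws from a standard normal (mean 0, std 1)
Y = m*X + c;    % corresponding points on the line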
The second simple solution is to pick some finite interval for X, generate X uniformly on that interval (READ THE HELP FOR RAND), then again compute Y = m*X + c.
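And a minimal sketch of that second approach, assuming an arbitrary finite interval [a,b] for X:

a = -10; b = 10;         % interval for X (arbitrary example values)
m = 2;  c = 5;           % slope and intercept (arbitrary example values)
N = 1000;                % number of random samples
X = a + (b-a)*rand(N,1); % N uniform draws on [a,b]
Y = m*X + c;             % corresponding points on the line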
Again, so what's the problem? This is simple and easy to do, as long as you try to solve a well-defined problem. Try to solve a poorly defined problem, and you will have problems.