Least Squares – How to Use OLS or NLS on Very Small Sample Sizes

bootstrap · least-squares · small-sample

Right now I am facing the problem of handling a very small sample: 8 observations with 24 variables.

I would like to start with a simple OLS regression of one dependent variable on the 23 independent variables, and then use those coefficients as starting values for NLS estimation. The NLS estimates will be used in further studies.

The problem, obviously, is that I have more parameters than observations. I tried to bootstrap my observations and run simple OLS with a handful of parameters, using between 50 and 50,000 bootstrap repetitions to improve my results; in fact, however, the significance of some coefficients even decreased.
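Roughly what I tried looks like this sketch (the data here is synthetic, since I don't know yet how to post my table; the variable names and the choice of 3 regressors are just for illustration):

```python
# Sketch of a pairs bootstrap around OLS on a small sample.
# The 8x3 data below is synthetic; the real dataset is not shown.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_vars = 8, 3  # a "handful" of regressors, far fewer than 24
X = rng.normal(size=(n_obs, n_vars))
y = X @ np.array([1.0, -0.5, 2.0]) + rng.normal(scale=0.1, size=n_obs)

def ols(X, y):
    """Plain OLS via least squares (intercept omitted for brevity)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

n_boot = 1000
boot_betas = np.empty((n_boot, n_vars))
for b in range(n_boot):
    # Resample observation rows with replacement and refit OLS
    idx = rng.integers(0, n_obs, size=n_obs)
    boot_betas[b] = ols(X[idx], y[idx])

# Bootstrap standard errors of the coefficients
se = boot_betas.std(axis=0, ddof=1)
```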

Is it even possible to derive somewhat significant coefficients here? What else can I do? I am new here, so I don't yet know how to include a table with example data. Sorry.

Thanks for any help,
TheJoez

Best Answer

Your problem is not new, and a whole chapter ("High-Dimensional Problems") of this book is dedicated to such cases, where the number of variables $p$ is much larger than the number of observations $N$. Numerous approaches are possible.

In my opinion, the two simplest ways to regularize the problem are the Lasso and Ridge Regression, which consist of adding a penalty to the standard least-squares objective: Ridge minimizes $\|y - X\beta\|_2^2 + \lambda\|\beta\|_2^2$, while the Lasso uses the penalty $\lambda\|\beta\|_1$, which shrinks many coefficients exactly to zero and thus also performs variable selection.
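A minimal sketch of both estimators on a $p > N$ problem, assuming scikit-learn is available (the 8×24 data, the penalty strengths `alpha`, and the "true" coefficients are made up for illustration):

```python
# Ridge and Lasso on a p > n problem; synthetic stand-in for an 8x24 dataset.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(42)
n, p = 8, 24                      # more variables than observations
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:3] = [2.0, -1.5, 1.0]  # only a few truly nonzero effects
y = X @ true_beta + rng.normal(scale=0.1, size=n)

# Ridge shrinks all coefficients toward zero but keeps them nonzero;
# Lasso sets many coefficients exactly to zero (variable selection).
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

n_selected = int(np.sum(lasso.coef_ != 0))
```

In practice the penalty strength `alpha` should be chosen by cross-validation (e.g. `RidgeCV` / `LassoCV`), although with only 8 observations even cross-validation will be very noisy.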
