Solved – the computational cost of gradient descent vs linear regression

gradient descent, multiple regression, optimization, time complexity

I know the computational cost of the closed-form solution of linear regression (the normal equation) is $O(n^3)$, but I can't find a similar cost analysis for gradient descent.

There are some similar questions here where people talk about how gradient descent is more efficient, but they present formulas that are not in $O(\cdot)$ form and do not say where their information comes from.

So to reiterate, I am looking for the computational complexity of gradient descent in the form $O(\cdot)$, ideally something smaller than $O(n^3)$.

It's possible I'm thinking about this wrong and there is no big-$O$ comparison. If so, please let me know. Thank you.

Best Answer

The computational cost of gradient descent depends on the number of iterations it takes to converge. According to Stanford University's Machine Learning course on Coursera, the complexity of gradient descent is $O(kn^2)$, where $k$ is the number of iterations and $n$ is the number of features. The closed form costs $O(n^3)$ because it requires solving the $n \times n$ system $X^\top X\,\theta = X^\top y$ (effectively inverting $X^\top X$), so when $n$ is very large it is recommended to use gradient descent instead of the closed form of linear regression.

Source: https://www.coursera.org/learn/machine-learning/supplement/bjjZW/normal-equation
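
To make the comparison concrete, here is a minimal NumPy sketch that fits the same linear model both ways (the sizes `m` and `n`, the learning rate `lr`, and the iteration count `k` are illustrative assumptions, not values from the course):

```python
import numpy as np

# Illustrative sizes (assumptions): m training examples, n features.
m, n = 5000, 20
rng = np.random.default_rng(0)
X = np.c_[np.ones(m), rng.normal(size=(m, n))]  # design matrix with intercept column
theta_true = rng.normal(size=n + 1)
y = X @ theta_true + 0.1 * rng.normal(size=m)

# Normal equation: solve the (n+1) x (n+1) system X^T X theta = X^T y.
# Forming X^T X costs O(m n^2); solving the system costs O(n^3).
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent: each iteration computes the full gradient,
# which costs O(m n); over k iterations the total is O(k m n).
def gradient_descent(X, y, lr=0.1, k=1000):
    theta = np.zeros(X.shape[1])
    m = len(y)
    for _ in range(k):
        grad = X.T @ (X @ theta - y) / m  # gradient of mean squared error
        theta -= lr * grad
    return theta

theta_gd = gradient_descent(X, y)
print("max |difference|:", np.max(np.abs(theta_ne - theta_gd)))
```

One way to read the $O(kn^2)$ figure: if the products $X^\top X$ and $X^\top y$ are precomputed once, the gradient becomes $(X^\top X\,\theta - X^\top y)/m$ and each iteration drops to $O(n^2)$, giving $O(kn^2)$ total for the iterative part.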