# Solved – Model fitting when errors take a Cauchy distribution

Tags: cauchy-distribution, modeling

It's my understanding that the sum of squared errors (SSE) serves as a maximum likelihood estimator when a model's errors are normally distributed. (That is, if you find model parameters that minimize the SSE, they also maximize the likelihood.) However, the error distribution of my model looks much more like a Cauchy distribution. Would minimizing the SSE still result in the maximum likelihood parameter set for my model? If not, what statistic should I look at?

Forgive me if this doesn't make any sense, or I'm missing something simple. Please feel free to link to sources that might help me understand the basics a bit better. Thanks!

If you really wanted maximum likelihood estimates for regression parameters with Cauchy errors, just look at that likelihood: $$L(\beta,\sigma)=\prod_{i=1}^n {\frac{1}{\pi\sigma\left(1+\left(\frac{y_i-\beta^\mathrm{T}x_i}{\sigma}\right)^2\right)}}$$
($y_i$ is the $i$th observation, $x_i$ the vector of predictors, $\sigma$ the scale parameter, and $\beta$ the vector of coefficients.) There's no sufficient statistic of lower dimensionality than the entire dataset, so the likelihood isn't easy to maximize analytically, though numerical optimization is certainly better than brute force. But unless you have some theoretical motivation for assuming specifically Cauchy errors, you might only be able to say the errors follow some fat-tailed distribution; in that situation, some form of robust regression would be worth considering.
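As a concrete illustration of what numerical maximization of that likelihood might look like, here is a sketch using simulated data (the data, starting values, and choice of optimizer are all assumptions for the example, not part of the original answer). It minimizes the negative Cauchy log-likelihood over $(\beta, \log\sigma)$, parameterizing the scale on the log scale to keep $\sigma > 0$:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical simulated data: linear model with Cauchy-distributed errors
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # design matrix with intercept
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.standard_cauchy(n)

def neg_log_lik(params):
    """Negative Cauchy log-likelihood; the last parameter is log(sigma)."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                  # enforces sigma > 0
    r = (y - X @ beta) / sigma
    # -log L = sum of log(pi * sigma * (1 + r^2)), matching the likelihood above
    return np.sum(np.log(np.pi * sigma * (1.0 + r ** 2)))

# Start from the least-squares fit; the Cauchy MLE should improve on it
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
x0 = np.append(beta_ols, 0.0)                  # log(sigma) = 0, i.e. sigma = 1
fit = minimize(neg_log_lik, x0, method="Nelder-Mead")

beta_hat, sigma_hat = fit.x[:-1], np.exp(fit.x[-1])
print("beta_hat:", beta_hat, "sigma_hat:", sigma_hat)
```

Because the Cauchy log-likelihood can be multimodal, a sensible starting point (here, the OLS estimate) matters more than it would for a well-behaved convex problem; trying several starts is a reasonable precaution.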