I happen to have some notes on this question. What follows is the usual approach: it is just multivariate calculus paired with the Cauchy-Riemann equations. I have an idea for an easier way; if it works, I'll post it as a second answer in a bit.
We use polar coordinates $x(r,\theta)=r\cos(\theta)$ and $y(r,\theta)=r\sin(\theta)$ to rewrite $f$ as follows:
$$ f(x(r,\theta),y(r,\theta)) = u(x(r,\theta),y(r,\theta))+iv(x(r,\theta),y(r,\theta)) $$
and introduce the shorthands $F(r,\theta)=f(x(r,\theta),y(r,\theta))$, $U(r,\theta)=u(x(r,\theta),y(r,\theta))$, and $V(r,\theta)=v(x(r,\theta),y(r,\theta))$. We derive the CR-equations in polar coordinates via the chain rule from multivariate calculus,
$$ U_r = x_ru_x + y_ru_y = \cos(\theta)u_x + \sin(\theta)u_y \ \
\text{and} \ \ U_{\theta} = x_{\theta}u_x + y_{\theta}u_y = -r\sin(\theta)u_x + r\cos(\theta)u_y $$
Likewise,
$$ V_r = x_rv_x + y_rv_y = \cos(\theta)v_x + \sin(\theta)v_y \ \
\text{and} \ \ V_{\theta} = x_{\theta}v_x + y_{\theta}v_y = -r\sin(\theta)v_x + r\cos(\theta)v_y $$
We can write these in matrix notation as follows:
$$ \left[ \begin{array}{l} U_r \\ U_{\theta} \end{array} \right] = \left[ \begin{array}{ll} \cos(\theta) & \sin(\theta) \\ -r\sin(\theta) & r\cos(\theta) \end{array} \right]\left[ \begin{array}{l} u_x \\ u_y \end{array} \right] \ \ \text{and} \ \
\left[ \begin{array}{l} V_r \\ V_{\theta} \end{array} \right] = \left[ \begin{array}{ll} \cos(\theta) & \sin(\theta) \\ -r\sin(\theta) & r\cos(\theta) \end{array} \right]\left[ \begin{array}{l} v_x \\ v_y \end{array} \right] $$
Multiply these by the inverse matrix (which exists for $r\neq0$, since the determinant is $r$): $\left[ \begin{array}{ll} \cos(\theta) & \sin(\theta) \\ -r\sin(\theta) & r\cos(\theta) \end{array} \right]^{-1} = \frac{1}{r}\left[ \begin{array}{ll} r\cos(\theta) & -\sin(\theta) \\ r\sin(\theta) & \cos(\theta) \end{array} \right]$ to find
$$ \left[ \begin{array}{l} u_x \\ u_y \end{array} \right] = \frac{1}{r}\left[ \begin{array}{ll} r\cos(\theta) & -\sin(\theta) \\ r\sin(\theta) & \cos(\theta) \end{array} \right]\left[ \begin{array}{l} U_r \\ U_{\theta} \end{array} \right] = \left[ \begin{array}{l} \cos(\theta)U_r - \tfrac{1}{r}\sin(\theta)U_{\theta} \\
\sin(\theta)U_r + \tfrac{1}{r}\cos(\theta)U_{\theta} \end{array} \right] $$
A similar calculation holds for $V$. To summarize:
$$ u_x = \cos(\theta)U_r - \tfrac{1}{r}\sin(\theta)U_{\theta} \ \ \ \ v_x = \cos(\theta)V_r - \tfrac{1}{r}\sin(\theta)V_{\theta} $$
$$ u_y =\sin(\theta)U_r + \tfrac{1}{r}\cos(\theta)U_{\theta} \ \ \ \ v_y =\sin(\theta)V_r + \tfrac{1}{r}\cos(\theta)V_{\theta} $$
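As a sanity check, these inversion formulas can be tested numerically on a concrete function, say $u(x,y)=x^2-y^2$, whose exact partials are $u_x=2x$ and $u_y=-2y$. A minimal Python sketch (the sample point is arbitrary, any $r\neq0$ works):

```python
import math

# u(x, y) = x^2 - y^2 expressed in polar coordinates.
def U(r, th):
    x, y = r * math.cos(th), r * math.sin(th)
    return x * x - y * y

def d(g, t, h=1e-6):        # central finite difference
    return (g(t + h) - g(t - h)) / (2 * h)

r, th = 1.5, 0.8            # arbitrary sample point with r != 0
Ur = d(lambda s: U(s, th), r)
Uth = d(lambda s: U(r, s), th)
ux = math.cos(th) * Ur - math.sin(th) * Uth / r
uy = math.sin(th) * Ur + math.cos(th) * Uth / r
assert abs(ux - 2 * r * math.cos(th)) < 1e-6   # u_x = 2x
assert abs(uy + 2 * r * math.sin(th)) < 1e-6   # u_y = -2y
```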
The CR-equation $u_x=v_y$ yields:
$$ (A.) \ \ \cos(\theta)U_r - \tfrac{1}{r}\sin(\theta)U_{\theta} = \sin(\theta)V_r + \tfrac{1}{r}\cos(\theta)V_{\theta} $$
Likewise the CR-equation $u_y=-v_x$ yields:
$$ (B.) \ \ \sin(\theta)U_r + \tfrac{1}{r}\cos(\theta)U_{\theta} = -\cos(\theta)V_r + \tfrac{1}{r}\sin(\theta)V_{\theta}$$
Multiply (A.) by $r\sin(\theta)$ and $(B.)$ by $r\cos(\theta)$, then subtract the first resulting equation from the second:
$$ \boxed{U_{\theta} = -rV_r} $$
Likewise multiply (A.) by $r\cos(\theta)$ and $(B.)$ by $r\sin(\theta)$, then add the resulting equations:
$$ \boxed{rU_r = V_{\theta}} $$
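As a quick check, these polar CR-equations can be verified numerically for a concrete holomorphic function, say $f(z)=z^2$. A sketch in Python using central finite differences (the sample point is arbitrary):

```python
import cmath

# f(z) = z^2 expressed in polar coordinates; U = Re F, V = Im F.
def F(r, th):
    z = r * cmath.exp(1j * th)
    return z * z

def d(g, t, h=1e-6):                   # central finite difference
    return (g(t + h) - g(t - h)) / (2 * h)

r, th = 1.3, 0.4                       # arbitrary sample point, r != 0
Ur = d(lambda s: F(s, th).real, r)
Uth = d(lambda s: F(r, s).real, th)
Vr = d(lambda s: F(s, th).imag, r)
Vth = d(lambda s: F(r, s).imag, th)
assert abs(r * Ur - Vth) < 1e-6        # r U_r = V_theta
assert abs(Uth + r * Vr) < 1e-6        # U_theta = -r V_r
```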
Finally, recall that $z = re^{i\theta}=r(\cos(\theta)+i\sin(\theta))$ hence
\begin{align} \notag
f'(z) &= u_x+iv_x \\ \notag
&= (\cos(\theta)U_r - \tfrac{1}{r}\sin(\theta)U_{\theta})+i(\cos(\theta)V_r - \tfrac{1}{r}\sin(\theta)V_{\theta}) \\ \notag
&= (\cos(\theta)U_r + \sin(\theta)V_{r})+i(\cos(\theta)V_r - \sin(\theta)U_{r}) \\ \notag &= (\cos(\theta)- i\sin(\theta))U_r + i(\cos(\theta)-i\sin(\theta))V_r \\ \notag
&= e^{-i\theta}( U_r+iV_r) \notag
\end{align}
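The final identity $f'(z)=e^{-i\theta}(U_r+iV_r)$ can likewise be checked numerically, for instance with $f(z)=z^3$, where $f'(z)=3z^2$ is known exactly (a minimal Python sketch; the sample point is arbitrary):

```python
import cmath

# f(z) = z^3 in polar coordinates; F_r approximates U_r + i V_r.
def F(r, th):
    return (r * cmath.exp(1j * th)) ** 3

h = 1e-6
r, th = 1.2, 0.9                               # arbitrary sample point
Fr = (F(r + h, th) - F(r - h, th)) / (2 * h)   # U_r + i V_r
z = r * cmath.exp(1j * th)
# Compare the known derivative 3 z^2 with e^{-i theta}(U_r + i V_r).
assert abs(3 * z ** 2 - cmath.exp(-1j * th) * Fr) < 1e-5
```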
This is a little late, so I'm sure you have come across an answer by now, but I shall write the following nonetheless, as it may help others. First we need to recognize what the Cauchy-Riemann equations tell us. If you derive them (you can look the derivation up; it comes from the definition of the derivative), you will see that they follow from the assumption that the complex function is differentiable at the point of interest. So if a function is differentiable at a point, it satisfies Cauchy-Riemann there. The contrapositive is then also true: if a function does not satisfy Cauchy-Riemann at a point, it is not differentiable there.

Now onto analyticity: for a function to be analytic at a point $P$ in the complex plane, it must be differentiable in a neighborhood of $P$, so Cauchy-Riemann must be satisfied throughout that neighborhood. This is a stricter condition: a function can be differentiable at a point yet fail to be analytic/holomorphic there if it is not differentiable throughout a neighborhood.
So now let's analyze your question. You asked "can a function be analytic if it doesn't satisfy CR?" Let's do a proof by contradiction: assume a function is analytic at a point $P$ but does not satisfy CR at $P$ (I'm making your question slightly more specific). Analyticity means the function is differentiable in a neighborhood of $P$. But if CR fails at $P$, then the function is not differentiable at $P$ (by the contrapositive above), and since every neighborhood of $P$ contains $P$ itself, the function is not differentiable throughout any neighborhood of $P$. This contradicts the assumed analyticity, so we must abandon the assumption. Thus, if a function does not satisfy CR at a point $P$, it cannot be analytic at $P$.
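To see the contrapositive in action, take $f(z)=\bar z$, for which $u_x=1\neq-1=v_y$, so CR fails everywhere; consequently the difference quotient has no direction-independent limit at any point. A minimal Python sketch (the base point is an arbitrary choice):

```python
# f(z) = conj(z) has u_x = 1 and v_y = -1, so CR fails everywhere; by the
# contrapositive it is nowhere differentiable. Numerically, the difference
# quotient depends on the direction of approach:
def f(z):
    return z.conjugate()

z0 = 0.5 + 0.2j                               # arbitrary base point
h = 1e-8
q_real = (f(z0 + h) - f(z0)) / h              # approach along the real axis
q_imag = (f(z0 + 1j * h) - f(z0)) / (1j * h)  # along the imaginary axis
assert abs(q_real - 1) < 1e-6                 # limit 1 in this direction
assert abs(q_imag + 1) < 1e-6                 # limit -1 in this direction
```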
Now suppose your question became "can a function be analytic at $P$ if it fails to satisfy Cauchy-Riemann at some point, or set of points, in every neighborhood of $P$?" Observe that this is as general as the question can be made. The proof is the same as above: assume the function is analytic at $P$ while CR fails at some set of points in every neighborhood of $P$. Then the function is not differentiable at those points, so it is not differentiable throughout any neighborhood of $P$, and we reach the same contradiction as before: we assumed differentiability in some neighborhood of $P$, but the failure of CR shows there is no such neighborhood, so we must abandon the assumption. (The "every" matters here: if CR only fails outside some smaller neighborhood of $P$, the function can still be analytic at $P$.)
So a function can never be analytic at a point $P$ without satisfying the Cauchy-Riemann equations throughout some neighborhood of $P$. That is the ultimate punch line of the demonstration above. I hope this has made things clearer. It is good practice in mathematics to try to prove the claims you make, or the potential answers to the questions you have; it lets you practice what you've learned, and it may reinforce your knowledge or even teach you something new!
Have a great day.
Best Answer
Well, if a function $f$ is holomorphic (i.e. $\mathbb{C}$-differentiable on an open set), then it is real analytic, and in particular it has continuous partial derivatives. Therefore, checking the continuity of the partial derivatives and then the CR-equations is, in some sense, a perfectly general method for determining whether a function is holomorphic or not.
By the way, it is absolutely true that satisfying the CR-equations is not enough: set $f(z)=e^{-z^{-4}}$ for $z\neq0$ and $f(0)=0$. Then $f$ has partial derivatives everywhere, including at the origin, and they satisfy the CR-equations everywhere, but $f$ is not $\mathbb{C}$-differentiable at $0$. Indeed, the partial derivatives are not continuous at $0$.
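A quick numerical look illustrates why this counterexample fails to be $\mathbb{C}$-differentiable at $0$: along the real axis $f\to0$ very fast (consistent with the vanishing partials at the origin), but along the ray $\theta=\pi/4$ we have $z^4=-t^4$, so $|f(z)|=e^{1/t^4}$ blows up and $f$ is not even continuous at $0$. A Python sketch (the value $t=0.3$ is an arbitrary small radius):

```python
import cmath, math

# The counterexample: f(z) = exp(-z^{-4}) for z != 0, f(0) = 0.
def f(z):
    return cmath.exp(-z ** -4) if z != 0 else 0.0

t = 0.3
# Along the real axis: |f(t)| = e^{-1/t^4} is astronomically small.
assert abs(f(t)) < 1e-50
# Along z = t e^{i pi/4}: z^4 = -t^4, so |f(z)| = e^{1/t^4} blows up.
z = t * cmath.exp(1j * math.pi / 4)
assert abs(f(z)) > 1e50
```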
On a more practical side, there are many other ways to determine whether a function is holomorphic. For example, it suffices that the partial derivatives exist (in the weak sense), are locally integrable, and satisfy the CR-equations; or that the function is bounded and holomorphic outside a small enough set. Sums, products, and compositions of holomorphic functions are holomorphic, and so on...