In general, if you can show that your optimization problem has a continuously differentiable objective function and constraint functions, and satisfies a technical hypothesis called a constraint qualification (there are many constraint qualifications in the literature), then the Lagrange multiplier condition is a necessary condition for a point $x^{*}$ to be a local minimum (or maximum).
In other words, if a feasible point $x^{*}$ is a local minimum (maximum), and a constraint qualification holds, then the Lagrange multiplier condition must hold at $x^{*}$.
In this situation, if a constraint qualification holds at every feasible point and you can identify all of the points that satisfy the Lagrange multiplier condition, then (provided a global minimum exists, e.g. when the feasible set is compact) you can simply pick the candidate with the smallest objective value as the global minimum.
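To fix notation, here is the condition on a standard toy problem (purely illustrative, not taken from the discussion above):

```latex
\[
  \min_{x,\,y}\; f(x,y) = x^2 + y^2
  \quad \text{subject to} \quad
  g(x,y) = x + y - 1 = 0.
\]
The condition $\nabla f(x^*) = \lambda \nabla g(x^*)$ reads
$(2x,\,2y) = \lambda(1,\,1)$, so $x = y$; feasibility $x + y = 1$
then gives $x^* = y^* = \tfrac12$ with $\lambda = 1$.
```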
Unfortunately, without a constraint qualification hypothesis, it is possible for a point to be a local extremum (perhaps even a global minimum or maximum) without satisfying the Lagrange multiplier condition! Most discussions of Lagrange multipliers in calculus textbooks gloss over this important technical detail.
One commonly used constraint qualification is the linear independence constraint qualification (LICQ), "the gradients of the constraint functions at $x^{*}$ are linearly independent." The LICQ is satisfied at nearly every feasible point in your problem. You can easily finish this off by taking care of the exception(s).
It's a good exercise to set up a counterexample optimization problem in which a point $x^{*}$ is a global minimum, the gradients of the constraints at $x^{*}$ are linearly dependent, and you can't solve for the Lagrange multipliers.
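For the record, one minimal (and standard) instance of this failure:

```latex
\[
  \min_{x \in \mathbb{R}} \; f(x) = x
  \quad \text{subject to} \quad
  g(x) = x^2 = 0.
\]
The only feasible point is $x^* = 0$, hence it is the global minimum; but
$\nabla g(0) = 0$, so the LICQ fails there, and
$\nabla f(0) = \lambda \nabla g(0)$ becomes $1 = 0$, which no $\lambda$ solves.
```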
You can find proofs of the Lagrange multiplier necessary condition with the LICQ hypothesis in many textbooks on optimization. More advanced books will include proofs with other constraint qualifications and explore the implications between different constraint qualifications.
Proof for $p ≥ 1$
Since $u^p - 1 ≥ p(u - 1)$ for all $u ≥ 0$ (Bernoulli's inequality, valid because $p ≥ 1$), it suffices to prove the result for $p = 1$. That case follows, using $e^t ≥ 1 + t$ twice (as $x^y ≥ 1 + y \ln x$ and as $\frac1y - 1 ≥ \ln \frac1y$), from
$$\frac{x^y}{y} - 1 ≥ \frac{1 + y \ln x}{y} - 1
= \ln x + \frac1y - 1 ≥ \ln x + \ln \frac1y = \ln x - \ln y$$
by cyclic summation over $(x, y) = (x_i, x_{i + 1})$.
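As a quick numerical sanity check of the pairwise bound $\frac{x^y}{y} - 1 \ge \ln x - \ln y$ (illustrative only, not part of the proof), a short script along these lines can be used:

```python
import math
import random

# Check the pairwise bound  x^y / y - 1 >= ln x - ln y  (the p = 1 case)
# on random positive inputs; track the smallest observed gap.
random.seed(0)
min_gap = float("inf")
for _ in range(100_000):
    x = random.uniform(0.01, 10.0)
    y = random.uniform(0.01, 10.0)
    gap = x**y / y - 1.0 - (math.log(x) - math.log(y))
    min_gap = min(min_gap, gap)

print(min_gap)  # stays nonnegative (up to floating-point noise)
```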
Conjectured proof for $p ≥ \frac12$
Since $u^p - 1 ≥ 2p(u^{\frac12} - 1)$ for all $u ≥ 0$ (substitute $v = u^{\frac12}$ and apply Bernoulli's inequality $v^{2p} - 1 ≥ 2p(v - 1)$ with $2p ≥ 1$), it suffices to prove the result for $p = \frac12$. Numerical evidence suggests that
$$\left(\frac{x^y}{y}\right)^{\frac12} - 1 ≥ \frac{\ln x}{2\sqrt[4]{1 + \frac13 \ln^2 x}} - \frac{\ln y}{2\sqrt[4]{1 + \frac13 \ln^2 y}}$$
for all $x, y > 0$. If this is true, cyclic summation yields the desired result.
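The conjectured bound can be probed numerically; the sketch below scans a grid of $(\ln x, \ln y)$ values and records the smallest gap, merely reproducing the kind of numerical evidence referred to above:

```python
import math

def h(t):
    # Right-hand-side building block: t / (2 * (1 + t^2/3)^(1/4)).
    return t / (2.0 * (1.0 + t * t / 3.0) ** 0.25)

# Conjecture: sqrt(x^y / y) - 1 >= h(ln x) - h(ln y) for all x, y > 0.
min_gap = float("inf")
ts = [i / 10.0 for i in range(-30, 31)]  # ln-values in [-3, 3]
for s in ts:
    for t in ts:
        x, y = math.exp(s), math.exp(t)
        gap = math.sqrt(x**y / y) - 1.0 - (h(s) - h(t))
        min_gap = min(min_gap, gap)

print(min_gap)  # no violations observed on this grid
```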
Counterexample for $0 < p < \frac12$
Let $g(x) = \left(\frac{x^{1/x}}{1/x}\right)^p + \left(\frac{(1/x)^x}{x}\right)^p$. Then $g(1) = 2$, $g'(1) = 0$, and $g''(1) = 4p(2p - 1) < 0$, so we have $g(x) < 2$ for $x$ in some neighborhood of $1$. This yields counterexamples for all even $n$:
$$\left(x, \frac1x, x, \frac1x, \dotsc, x, \frac1x\right), \quad x ≈ 1, \quad 0 < p < \frac12.$$
For $n = 3$, the best counterexample seems to be
$$(0.41398215, 0.73186577, 4.77292996), \quad 0 < p < 0.39158477.$$
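Both families of counterexamples are easy to confirm numerically; a minimal check, with $p$ chosen inside the stated ranges:

```python
def term(x, y, p):
    # One summand (x^y / y)^p of the cyclic sum.
    return (x**y / y) ** p

# Even n: the pair (x, 1/x) with x near 1 already dips below its target of 2.
p, x = 0.25, 1.1
pair = term(x, 1.0 / x, p) + term(1.0 / x, x, p)

# n = 3: the triple above, for 0 < p < 0.39158477.
xs = (0.41398215, 0.73186577, 4.77292996)
q = 0.3
total = sum(term(xs[i], xs[(i + 1) % 3], q) for i in range(3))

print(pair < 2, total < 3)
```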
Best Answer
I came up with a proof on Jan 10, 2023.
We have \begin{align*} &(a^3 + b^3 + c^3)^2 - 3(a^4 + b^4 + c^4)\\ ={}& a^6 + b^6 + c^6 + 2a^3b^3 + 2b^3c^3 + 2c^3a^3 - 3(a^4 + b^4 + c^4)\\ \ge{}& a^6 + b^6 + c^6 + (3a^2b^2 - 1) + (3b^2c^2 - 1) + (3c^2a^2 - 1) - 3(a^4 + b^4 + c^4) \tag{1}\\ ={}& a^6 + b^6 + c^6 + 3a^2b^2 + 3b^2c^2 + 3c^2a^2 - 3a^4 - 3b^4 - 3c^4 - 3\\ ={}& a^6 + b^6 + c^6 + 3a^2b^2 + 3b^2c^2 + 3c^2a^2 - 3a^4 - 3b^4 - 3c^4 - 3a^2b^2c^2\\ ={}& (a^2 + b^2 + c^2 - 3)(a^4 + b^4 + c^4 -a^2b^2 - b^2c^2 - c^2a^2)\\ \ge{}& 0 \end{align*} where (1) uses $2a^3b^3 - (3a^2b^2 - 1) = (2ab + 1)(ab - 1)^2 \ge 0$ and its cyclic analogues; the replacement $3 = 3a^2b^2c^2$ and the bound $a^2 + b^2 + c^2 \ge 3\sqrt[3]{a^2b^2c^2} = 3$ (AM-GM) both use the constraint $abc = 1$; and $a^4 + b^4 + c^4 - a^2b^2 - b^2c^2 - c^2a^2 = \frac12\left[(a^2 - b^2)^2 + (b^2 - c^2)^2 + (c^2 - a^2)^2\right] \ge 0$.
We are done.
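The algebra can also be machine-checked: the factorization used above is a polynomial identity in $a, b, c$, and the final inequality can be sampled under the constraint $abc = 1$. A stdlib-only sketch:

```python
import random

def factorization_gap(a, b, c):
    # (Expanded form) minus (factored form); should vanish identically.
    left = (a**6 + b**6 + c**6 + 3*a**2*b**2 + 3*b**2*c**2 + 3*c**2*a**2
            - 3*a**4 - 3*b**4 - 3*c**4 - 3*a**2*b**2*c**2)
    right = ((a**2 + b**2 + c**2 - 3)
             * (a**4 + b**4 + c**4 - a**2*b**2 - b**2*c**2 - c**2*a**2))
    return left - right

random.seed(0)
max_err = 0.0           # identity holds for all a, b, c
min_val = float("inf")  # inequality is tested only where abc = 1
for _ in range(10_000):
    a = random.uniform(0.5, 2.0)
    b = random.uniform(0.5, 2.0)
    c = random.uniform(0.5, 2.0)
    max_err = max(max_err, abs(factorization_gap(a, b, c)))
    c = 1.0 / (a * b)  # enforce abc = 1
    min_val = min(min_val, (a**3 + b**3 + c**3)**2 - 3*(a**4 + b**4 + c**4))

print(max_err, min_val)
```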