Prove $f(x,y) = \frac{x}{y^2}$ is not a convex function on the set $\{(x,y)\in\mathbb{R}^2 : x>0, y>0 \}$ without using the Hessian

convex-optimization, convex-analysis, real-analysis

Summary

Known: $f(x,y) = \frac{x}{y^2}$ is not a convex function on the set $\{(x,y)\in\mathbb{R}^2 : x>0, y>0 \}$. This can be proved by the Hessian of the function.

Questions: How can we prove the non-convexity without using the second-order condition?

Claim: $f(x,y)$ is not a convex function on the set $\{(x,y)\in\mathbb{R}^2 : x>0, y>0 \}$.

By the Hessian of the function:
$$ \nabla^2 f(x,y) = \begin{bmatrix}
0 & -\frac{2}{y^3} \\ -\frac{2}{y^3} & \frac{6x}{y^4}
\end{bmatrix}$$

On the set $\{(x,y)\in\mathbb{R}^2: x>0, y>0 \}$, $\nabla^2f(x,y)$ is indefinite: its determinant equals $-\frac{4}{y^6}<0$, so it has one positive and one negative eigenvalue. By the second-order condition, we conclude that $f(x,y)$ is not convex on the set $\{(x,y)\in\mathbb{R}^2: x>0, y>0 \}$ (indeed, it is not convex on any neighborhood of any point of this set).
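As a quick sanity check of this claim (not the Hessian-free proof the question asks for), one can evaluate the Hessian at a sample point, say $(x,y)=(1,1)$, and confirm it has eigenvalues of both signs; a minimal numpy sketch:

```python
import numpy as np

# Hessian of f(x, y) = x / y**2, evaluated at the sample point (x, y) = (1, 1)
x, y = 1.0, 1.0
H = np.array([[0.0,          -2.0 / y**3],
              [-2.0 / y**3,   6.0 * x / y**4]])

eigvals = np.linalg.eigvalsh(H)   # symmetric matrix, so eigenvalues are real
print(eigvals)                    # approx [-0.606, 6.606]: one negative and one positive
                                  # eigenvalue, i.e. the Hessian is indefinite here
```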

Attempt:

I try to show the non-convexity by the basic definition, i.e., $\exists\, (x_1,y_1),(x_2,y_2)\in\mathbb{R}^2$ with $x_i>0$, $y_i>0$, and $\theta\in(0,1)$ such that $$ f(\theta (x_1,y_1) + (1-\theta)(x_2,y_2)) > \theta f(x_1,y_1) + (1-\theta) f(x_2,y_2).$$

Hence, it suffices to verify that the following expression is negative for some $(x_1,y_1),(x_2,y_2)$ and $\theta$:
$$ \theta f(x_1,y_1) + (1-\theta) f(x_2,y_2) - f(\theta x_1 + (1-\theta) x_2, \theta y_1 + (1-\theta) y_2) = \theta \frac{x_1}{y_1^2} + (1-\theta)\frac{x_2}{y_2^2} - \frac{\theta x_1 + (1-\theta)x_2}{(\theta y_1 + (1-\theta)y_2)^2}$$
However, I lost direction here and don't know how to proceed. I presume I'll need some inequality to do so, but I couldn't think of a proper one.
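For what it's worth, a small numerical sketch of this gap, with test points chosen arbitrarily (say $(x_1,y_1)=(0.5,\,0.95)$, $(x_2,y_2)=(1.5,\,1.05)$ and $\theta=\tfrac12$; these values are just one guess, not derived from the problem), already shows that it can be negative:

```python
# Convexity gap  theta*f(p1) + (1-theta)*f(p2) - f(theta*p1 + (1-theta)*p2)
# for f(x, y) = x / y**2; a negative value violates the convexity inequality.

def f(x, y):
    return x / y**2

# arbitrarily chosen test points in {x > 0, y > 0} and theta = 1/2
x1, y1 = 0.5, 0.95
x2, y2 = 1.5, 1.05
theta = 0.5

xm = theta * x1 + (1 - theta) * x2
ym = theta * y1 + (1 - theta) * y2

gap = theta * f(x1, y1) + (1 - theta) * f(x2, y2) - f(xm, ym)
print(gap)   # approx -0.0427 < 0, so the convexity inequality fails at these points
```

These two points lie on a nearly horizontal segment through $(1,1)$, which connects to the line-restriction idea used in the answer below.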

Questions

How can one prove the non-convexity of $f(x,y)$ from the definition of convexity, or by some approach other than the second-order condition?


Motivation of the problem

I'm trying to prove the convexity of a parametric constrained optimization problem, which is a semidefinite program for a given parameter. I managed to reformulate my constraints into linear matrix inequalities, which are convex. However, my objective function becomes nonlinear, as follows:
$$ \frac{\lambda_{max}(P)}{y^2},$$
where $P$ and $y$ are decision variables and $\lambda_{max}(\cdot)$ denotes the largest eigenvalue of a matrix. The constraints include $P$ being a positive definite matrix and $y$ being positive. If this function is (locally) convex, we can conclude, by the paper from Fiacco and Kyparisis, that the optimal value function of the parametric optimization problem is convex in its parameter.

Before tackling the exact problem, I am attempting to solve the similar but less complex problem asked above, in order to find possible approaches for my exact problem.

Best Answer

Let $U\subset \Bbb R^d$ be open and let $f\colon U\to\Bbb R$.

Let's analyse the notion of convexity. Observe that the condition $$f(\theta x_1 + (1-\theta)x_2) \leq \theta f(x_1) + (1-\theta) f(x_2)$$ is one-dimensional in its nature: for any segment $I=[x_1,x_2]\subset U$, the restriction $f|_I$ has to be convex.

In other words, for any $a\in \Bbb R^d$ (we may as well take $a$ only from $U$) and $v\in \Bbb R^d$, we define the function $f_{a,v}(t)=f(a+tv)$ for those $t\in\Bbb R$ for which $a+tv\in U$. Now we can say that $f$ is convex iff $f_{a,v}$ is convex for every $a\in U$ and $v\in \Bbb R^d$.

Now we are ready to analyse the situation from the question.

In order to show that a function $f$ isn't convex, it suffices to find some line or segment lying in the domain such that $f$ restricted to that line or segment isn't convex. Given a point $p=(a,b)\in U:=\{(x,y): x>0,\ y>0\}$ and $v=(\cos\alpha,\sin\alpha)$, we can consider $$f_{p,v}(t)=f(a+tv_1,b+tv_2)=\frac{a+t\cos\alpha}{(b+t\sin\alpha)^2}.$$

Now it's easier to analyse $f$ by analysing $f_{p,v}$ near $0$: we can draw its graph or calculate derivatives. I checked the graph of $f_{p,v}$ for $p=(1,1)$ and saw that, for small positive $\alpha$, the function $f_{p,v}$ is concave near $0$. I didn't try to prove it analytically.
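A small numerical sketch supports that observation (using, for illustration, $\alpha=0.1$ and step $h=0.5$, which are my own arbitrary choices, not values from the answer): the value of $f_{p,v}$ at $t=0$ exceeds the average of its values at $t=\pm h$, so midpoint convexity fails along this line.

```python
import math

def f(x, y):
    return x / y**2

# Restriction of f to the line through p = (1, 1) with direction v = (cos a, sin a)
a = 0.1                      # small positive angle, chosen only for illustration
p = (1.0, 1.0)
v = (math.cos(a), math.sin(a))

def f_line(t):
    return f(p[0] + t * v[0], p[1] + t * v[1])

h = 0.5
mid = f_line(0.0)                      # value at the midpoint t = 0
avg = 0.5 * (f_line(-h) + f_line(h))   # average of the values at t = -h and t = h
print(mid, avg)                        # approx 1.0 > 0.958: midpoint convexity fails,
                                       # so f_{p,v} is not convex near t = 0
```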