The inequality is true.
Lemma. Let $\lambda_1 \leq \ldots \leq \lambda_N$ be a non-decreasing sequence of real numbers. For a real number $k \leq N$, consider the set $\mathcal{A}_k := \{ (x_1,\ldots,x_N) \in [0,1]^N : \sum x_i = k\}$ of $N$-tuples inside the unit $N$-cube whose total sum is $k$. Then
$$ \inf_{x\in \mathcal{A}_k} \sum_{i = 1}^N \lambda_i x_i = \sum_{i = 1}^{\lfloor k \rfloor} \lambda_i + (k-\lfloor k\rfloor) \lambda_{\lceil k\rceil} $$
In other words, the lemma says: if you have $N$ different materials costing $\lambda_1, \ldots, \lambda_N$ per kilogram, and you need to buy $k$ kilograms but are not allowed to buy more than 1 kilogram of each type, then the cheapest way to satisfy the requirement is to first buy 1 kilo of the cheapest material, then another kilo of the second cheapest, and so on until the order is filled. The proof of the Lemma I'll leave as an easy exercise.
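Though the proof is left as an exercise, the Lemma is easy to sanity-check numerically. The sketch below is my own illustration (not part of the argument): since the objective is linear over the polytope $\mathcal{A}_k$, `scipy.optimize.linprog` computes the exact infimum, which we compare against the greedy closed form.

```python
import numpy as np
from scipy.optimize import linprog

def greedy_value(lams, k):
    """Closed form from the Lemma (lams sorted ascending)."""
    m = int(np.floor(k))
    val = lams[:m].sum()               # a full kilo of each of the m cheapest types
    if k > m:
        val += (k - m) * lams[m]       # a fractional kilo of the next cheapest
    return val

rng = np.random.default_rng(0)
for _ in range(100):
    N = int(rng.integers(2, 8))
    lams = np.sort(rng.normal(size=N))     # lambda_1 <= ... <= lambda_N
    k = float(rng.uniform(0, N))
    # The objective is linear over the polytope A_k, so the LP solver
    # returns the exact infimum.
    res = linprog(c=lams, A_eq=np.ones((1, N)), b_eq=[k], bounds=[(0, 1)] * N)
    assert np.isclose(res.fun, greedy_value(lams, k))
print("Lemma confirmed on 100 random instances")
```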
Now we can apply Lubos' observation: if $X_1, X_2, \ldots, X_M$ are $M$ orthonormal vectors in $\mathbb{R}^N$ (with $M < N$), then
$$ P = X_1X_1^T + X_2X_2^T + \cdots + X_MX_M^T $$
is a projection operator onto some $M$-dimensional subspace. In particular, being a projection, it cannot increase the length of any vector! This implies that for each unit vector $v$
$$ v^TPv \leq 1$$
and in particular, taking $v$ to be the standard unit vector $e_i$, this says that the coordinate values satisfy
$$ \sum_{j = 1}^M (e_i^T X_j)^2 \leq 1$$
On the other hand, for each fixed vector $X_j$ we also have the normalization condition
$$ \sum_{i = 1}^N (e_i^T X_j)^2 = 1 $$
So in particular if we define the $N$-tuple
$$ Y_i = \sum_{j=1}^M (e_i^TX_j)^2 $$
we have that $\sum_{i=1}^N Y_i = M$, and each individual component $Y_i\leq 1$. So in particular $(Y_1,\ldots,Y_N) \in \mathcal{A}_M$.
Your inequality then follows from the above observation combined with the Lemma.
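If you want to see these facts in action, here is a small numerical illustration (my own sketch; the orthonormal vectors come from a QR factorization, which is just one convenient construction):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 7, 4
# Build M orthonormal vectors in R^N: the Q factor of a QR factorization
# of a random matrix has orthonormal columns.
Q, _ = np.linalg.qr(rng.normal(size=(N, M)))
P = Q @ Q.T                      # P = X_1 X_1^T + ... + X_M X_M^T

Y = np.diag(P)                   # Y_i = sum_j (e_i^T X_j)^2 = e_i^T P e_i
assert np.all(Y <= 1 + 1e-12)    # projections never increase length: Y_i <= 1
assert np.isclose(Y.sum(), M)    # normalization of each X_j: the Y_i sum to M

# With sorted weights, the Lemma gives the greedy lower bound
# (k = M is an integer here, so there is no fractional term).
lams = np.sort(rng.normal(size=N))
assert lams @ Y >= lams[:M].sum() - 1e-9
print("projection facts and the greedy bound verified")
```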
Note that this also shows why in the case of $N = M$ the inequality must be saturated. The above process converts the problem to the case of the Lemma where you have $N$ types of material and need to buy $N$ kilograms with no more than 1 kilo of each. So there's only one way of filling the purchase order!
This question has been bugging me since, as far as I can see, the diagram illustrates
that if we define $\gamma$ by
$$\sum_{i=1}^n \lambda_i a_i = \gamma {a_1}+(1-\gamma){a_n},$$
then
$${1\over \sum_{i=1}^n \lambda_i a_i}\leq \gamma {1\over a_1}+(1-\gamma){1\over a_n},$$
which is true (by convexity of $t\mapsto 1/t$), but not at all useful.
We want to show that
$$\beta=\left(\sum_{i=1}^n \lambda_i a_i\right)\left(\sum_{i=1}^n \lambda_i/a_i\right)$$
is maximized when all the weight is on $a_1$ and $a_n$. We can prove this directly by
induction, finding a larger value of $\beta$ that uses fewer points.
If $n>2$, choose any $1<j<n$. Fix everything except $a_j$: all the weights $\lambda_i$ and
all the other points $a_1, \dots, a_{j-1},a_{j+1}, \dots, a_n$.
Let $x$ be a variable that can range freely between $a_{j-1}$ and $a_{j+1}$.
Define $$\beta(x)=\left(\sum_{i\neq j} \lambda_i a_i+\lambda_j x\right)\left(\sum_{i\neq j} \lambda_i/a_i+\lambda_j/x\right).$$
Writing $A=\sum_{i\neq j} \lambda_i a_i$ and $B=\sum_{i\neq j} \lambda_i/a_i$, the product expands to $\beta(x)=AB+\lambda_j^2+A\lambda_j/x+B\lambda_j x$, so its second derivative is
$$\beta^{\prime\prime}(x)=\left(\sum_{i\neq j} \lambda_i a_i\right) {2\lambda_j\over x^3}\geq 0.$$
This means that $\beta(x)$ is convex, and so its maximum occurs at one of the endpoints.
In other words, $\beta\leq\max\{\beta(a_{j-1}),\beta(a_{j+1})\}$, so we can increase $\beta$
by moving $a_j$ to either $a_{j-1}$ or $a_{j+1}$, reducing the number of distinct points by one.
This completes the induction step, and gives the desired result.
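A quick numerical check of the endpoint argument (my own sketch; the points and weights are drawn at random, and $j$ is an arbitrary interior index):

```python
import numpy as np

rng = np.random.default_rng(2)
for _ in range(100):
    n = 5
    a = np.sort(rng.uniform(0.1, 10.0, size=n))   # a_1 <= ... <= a_n
    lam = rng.dirichlet(np.ones(n))               # positive weights summing to 1
    j = 2                                         # an interior index (0-based)

    def beta(x):
        aa = a.copy()
        aa[j] = x
        return (lam @ aa) * (lam @ (1.0 / aa))

    # beta is convex on [a_{j-1}, a_{j+1}], so its maximum over the
    # interval is attained at an endpoint.
    xs = np.linspace(a[j - 1], a[j + 1], 201)
    vals = np.array([beta(x) for x in xs])
    assert vals.max() <= max(beta(a[j - 1]), beta(a[j + 1])) + 1e-9
print("endpoint maximization verified on 100 random instances")
```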
I expect that there is a cleverer, more direct proof but I couldn't find it.
Best Answer
Just one remark:
We have, with the OP's constraints and $\lambda_n\geq\cdots\geq\lambda_1$:
$$\left(\sum\limits_{j=1}^n \lambda_j x_j \right)\left(\sum\limits_{j=1}^n \lambda_j x_j^{-1} \right)\leq \left(\sum\limits_{j=1}^n \frac{1}{n} x_j \right)\left(\sum\limits_{j=1}^n \frac{1}{n} x_j^{-1} \right)\leq \dfrac{(x_1+x_n)^2}{4x_1x_n}.$$
Can you finish from here?
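The first inequality depends on the OP's precise constraints, so the sketch below (my own) only spot-checks the right-hand bound, which is the classical Schweitzer/Kantorovich inequality for uniform weights:

```python
import numpy as np

rng = np.random.default_rng(3)
for _ in range(1000):
    n = int(rng.integers(2, 10))
    x = rng.uniform(0.1, 10.0, size=n)
    x1, xn = x.max(), x.min()                    # largest and smallest point
    mid = x.mean() * (1.0 / x).mean()            # uniform-weight product
    bound = (x1 + xn) ** 2 / (4 * x1 * xn)       # (x_1 + x_n)^2 / (4 x_1 x_n)
    assert mid <= bound + 1e-9
print("Schweitzer/Kantorovich bound verified on 1000 random samples")
```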
A sketch for the left-hand inequality:
We start by proving the case $n=2$, which has the form
$$(\lambda_1a+(1-\lambda_1)b)(\lambda_2a+(1-\lambda_2)b)$$
where $a+b=1$. It remains to take the logarithm and use Jensen's inequality.
Now the case $n=3$: with $x_1\geq x_2\geq x_3>0$, $\lambda_1\geq\lambda_2\geq\lambda_3>0$, and $\lambda_1+\lambda_2+\lambda_3=1$, we have
$$\frac{4}{9}\frac{(\lambda_1x_1+(\lambda_2+\lambda_3)(x_2+x_3))(\frac{\lambda_1}{x_1}+\frac{\lambda_2+\lambda_3}{x_2+x_3})}{(x_1+x_2+x_3)(\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3})}\geq \frac{(\lambda_1x_1+\lambda_2x_2+\lambda_3x_3)(\frac{\lambda_1}{x_1}+\frac{\lambda_2}{x_2}+\frac{\lambda_3}{x_3})}{(x_1+x_2+x_3)(\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3})} $$
and:
$$\frac{(\lambda_1x_1+\frac{(1-\lambda_1)}{2}(x_2+x_3))(\frac{\lambda_1}{x_1}+\frac{(1-\lambda_1)}{2}\left(\frac{1}{x_2}+\frac{1}{x_3}\right))}{(x_1+x_2+x_3)(\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3})}\geq \frac{(\lambda_1x_1+\lambda_2x_2+\lambda_3x_3)(\frac{\lambda_1}{x_1}+\frac{\lambda_2}{x_2}+\frac{\lambda_3}{x_3})}{(x_1+x_2+x_3)(\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3})} $$
This method works in the general case.
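Here is a quick numerical check of the second displayed inequality, the averaging step (my own sketch; the common factor $(x_1+x_2+x_3)(\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3})$ cancels, so it suffices to compare numerators):

```python
import numpy as np

rng = np.random.default_rng(4)
for _ in range(1000):
    x = np.sort(rng.uniform(0.1, 10.0, size=3))[::-1]   # x_1 >= x_2 >= x_3 > 0
    lam = np.sort(rng.dirichlet(np.ones(3)))[::-1]      # lam_1 >= lam_2 >= lam_3 > 0
    mu = (1.0 - lam[0]) / 2.0                           # average of lam_2 and lam_3
    rhs = (lam @ x) * (lam @ (1.0 / x))
    lhs = (lam[0] * x[0] + mu * (x[1] + x[2])) * \
          (lam[0] / x[0] + mu * (1.0 / x[1] + 1.0 / x[2]))
    assert lhs >= rhs - 1e-9
print("averaging step verified on 1000 random triples")
```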