No, you cannot say that, since $\sqrt{x_2}^5$ is always non-negative. But you can say that $f(x_1,{x_1}^2)={x_1}^5$, which can take any real value.
EDIT
In the paper, they look for a set of probabilities $p_k\in (0, 1)$, $k = 1, 2, \cdots, s$, and $p_{s+1} = 1 - 2n \sum_{k=1}^s p_k \in [0, 1]$
such that condition (5) is satisfied, i.e., $p_{s+1}\left(A - B\sum_{k=1}^s \frac{1}{p_k}\right) > 1$.
(I put some images at the end.)
Condition (5) requires $A - B \sum_{k=1}^s \frac{1}{p_k} > 0$ and $p_{s+1} = 1 - 2n\sum_{k=1}^s p_k > 0$.
Multiplying the resulting bounds $\frac{A}{B} > \sum_{k=1}^s \frac{1}{p_k}$ and $\frac{1}{2n} > \sum_{k=1}^s p_k$,
and applying the Cauchy-Bunyakovsky-Schwarz inequality, gives
$\frac{A}{B} \cdot \frac{1}{2n}
> \sum_{k=1}^s \frac{1}{p_k} \cdot \sum_{k=1}^s p_k \ge s^2$,
i.e. $A - 2ns^2 B > 0$.
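A quick numerical sanity check of the key step $\sum_{k=1}^s \frac{1}{p_k} \cdot \sum_{k=1}^s p_k \ge s^2$ (a Python sketch; $s$ and the sampling range are my arbitrary choices, not from the paper):

```python
import random

# Sketch: sample positive p_k and verify sum(1/p_k) * sum(p_k) >= s^2,
# the Cauchy-Bunyakovsky-Schwarz (equivalently AM-HM) bound used above.
# s and the sampling range are illustrative, not from the paper.
random.seed(0)
s = 4
for _ in range(1000):
    p = [random.uniform(1e-3, 1.0) for _ in range(s)]
    lhs = sum(1.0 / pk for pk in p) * sum(p)
    assert lhs >= s * s - 1e-9

# Equality holds when all p_k are equal:
p_eq = [0.2] * s
assert abs(sum(1.0 / pk for pk in p_eq) * sum(p_eq) - s * s) < 1e-9
```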
As a result, in that paper, they solve the optimization problem
under the condition $A - 2ns^2 B > 0$ and $p_k\in (0, 1), k=1, 2, \cdots, s$ and $p_{s+1}\in (0, 1)$.
Under the condition $A - 2ns^2 B > 0$, the choice
$p_1 = p_2 = \cdots = p_s = \sqrt{\frac{B}{2nA}}$
satisfies $1 - 2n \sum_{k=1}^s p_k > 0$ and hence yields the solution.
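As a feasibility check (a Python sketch; the values of $A$, $B$, $n$, $s$ are illustrative choices satisfying $A - 2ns^2 B > 0$, not taken from the paper):

```python
import math

# Sketch: check that p_1 = ... = p_s = sqrt(B/(2nA)) is feasible when A - 2ns^2 B > 0.
# A, B, n, s are illustrative values, not from the paper.
A, B, n, s = 100.0, 0.5, 2, 3      # A - 2*n*s^2*B = 100 - 18 = 82 > 0
p = math.sqrt(B / (2 * n * A))     # common value of p_1, ..., p_s
p_last = 1 - 2 * n * s * p         # p_{s+1}
assert 0 < p < 1
assert p_last > 0                  # p_{s+1} > 0, as claimed
assert A - B * s / p > 0           # first factor of condition (5) is positive
```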
In the OP, $p_k$ is replaced with $x_i$, $s$ with $n$, and $n$ with $k$. (I think the notation of the paper should be used.)
Using the notation of the OP, assuming that $A - 2kn^2 B > 0$, we can solve the optimization problem as follows.
With $x_i > 0, \forall i$ and $1-2k \sum_{i=1}^n x_i \ge 0$, we have
\begin{align}
\Big(A- B\sum_{i=1}^n \frac{1}{x_i}\Big)x_{n+1}
&= \Big(A- B\sum_{i=1}^n \frac{1}{x_i}\Big)\Big(1-2k \sum_{i=1}^n x_i\Big)\\
&\le \Big(A- B\frac{n^2}{\sum_{i=1}^n x_i}\Big)\Big(1-2k \sum_{i=1}^n x_i\Big) \tag{1}\\
&= A - B\frac{n^2}{\sum_{i=1}^n x_i}
- 2k A \sum_{i=1}^n x_i + 2kn^2B\\
&\le A - 2\sqrt{B\frac{n^2}{\sum_{i=1}^n x_i}\cdot 2k A \sum_{i=1}^n x_i} + 2kn^2 B \tag{2}\\
&= A - 2\sqrt{2kn^2 AB} + 2kn^2 B\\
&= (\sqrt{A} - n\sqrt{2kB})^2
\end{align}
with equality if and only if $x_1 = x_2 = \cdots = x_n = \sqrt{\frac{B}{2kA}}$,
and $x_{n+1} = 1 - 2kn\sqrt{\frac{B}{2kA}}$ (note: $1 - 2kn\sqrt{\frac{B}{2kA}} > 0$ since $A - 2kn^2 B > 0$).
Explanation: in (1), we have used the Cauchy-Bunyakovsky-Schwarz inequality
to obtain $\sum_{i=1}^n \frac{1}{x_i} \ge \frac{n^2}{\sum_{i=1}^n x_i}$
with equality if and only if $x_1 = x_2 = \cdots = x_n$;
in (2), we have used $a + b \ge 2\sqrt{ab}$ for $a, b \ge 0$
with equality if and only if
$B\frac{n^2}{\sum_{i=1}^n x_i} = 2k A \sum_{i=1}^n x_i$
or $\sum_{i=1}^n x_i = \sqrt{\frac{Bn^2}{2kA}}$.
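The whole chain can be verified numerically (a Python sketch in the OP's notation; $A$, $B$, $k$, $n$ are illustrative values with $A - 2kn^2 B > 0$, not from the paper):

```python
import math
import random

# Sketch: check that (A - B*sum(1/x_i)) * (1 - 2k*sum(x_i)) never exceeds
# (sqrt(A) - n*sqrt(2kB))^2 on random feasible points, and that the bound
# is attained at x_i = sqrt(B/(2kA)). A, B, k, n are illustrative values.
A, B, k, n = 100.0, 0.5, 2, 3      # A - 2*k*n^2*B = 100 - 18 = 82 > 0
bound = (math.sqrt(A) - n * math.sqrt(2 * k * B)) ** 2

def objective(x):
    return (A - B * sum(1.0 / xi for xi in x)) * (1 - 2 * k * sum(x))

random.seed(0)
for _ in range(2000):
    # x_i < 1/(2kn) keeps 1 - 2k*sum(x_i) > 0, i.e. the point feasible
    x = [random.uniform(1e-3, 1.0 / (2 * k * n)) for _ in range(n)]
    assert objective(x) <= bound + 1e-9

x_star = [math.sqrt(B / (2 * k * A))] * n   # the claimed maximizer
assert abs(objective(x_star) - bound) < 1e-9
```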
Some images from the paper: [images not reproduced here]
Best Answer
$$x_1x_2...x_m\neq\sqrt{x_1^2x_2^2...x_m^2}$$
The right way is the following:
By AM-GM (using the constraint $x_1^2+x_2^2+...+x_m^2=1$), $$x_1x_2...x_m\leq\sqrt{x_1^2x_2^2...x_m^2}\leq\sqrt{\left(\frac{x_1^2+x_2^2+...+x_m^2}{m}\right)^m}=\frac{1}{m^{\frac{m}{2}}}.$$ Equality occurs for $x_1=x_2=...=x_m=\frac{1}{\sqrt{m}},$ which shows that this bound is indeed the maximal value.
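A numerical check of this bound (a Python sketch, assuming the constraint $x_1^2+x_2^2+...+x_m^2=1$ used implicitly above; $m$ is an arbitrary choice):

```python
import math
import random

# Sketch: under x_1^2 + ... + x_m^2 = 1 (the implicit constraint above),
# the product x_1*...*x_m is at most m^(-m/2), attained at x_i = 1/sqrt(m).
m = 4
max_val = m ** (-m / 2)             # = 1/16 for m = 4

def product(x):
    out = 1.0
    for xi in x:
        out *= xi
    return out

random.seed(0)
for _ in range(2000):
    x = [random.random() for _ in range(m)]
    norm = math.sqrt(sum(xi * xi for xi in x))
    x = [xi / norm for xi in x]     # enforce sum of squares = 1
    assert product(x) <= max_val + 1e-12

x_star = [1.0 / math.sqrt(m)] * m   # the claimed maximizer
assert abs(product(x_star) - max_val) < 1e-12
```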