There is a lower bound on the minimum eigenvalue of a symmetric positive definite matrix, given in [Applied Math. Sc., vol. 4, no. 64], which is based on the Frobenius norm (F) and the Euclidean norm (E):
$$ \lambda_{\min} > \sqrt{\frac{\|A\|_F^2 - n\|A\|_E^2}{n\left(1 - \|A\|_E^2/|\det(A)|^{2/n}\right)}} $$
if it helps.
[reference]: K. H. Schindler, "A New Lower Bound for the Minimal Singular Value for Real Non-Singular Matrices by a Matrix Norm and Determinant", Journal of Applied Mathematical Sciences, Vol. 4, No. 64, 2010.
Let us denote by $r$ the Perron-root of the positive matrix $A \in \mathbb{R}^{n \times n}$. Then by the Collatz-Wielandt formula we have:
$$\max_{x \in S}\min_{\substack{i=1, \ldots,n}} \frac{(Ax)_i}{x_i} = r = \min_{x \in S}\max_{\substack{i=1, \ldots,n}} \frac{(Ax)_i}{x_i}, $$
where $S := \{x \in \mathbb{R}^n\setminus\{0\}: x_i > 0, \forall i=1,\ldots,n\}$. Now it is clear that $A$ and $A^T$ have the same eigenvalues, since $\det(M)=\det(M^T)$ and hence for every $\lambda \in \mathbb{R}$ we have $$\det(A-\lambda I)=\det((A-\lambda I)^T)= \det(A^T-\lambda I).$$
Furthermore, $A$ strictly positive implies that $A^T$ is strictly positive, so the formula also holds for $A^T$. It follows that we have
$$\max_{x \in S}\min_{\substack{i=1, \ldots,n}} \frac{(A^Tx)_i}{x_i} = r = \min_{x \in S}\max_{\substack{i=1, \ldots,n}} \frac{(A^Tx)_i}{x_i}.$$
This clearly implies that for every $y \in S$ we have
$$\min_{\substack{i=1, \ldots,n}} \frac{(A^Ty)_i}{y_i} \leq r \leq \max_{\substack{i=1, \ldots,n}} \frac{(A^Ty)_i}{y_i}.$$
Choose $y = (1,1,\ldots,1)$ to get your bounds: for this choice, $(A^Ty)_i$ is the $i$-th column sum of $A$. Note also that applying the same argument to $A$ directly gives the analogous upper/lower bounds with row sums in place of column sums.
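As a quick numerical sanity check (not part of the original answer), the choice $y=(1,\ldots,1)$ can be verified in a short script: for a strictly positive matrix, the minimum and maximum column sums bracket the Perron root $r$. The example matrix is made up, and $r$ is approximated here by plain power iteration.

```python
# Sanity check of the Collatz-Wielandt consequence above: for a strictly
# positive matrix A, min_i (i-th column sum) <= r <= max_i (i-th column sum),
# where r is the Perron root. We estimate r by power iteration.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def perron_root(A, iters=200):
    # Power iteration: for a positive matrix, the dominant eigenvalue is the
    # Perron root, and the iteration converges from any positive start vector.
    x = [1.0] * len(A)
    r = 0.0
    for _ in range(iters):
        y = matvec(A, x)
        r = max(y)             # eigenvalue estimate, since max(x) is kept at 1
        x = [yi / r for yi in y]
    return r

# A made-up strictly positive example matrix.
A = [[4.0, 1.0, 2.0],
     [1.0, 3.0, 1.0],
     [2.0, 2.0, 5.0]]

n = len(A)
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]  # [7, 6, 8]
r = perron_root(A)
print(min(col_sums), r, max(col_sums))  # min col sum <= r <= max col sum
```

The same check with row sums of $A$ (here $7, 5, 9$) illustrates the remark about applying the trick to $A$ directly.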
For reference, I recommend (in increasing order of technicality/generality):
- "Matrix analysis" by Horn and Johnson
- "Nonnegative Matrices in the Mathematical Sciences" by Berman and Plemmons
- "Nonlinear Perron-Frobenius Theory" by Lemmens and Nussbaum
There are even more general versions discussed in the recent literature, e.g. in this paper.
Best Answer
The Frobenius norm is the same as the Euclidean norm: the square of either is the sum of the squares of the matrix entries. Therefore the bound as you stated it is wrong.
I think the bound you have encountered has the square of the maximum eigenvalue in the numerator instead of the Euclidean norm; see Schindler's publication: http://library.utia.cas.cz/separaty/2009/AS/schindler-tikhonov%20regularization%20parameter%20in%20reproducing%20kernel%20hilbert%20spaces%20with%20respect%20to%20the%20sensitivity%20of%20the%20solution.pdf
But this bound is also wrong and should be corrected by the author: if you look into the proof, you will notice mistakes already in the preliminaries.
If you are looking for a lower bound in terms of the trace and determinant, there are some publications available that you can trust. I can recommend this one, for instance: https://www.researchgate.net/publication/242985986_Bounds_for_eigenvalues_using_the_trace_and_determinant
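For illustration, one well-known bound in this spirit is the Wolkowicz-Styan trace bound: for a symmetric $n \times n$ matrix, $\lambda_{\min} \geq m - s\sqrt{n-1}$, where $m = \operatorname{tr}(A)/n$ and $s^2 = \operatorname{tr}(A^2)/n - m^2$. The sketch below checks it on a made-up symmetric matrix with known eigenvalues; the helper functions and the test matrix are not from the cited paper.

```python
import math

# Wolkowicz-Styan lower bound:  lambda_min >= m - s * sqrt(n - 1),
# with m = tr(A)/n and s^2 = tr(A^2)/n - m^2, for symmetric A.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

# Build a symmetric test matrix A = Q D Q with known eigenvalues 3, 4, 5,
# where Q = I - (2/3) * ones(3, 3) is a symmetric orthogonal (Householder)
# matrix, so A is similar to D.
n = 3
Q = [[(1.0 if i == j else 0.0) - 2.0 / 3.0 for j in range(n)] for i in range(n)]
D = [[0.0] * n for _ in range(n)]
for i, lam in enumerate([3.0, 4.0, 5.0]):
    D[i][i] = lam
A = matmul(matmul(Q, D), Q)

m = trace(A) / n                                  # = 4
s = math.sqrt(trace(matmul(A, A)) / n - m * m)    # = sqrt(2/3)
lower = m - s * math.sqrt(n - 1)
print(lower)  # a valid lower bound on the smallest eigenvalue, which is 3
```

Here the bound evaluates to about $2.85$, below the true $\lambda_{\min} = 3$, as it must be.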