Using a bit of algebraic manipulation I get:
$$4^{\sqrt{n}} = (2^2)^{\sqrt{n}} = 2^{2 \sqrt{n}}$$
I think this makes it much easier to compare the functions. Intuitively, the numerator in the first function overpowers the denominator (exponential vs. subexponential), so for sufficiently large $n$ (in the big-O sense) we can think of it as just $2^n$. Since $2 \sqrt{n}$ is much less than $n$ for large $n$, we conclude that the first function grows faster than the second.
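Since the bases are now equal, comparing $2^n$ with $2^{2\sqrt{n}}$ reduces to comparing the exponents. A quick numeric sketch (just comparing $n$ against $2\sqrt{n}$; the crossover at $n = 4$ is an assumption you can verify algebraically, since $n > 2\sqrt{n} \iff \sqrt{n} > 2$):

```python
import math

# Compare the exponents n and 2*sqrt(n).
# Once n exceeds 4, n > 2*sqrt(n), so 2**n > 2**(2*sqrt(n)) = 4**sqrt(n).
for n in [1, 4, 16, 64, 256]:
    print(n, n > 2 * math.sqrt(n))
```

For $n > 4$ the comparison is always `True`, which matches the conclusion that $2^n$ eventually dominates $4^{\sqrt{n}}$.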
Normally when comparing functions asymptotically, the "sufficiently large $n$" can be quite large. In this case, though, it is small. If you plug your functions and $2^n$ into Wolfram Alpha:
the plot shows that $n \approx 10$ is already sufficiently large. Notice in the plot how $2^n$ and your first function are almost the same, with yours just slightly to the right. In fact, if you plot the graph from $1$ to $20$, the second function barely shows up.
If you want to talk about two functions' behavior as $x \to \infty$, there are several ways to compare them.
One might be to talk about the absolute error between them $|f(x)-g(x)|$ and ask if it tends to zero. Graphically this is easy to visualize in the kind of plots you are making: the vertical gap between the graphs of $f$ and $g$ shrinks to zero as $x \to \infty$.
This is a very strong notion of "$f$ and $g$ behave the same as $x \to \infty$," since it excludes examples such as $f(x)=x+1$ and $g(x)=x$.
There may be situations where you don't care about the absolute error but are more concerned with the relative behavior: does the ratio $\frac{f(x)}{g(x)}$ tend to $1$? This is a more relaxed notion of "$f$ and $g$ behave the same as $x \to \infty$." Note that it holds for the above example $f(x)=x+1$ and $g(x)=x$.
The intuition is that if $f$ and $g$ both get large as $x \to \infty$, we are more and more willing to tolerate larger absolute errors between them, as long as these absolute errors are small relative to the value of the functions. Explicitly, note that $|\frac{f(x)}{g(x)}-1| = \frac{|f(x)-g(x)|}{|g(x)|}$. It is fine for the absolute error $|f(x)-g(x)|$ not to shrink to zero, as long as it is small relative to $|g(x)|$. Note also that we have the implication $|f(x)-g(x)| \to 0 \implies |\frac{f(x)}{g(x)} - 1| \to 0$ (provided $|g(x)|$ is eventually bounded away from zero).
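The distinction is easy to see numerically with the running example $f(x)=x+1$, $g(x)=x$: the absolute error is stuck at $1$, while the relative error $|\frac{f(x)}{g(x)}-1|$ shrinks to zero.

```python
# f(x) = x + 1 and g(x) = x: absolute error is constant,
# but relative error tends to zero as x grows.
f = lambda x: x + 1
g = lambda x: x

for x in [10, 100, 10_000, 1_000_000]:
    abs_err = abs(f(x) - g(x))          # always 1
    rel_err = abs(f(x) / g(x) - 1)      # equals 1/x, shrinking to 0
    print(x, abs_err, rel_err)
```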
Unfortunately, I don't think it is easy to determine this from plots of $f$ and $g$. For instance, my example of $f(x)=x+1$ and $g(x)=x$ gives two parallel lines, which you might say become "almost identical" if you zoom out far enough; but you can also come up with examples where a plot (which necessarily cannot take $x \to \infty$) seems to show that $f$ and $g$ are "nearly identical" even though they don't actually satisfy $\frac{f}{g} \to 1$.
As a practical example of why relative error can be useful, suppose $f$ and $g$ are the number of minutes two algorithms need to process $x$ inputs. If $f(x)=x+1$ and $g(x)=x$, both require roughly one minute per input, but the first algorithm needs an extra minute, perhaps for set-up time. The extra minute that the first algorithm requires is not really a big deal when $x$ is large: for example, if there are a million inputs, we are not really concerned about the difference between $10^6$ minutes and $10^6 + 1$ minutes.
Big-O notation is an even further relaxation of this notion.
Best Answer
I'd rather write $$2^{\sqrt{\log_2n}}/\sqrt n=2^{\sqrt{\log_2n}-(1/2)\log_2n}$$ and then note $$\sqrt{\log_2n}-(1/2)\log_2n\to-\infty$$ as $n\to\infty$ so the limit is, indeed, zero. This doesn't prove $2^{\sqrt{\log_2n}}\lt\sqrt n$ for all $n$, but certainly implies that inequality for all sufficiently large $n$.
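A quick numeric check of this answer's exponent, evaluating $\sqrt{\log_2 n} - \frac{1}{2}\log_2 n$ at a few powers of $2$ (chosen here purely for convenience, since $\log_2$ is then exact):

```python
import math

# The exponent sqrt(log2 n) - (1/2) log2 n heads to -infinity,
# so 2**(sqrt(log2 n)) / sqrt(n) = 2**exponent tends to 0.
for n in [2**4, 2**16, 2**64, 2**256]:
    L = math.log2(n)
    exponent = math.sqrt(L) - L / 2
    print(f"log2(n) = {L:>5.0f}   exponent = {exponent:>7.1f}   ratio = {2.0**exponent:.3e}")
```

At $n = 2^4$ the exponent is exactly $0$ (ratio $1$), and it drops rapidly after that, consistent with the limit being zero but the inequality holding only for sufficiently large $n$.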