[Math] Limit as N goes to Infinity

calculus, limits, sequences-and-series

Consider this limit: $$\lim_{n\rightarrow\infty} \left( 1+\frac{1}{n} \right) ^{n^2} = x$$

I thought the way to solve this for $x$ was to reduce it using the fact that as $n \rightarrow \infty$, $\frac{1}{n} \rightarrow 0$:
$$\therefore \lim_{n\rightarrow\infty}(1+0)^{n^2} = x$$

Apparently this is wrong! Why is it wrong?

Best Answer

But as $n\rightarrow \infty$ the exponent grows without bound. The base gets closer to $1$, yes, but it is raised to an ever larger power, so you can't assume the whole expression tends to $1$. Try a few examples on your calculator to see what happens.
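A quick numerical check of that suggestion (a minimal sketch; the sample values of $n$ are my own choice, and note that rewriting $\left(1+\frac{1}{n}\right)^{n^2} = \left(\left(1+\frac{1}{n}\right)^n\right)^n$ suggests growth roughly like $e^n$):

```python
# Evaluate (1 + 1/n)^(n^2) for a few values of n and watch it blow up
# rather than tend to 1, even though the base 1 + 1/n approaches 1.
for n in [2, 5, 10, 50]:
    value = (1 + 1 / n) ** (n * n)
    print(f"n = {n:3d}:  (1 + 1/n)^(n^2) = {value:.4g}")
```

(For much larger $n$ the result overflows an ordinary float, which is itself a hint that the limit is infinite.)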

For a little more detail: if $n$ is some big number, the first two terms of the binomial expansion of $\left(1+\frac{1}{n}\right)^{n^2}$ are

$$1 + n^2 \cdot \frac{1}{n} = 1 + n \rightarrow \infty$$

None of the remaining terms is negative, so $1+n$ is a lower bound on the whole expression, and since $1+n \rightarrow \infty$, so does the limit.
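The lower bound above is an instance of Bernoulli's inequality, $(1+x)^m \ge 1 + mx$ for $x \ge 0$, with $x = \frac{1}{n}$ and $m = n^2$. A small check of it (sample values of $n$ are my own choice):

```python
# Verify the binomial lower bound (1 + 1/n)^(n^2) >= 1 + n for several n:
# keeping only the first two terms of the expansion can only undercount,
# since every remaining term is nonnegative.
for n in [2, 5, 10, 50]:
    value = (1 + 1 / n) ** (n * n)
    assert value >= 1 + n, f"bound failed at n = {n}"
    print(f"n = {n:3d}:  value = {value:.4g}  >=  1 + n = {1 + n}")
```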