Let $f(x)=a_nx^n+a_{n-1}x^{n-1}+\cdots+a_1x+a_0$ with $n$ even. Let us assume that $a_n>0$ (the case in which $a_n<0$ is similar). Then\begin{align}\lim_{x\to\pm\infty}f(x)&=\lim_{x\to\pm\infty}a_nx^n\left(1+\frac{a_{n-1}}{a_nx}+\frac{a_{n-2}}{a_nx^2}+\cdots+\frac{a_0}{a_nx^n}\right)\\&=+\infty\times1\\&=+\infty.\end{align}Therefore, there is an $R>0$ such that $\lvert x\rvert>R\implies f(x)\geqslant f(0)$. So, consider the restriction of $f$ to $[-R,R]$. Since $f$ is continuous and $[-R,R]$ is closed and bounded, $f|_{[-R,R]}$ attains a minimum at some point $x_0\in[-R,R]$, and, of course, $f(x_0)\leqslant f(0)$. Since $f(x)\geqslant f(0)$ outside $[-R,R]$, $f$ attains its absolute minimum at $x_0$.
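As a quick numerical sanity check of this argument (the polynomial, the doubling heuristic for $R$, and the grid size below are all illustrative choices, not part of the proof):

```python
# Illustrative even-degree polynomial with positive leading coefficient.
def f(x):
    return x**4 - 3*x**2 + x + 1

# Heuristically find an R with f(+/-R) >= f(0) by doubling, then scan
# [-R, R] on a grid; the argument above says the global minimum lies there.
R = 1.0
while f(R) < f(0) or f(-R) < f(0):
    R *= 2

xs = [-R + 2 * R * i / 40000 for i in range(40001)]
x_min = min(xs, key=f)

assert f(x_min) <= f(0)   # the grid minimum is at least as small as f(0)
```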
The functions you are talking about are called bump functions and are incredibly important in the theory of distributions.
Bump functions can have any closed interval as their support; as an example, the function
$$f(x)=\begin{cases} e^{\frac{-1}{(x-a)^2(x-b)^2}} & x\in (a,b)\\
0 & \text{otherwise}\end{cases}$$
is a smooth function with compact support $[a,b]$.
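A minimal numerical sketch of this function, taking $[a,b]=[0,1]$ (the interval is chosen just for illustration):

```python
import math

a, b = 0.0, 1.0

def bump(x):
    # Zero outside (a, b); inside, the exponent -1/((x-a)^2 (x-b)^2)
    # tends to -infinity at both endpoints, so the two pieces glue smoothly.
    if x <= a or x >= b:
        return 0.0
    return math.exp(-1.0 / ((x - a)**2 * (x - b)**2))

assert bump(0.5) > 0       # positive inside the support
assert bump(0.0) == 0.0    # vanishes at the endpoints...
assert bump(-3.0) == 0.0   # ...and outside [a, b]
```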
Actually, more is true: given any compact set $K$ and any open set $U$ containing $K$, there is a bump function that equals $1$ on $K$ and vanishes outside $U$ (for the construction, see the linked Wikipedia page).
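One standard way to build such a plateau bump is to first make a smooth step from $e^{-1/t}$ and then multiply two steps together. Below is a sketch with the illustrative choices $K=[0,1]$ and $U=(-\tfrac12,\tfrac32)$ (helper names are mine, not from the linked page):

```python
import math

def phi(t):
    # smooth, 0 for t <= 0, positive for t > 0
    return math.exp(-1.0 / t) if t > 0 else 0.0

def step(t):
    # smooth transition: equals 0 for t <= 0 and 1 for t >= 1
    return phi(t) / (phi(t) + phi(1.0 - t))

def plateau(x, c=0.0, d=1.0, eps=0.5):
    # equals 1 on [c, d] and 0 outside (c - eps, d + eps)
    return step((x - (c - eps)) / eps) * step(((d + eps) - x) / eps)

assert plateau(0.5) == 1.0         # exactly 1 inside K = [0, 1]
assert plateau(-1.0) == 0.0        # 0 outside U = (-1/2, 3/2)
assert 0.0 < plateau(-0.25) < 1.0  # smooth ramp in between
```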
- Derivatives and integrals
The derivative of a bump function is still a bump function, vanishing outside the same compact set $K$.
Proof:
Let $A=\mathbb{R}\setminus K$. Then $A$ is an open set on which $f=0$. For every point $x\in A$ we have, for $h$ small enough that $x+h$ is still in $A$,
$$f'(x)=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}=\lim_{h\to 0}\frac{0}{h}=0$$
Integration is also possible, but since an antiderivative is only determined up to a constant, the integral is not guaranteed to vanish outside $K$.
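Both claims can be checked numerically for the bump on $[a,b]=[0,1]$ from above (a rough sketch: the central-difference step and the Riemann-sum resolution are arbitrary choices):

```python
import math

a, b = 0.0, 1.0

def bump(x):
    if x <= a or x >= b:
        return 0.0
    return math.exp(-1.0 / ((x - a)**2 * (x - b)**2))

def deriv(x, h=1e-6):
    # central difference approximation of bump'(x)
    return (bump(x + h) - bump(x - h)) / (2 * h)

def antiderivative(x, n=20000):
    # midpoint Riemann sum from a up to min(x, b);
    # bump vanishes outside [a, b], so nothing is lost by capping at b
    upper = min(max(x, a), b)
    h = (upper - a) / n
    return sum(bump(a + (i + 0.5) * h) for i in range(n)) * h

assert deriv(-0.5) == 0.0 and deriv(1.5) == 0.0  # f' = 0 outside [a, b]
total = antiderivative(2.0)
assert total > 0                          # NOT zero to the right of K
assert antiderivative(3.0) == total       # constant beyond b
```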
Yes. Every nonzero bump function is smooth but not analytic. This follows easily from a theorem known as the identity theorem, which states that two analytic functions defined on an open and connected set that agree on a set of points $S$ with $S'\neq\emptyset$ (i.e. $S$ has an accumulation point in the domain) agree on the whole domain.
Thus, if a nonzero bump function were analytic, it would agree with the zero function on the complement of its support, and hence would have to be zero everywhere, which is not the case.
Best Answer
That can't happen. Consider $d(x):=f(x)-g(x)$. If $d$ is identically zero, the two functions $f$ and $g$ are the same. So let's assume there is a non-zero value of it:
$$d(x_0)=y_0 \neq 0.$$
Now since $f$ and $g$ are continuous, their difference $d$ is as well. So there exists a $\delta_0 > 0$ such that $\lvert x - x_0 \rvert < \delta_0$ implies $\lvert d(x)-d(x_0) \rvert < \frac12 \lvert y_0 \rvert$, which implies $d(x) \neq 0$ (otherwise the LHS of the last inequality would be $\lvert 0 - d(x_0)\rvert = \lvert y_0 \rvert$). Because $d$ is continuous and never zero there, the intermediate value theorem shows that $d$ can't change its sign inside the interval $(x_0-\delta_0, x_0+\delta_0)$.
So $d(x)=f(x)-g(x)$ is either always positive or always negative on that interval, which is the same as saying that $f(x)$ is always greater than, or always less than, $g(x)$ on that interval.
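Here is a small numerical illustration of the argument, with the made-up choices $f=\sin$, $g=\cos$, $x_0=0$ (so $y_0 = d(0) = -1$) and $\delta_0 = 0.1$:

```python
import math

f, g = math.sin, math.cos
d = lambda x: f(x) - g(x)

x0 = 0.0
y0 = d(x0)            # -1, nonzero
delta = 0.1           # small enough for this example

xs = [x0 - delta + 2 * delta * i / 1000 for i in range(1001)]

# d stays within |y0|/2 of d(x0) on the interval, so it never hits zero:
assert all(abs(d(x) - y0) < abs(y0) / 2 for x in xs)
# ...and therefore keeps one sign: here f < g throughout the interval.
assert all(d(x) < 0 for x in xs)
```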