This is a question from Calculus by Michael Spivak. How do I start answering this question?
[Math] If $\lim f(x)$ and $\lim g(x)$ do not exist, can $\lim [f(x)+g(x)]$ exist?
calculus, limits
Related Solutions
I was writing this as a comment, but the you-are-out-of-characters alert was getting on my nerves, so here it is:

Basically I'm new to proofs, so I don't know where to start. Can you give me a hint on how to start a question regarding limits?
That's okay. First, learn the basics and keep them in mind. For example, in this question it's necessary to know that if $x \to 0$, then $bx \to 0$ when $b$ is a nonzero real constant. After that, try to rephrase the problem and use the hints. In this case you just need to understand that there's no difference between $x$ and $bx$ when $x$ goes to zero.

That is, let $g(x) = \frac{f(x)}{x}$. Then what is $g(bx)$? Assume $\lim_{x\to 0} g(x)=l$. Then it is clear that you can change $x$ into $bx$ and get $\lim_{bx\to 0} g(bx)=l$. Note that we haven't used the fact "$x\to 0$ implies $bx \to 0$" yet; we are just plugging a different value into $g$. To see this, as in my comment, let $bx = y$; then $\lim_{bx\to 0} g(bx)=l$ becomes $\lim_{y\to 0} g(y)=l$, which is the same as our first assumption ($\lim_{x\to 0} g(x)=l$).
So we have $\lim_{bx\to 0} g(bx)=l$. Now use "$x\to 0$ implies $bx \to 0$". Thus $\lim_{x\to 0} g(bx)=l$. So, using the definition of $g$, we have $\lim_{x\to 0} \frac{f(bx)}{bx}=l$, which means $\lim_{x\to 0} \frac{f(bx)}{x}=bl$.
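A quick numerical sanity check of this conclusion (my own illustrative choice, not from the question): take $f(x)=\sin x$, for which $l=\lim_{x\to 0}\frac{\sin x}{x}=1$, and some nonzero $b$. Then $\frac{f(bx)}{x}$ should approach $bl = b$:

```python
import math

# Check lim_{x->0} f(bx)/x = b*l with f = sin, where l = lim sin(x)/x = 1,
# so the quotient should approach b as x shrinks.
b = 3.0
for x in (1e-1, 1e-3, 1e-6):
    print(x, math.sin(b * x) / x)  # approaches b = 3
```

This is only a numerical illustration, of course; the argument above is what proves the general statement.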
For $b=0$, the fraction $\frac{f(bx)}{bx}$ is not defined, so the limit doesn't exist.
For part (c), set $g(x)=\sin x$.
There are plenty of concepts from real analysis that aren't in Spivak's text. There are two main concrete directions to go, and a variety of abstract ones.
First there are function spaces. The main important concept that function spaces introduce is that there are different kinds of convergence which have different properties.
One of the most important examples is as follows. Suppose $f_n$ is a sequence of continuous functions on $[a,b]$. If they converge uniformly to $f$, i.e. $\lim_{n \to \infty} \sup_{x \in [a,b]} |f_n(x)-f(x)| = 0$, then $f$ is continuous. But if they converge pointwise, i.e. for every $x$ we have $\lim_{n \to \infty} f_n(x) = f(x)$, then $f$ may not be continuous. The classic counterexample is $f_n(x) = x^n$ on $[0,1]$.
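The classic counterexample can be checked numerically. Below is a sketch (assuming NumPy) showing that $f_n(x)=x^n$ on $[0,1]$ converges pointwise to the discontinuous function $f$ ($0$ on $[0,1)$, $1$ at $1$), while $\sup_{x\in[0,1]}|f_n(x)-f(x)|$ stays near $1$ instead of tending to $0$:

```python
import numpy as np

# f_n(x) = x**n on [0,1] converges pointwise to f(x) = 0 for x < 1 and
# f(1) = 1, but not uniformly: just left of x = 1, x**n is still close
# to 1, so the sup of |f_n - f| does not go to 0.
xs = np.linspace(0.0, 1.0, 10_001)
f_limit = np.where(xs < 1.0, 0.0, 1.0)
for n in (1, 10, 100, 1000):
    sup_err = np.max(np.abs(xs**n - f_limit))
    print(n, sup_err)  # stays near 1, does not tend to 0
```

Meanwhile for any fixed $x<1$, say $x=0.5$, the values $x^n$ do go to $0$, which is exactly the pointwise statement.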
Second there is Lebesgue integration. It turns out that the Riemann integration studied in calculus has bad analytic properties. A nice property that we very often want is that if $f_n \to f$ pointwise, then $\int f_n dx \to \int f dx$. Riemann integration makes it very difficult to guarantee this property. In particular it is very difficult to guarantee that the right hand side even makes sense. Lebesgue integration defines integration in such a way that the right hand side will always make sense, and gives a number of convenient criteria to guarantee that the convergence I described above will occur. As a bonus, a huge wealth of new functions become integrable, and a framework for integration over spaces other than $\mathbb{R}^n$ appears.
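To see concretely how the interchange of limit and integral can fail, here is a sketch with my own illustrative sequence (not one from the text): $f_n(x) = n$ on $(0, 1/n)$ and $0$ elsewhere. Each $f_n$ has integral $1$, yet $f_n(x)\to 0$ for every fixed $x$, so $\lim_n \int f_n = 1 \neq 0 = \int \lim_n f_n$. Lebesgue theory supplies criteria (e.g. dominated convergence) that rule this kind of escape out.

```python
import numpy as np

# f_n(x) = n on (0, 1/n), 0 elsewhere: a tall thin spike whose area is
# always 1, but which vanishes pointwise as n grows.
def f(n, x):
    return np.where((x > 0) & (x < 1.0 / n), float(n), 0.0)

xs = np.linspace(0.0, 1.0, 1_000_001)
h = xs[1] - xs[0]
for n in (10, 100, 1000):
    print(n, np.sum(f(n, xs)) * h)  # Riemann-sum approximation, close to 1
```

The pointwise limit is the zero function (for any fixed $x>0$, eventually $1/n < x$), whose integral is $0$, so no theorem letting us swap the limit and the integral can apply here.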
The abstract directions are numerous. However, they usually begin with metric spaces. I will let someone else discuss the relevance of this topic.
Best Answer
Yes, the sum can definitely exist; just take the following two step functions:
$$ f(x) = \begin{cases} 1 & x \geq 0 \\ -1 & x < 0 \end{cases} \\ g(x) = \begin{cases} -1 & x \geq 0 \\ 1 & x < 0 \end{cases} \\ f(x) + g(x) = \begin{cases} 0 & x \geq 0 \\ 0 & x < 0 \end{cases} \rightarrow f(x) + g(x) = 0 $$
This is how you need to think. The same thing can happen with multiplication: perhaps the first function is $0$ to the left of the discontinuity and the second is $0$ to the right. Then the product is $0$ on both sides, so its limit exists even though neither factor has a limit there.
Note that the resulting function doesn't always have to equal $0$: with the step functions above, shifting both functions up by $1$ makes the sum $f(x) + g(x) = 2$ rather than $0$.