[Math] Question about limit points in relation with continuity and functional limits

continuity, limits, real-analysis, self-learning, sequences-and-series

I'm self-studying from the book Understanding Analysis by Stephen Abbott, and I have the feeling that the author is being careless about limit points in his theorems or I am not understanding something (probably the latter). For instance, for the following theorem:

(Algebraic Continuity Theorem). Assume $f: A \to \mathbb{R}$ and $g: A \to \mathbb{R}$ are continuous at a point $c \in A$. Then:

  1. $kf(x)$ is continuous at $c$ for all $k \in \mathbb{R}$;
  2. $f(x) + g(x)$ is continuous at $c$;
  3. $f(x)g(x)$ is continuous at $c$; and
  4. $f(x)/g(x)$ is continuous at $c$, provided the quotient is defined.

Proof. All of these statements can be quickly derived from Theorem$^1$ and
Theorem$^2$ (see below for the two theorems).

The only way I am able to prove the above theorem is by using part 2 of Theorem$^2$. However, part 2 only applies if the point $c \in A$ is a limit point of $A$. Since the above theorem places no such restriction on $c$, I don't see how to prove it. Any help is much appreciated.

As an extra question: as far as I understand, parts 1 and 3 of Theorem$^2$ also hold for functions continuous at $c$ when $c$ is not a limit point, right? What about part 4 of Theorem$^2$? Does $c$ need to be a limit point there, or is that unnecessary?


$^1$ (Algebraic Limit Theorem for Functional Limits). Let $f$ and $g$ be functions defined on a domain $A \subseteq \mathbb{R}$, and assume $\lim_{x\to c} f(x) = L$ and $\lim_{x \to c}g(x) = M$ for some limit point $c$ of $A$. Then:

  1. $ \lim_{x \to c} kf(x) = kL$ for all $k \in \mathbb{R}$,
  2. $\lim_{x \to c} [ f(x) + g(x)] = L + M$,
  3. $\lim_{x \to c} [f(x)g(x)] = LM$, and
  4. $\lim_{x \to c} f(x)/g(x) = L/M$, provided $M \neq 0$.

$^2$ (Characterizations of Continuity). Let $f : A \to \mathbb{R}$ and $c \in A$ be a limit point of $A$ [emphasis mine]. The function $f$ is continuous at $c$ if, and only if, any one of the following conditions is met:

  1. For all $\epsilon > 0$, there exists a $\delta > 0$ such that $|x-c| < \delta$ (and $x \in A$) implies $|f(x) - f(c)| < \epsilon$;
  2. $\lim_{x \to c} f(x) = f(c)$;
  3. For all $V_\epsilon (f(c))$, there exists a neighborhood $V_\delta (c)$ with the property that $x \in V_\delta(c)$ (and $ x \in A$) implies $f(x) \in V_\epsilon(f(c))$;
  4. If $(x_n) \to c$ (with $x_n \in A$), then $f(x_n) \to f(c)$.

For completeness, I will also write down the way the author defines functional limits and continuity:

Definition$^3$. Let $f: A \to \mathbb{R}$, and let $c$ be a limit point of the domain $A$. We say that $\lim_{x \to c} f(x) = L$ provided that, for all $\epsilon > 0$, there exists a $\delta > 0$ such that whenever $0 < |x-c| < \delta$ (and $x \in A$) it follows that $|f(x)-L| < \epsilon$.

Definition$^4$. A function $f: A \to \mathbb{R}$ is continuous at a point $c \in A$ if, for all $\epsilon > 0$, there exists a $\delta > 0$ such that whenever $|x-c| < \delta$ (and $x \in A$) it follows that $|f(x)-f(c)| < \epsilon$.

Best Answer

I will try to prove the Algebraic Continuity Theorem without using part 2 of the Characterizations of Continuity; if I succeed, I can conclude that the Algebraic Continuity Theorem does not require the point $c \in A$ to be a limit point.
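Before diving in, it is worth recording why part 2 of the characterization is unavailable here (this remark is my own, not Abbott's): by Definition$^3$, the functional limit $\lim_{x \to c} f(x)$ is only defined when $c$ is a limit point of $A$. If $c \in A$ is not a limit point, then $c$ is an isolated point of $A$, and continuity at $c$ is automatic. Indeed, choose $\delta > 0$ small enough that the only point $x \in A$ with $|x-c| < \delta$ is $c$ itself; then for every $\epsilon > 0$: \begin{equation} |x-c| < \delta \implies |f(x) - f(c)| = |f(c) - f(c)| = 0 < \epsilon \end{equation} The direct $\epsilon$-$\delta$ proofs below therefore work uniformly, whether $c$ is a limit point of $A$ or an isolated point.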

  1. In order to prove that $kf(x)$ is continuous at $c$, fix $\epsilon > 0$; we need to find $\delta > 0$ such that: \begin{equation} |x-c| < \delta \implies |kf(x) - k f(c)| = |k||f(x)-f(c)| < \epsilon \end{equation} If $k = 0$, this holds for any choice of $\delta > 0$, so assume $k \neq 0$. Since $f$ is assumed to be continuous, we can choose $\delta$ such that: \begin{equation} |x-c| < \delta \implies |f(x)-f(c)| < \frac{\epsilon}{|k|} \end{equation} and consequently: \begin{equation} |x-c| < \delta \implies |k||f(x)-f(c)| < |k| \frac{\epsilon}{|k|} = \epsilon \end{equation} as desired.
  2. We now need to prove that there is a $\delta > 0$ such that: \begin{equation} |x-c| < \delta \implies |f(x) + g(x)- (f(c)+ g(c))| < \epsilon \end{equation} Since $f$ is assumed to be continuous, we can choose a $\delta_1 > 0$ such that: \begin{equation} |x-c| < \delta_1 \implies |f(x)-f(c)| < \frac{\epsilon}{2} \end{equation} and since $g$ is assumed to be continuous, we can choose a $\delta_2 > 0$ such that: \begin{equation} |x-c| < \delta_2 \implies |g(x)-g(c)| < \frac{\epsilon}{2} \end{equation} Let us now set $\delta = \mathrm{min} \{ \delta_1, \delta_2 \}$. Then, using the triangle inequality, we can write: \begin{equation} \begin{aligned} |x-c| < \delta \implies |f(x) + g(x)- (f(c)+ g(c))|&=|(f(x)-f(c)) + (g(x) - g(c))| \\& \leq |f(x)-f(c)| + |g(x) - g(c)| \\& < \frac{\epsilon}{2}+\frac{\epsilon}{2}=\epsilon \end{aligned} \end{equation}
  3. Next, we need to prove that there exists a $\delta >0$ such that: \begin{equation} |x-c| < \delta \implies |f(x)g(x) - f(c)g(c) | < \epsilon \end{equation} Note that: \begin{equation} \begin{aligned} |f(x)g(x) - f(c)g(c)| & = |f(x)g(x) - f(c)g(x) + f(c)g(x) - f(c)g(c)| \\& \leq |f(x)g(x) - f(c)g(x)| + |f(c)g(x) - f(c)g(c)| \\& =|g(x)||f(x) - f(c)| + |f(c)||g(x) - g(c)| \end{aligned} \end{equation} We first bound $g$ near $c$: since $g$ is continuous at $c$, we can choose $\delta_0 > 0$ such that $|x-c| < \delta_0$ implies $|g(x)-g(c)| < 1$, and hence $|g(x)| < |g(c)| + 1 =: M$. (Note that continuity only gives a bound near $c$, not on all of $A$, but that is all we need.) Now, choose $\delta_1 > 0$ such that: \begin{equation} |x-c| < \delta_1 \implies |f(x)-f(c)| < \frac{1}{M} \frac{\epsilon}{2} \end{equation} Furthermore, choose $\delta_2 > 0$ such that: \begin{equation} |x-c| < \delta_2 \implies |g(x)-g(c)| < \frac{1}{|f(c)|+1} \frac{\epsilon}{2} \end{equation} where the $+1$ guards against the possibility $f(c) = 0$. Let us now set $\delta = \min \{ \delta_0, \delta_1, \delta_2 \}$. Then, for $|x-c| < \delta$: \begin{equation} \begin{aligned} |f(x)g(x) - f(c)g(c)| & \leq |g(x)||f(x) - f(c)| + |f(c)||g(x) - g(c)| \\& \leq M|f(x) - f(c)| + (|f(c)|+1)|g(x) - g(c)| \\& < M\frac{1}{M} \frac{\epsilon}{2} + (|f(c)|+1)\frac{1}{|f(c)|+1} \frac{\epsilon}{2} = \epsilon \end{aligned} \end{equation} as desired.
  4. Since $f/g = f \cdot (1/g)$, by part 3 it suffices to show that $1/g(x)$ is continuous at $c$; note that $g(c) \neq 0$, since the quotient is defined at $c$. Thus, we need to show that there exists a $\delta >0$ such that: \begin{equation} |x-c| < \delta \implies \left| \frac{1}{g(x)} - \frac{1}{g(c)} \right| < \epsilon \end{equation} Note that: \begin{equation} \left| \frac{1}{g(x)} - \frac{1}{g(c)} \right| = \frac{1}{|g(x) g(c)|} |g(x)- g(c)| \end{equation} We first bound $|g(x)|$ away from zero near $c$: since $g$ is continuous at $c$, we can choose $\delta_1 > 0$ such that $|x-c| < \delta_1$ implies $|g(x)-g(c)| < |g(c)|/2$, and hence $|g(x)| > |g(c)|/2$, so that: \begin{equation} \frac{1}{|g(x)|} < \frac{2}{|g(c)|} \end{equation} Furthermore, choose $\delta_2 > 0$ such that: \begin{equation} |x-c| < \delta_2 \implies |g(x) - g(c)| < \frac{|g(c)|^2}{2} \epsilon \end{equation} Setting $\delta = \min \{ \delta_1, \delta_2 \}$, for $|x-c| < \delta$ we obtain: \begin{equation} \begin{aligned} \left| \frac{1}{g(x)} - \frac{1}{g(c)} \right| & = \frac{1}{|g(x)||g(c)|} |g(x)- g(c)| \\& < \frac{2}{|g(c)|} \cdot \frac{1}{|g(c)|} \cdot \frac{|g(c)|^2}{2} \epsilon = \epsilon \end{aligned} \end{equation} as desired.
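As for the extra question (again my own remark, not from the text): parts 1, 3, and 4 of the Characterizations of Continuity remain equivalent to Definition$^4$ for any $c \in A$, limit point or not. Only part 2 requires $c$ to be a limit point, since otherwise $\lim_{x \to c} f(x)$ is simply undefined by Definition$^3$. In particular, part 4 does not need $c$ to be a limit point: if $c$ is isolated, pick $\delta > 0$ so that $c$ is the only point of $A$ with $|x - c| < \delta$; then any sequence $(x_n) \to c$ with $x_n \in A$ eventually satisfies $|x_n - c| < \delta$, which forces $x_n = c$, so: \begin{equation} f(x_n) = f(c) \quad \text{for all sufficiently large } n \end{equation} and hence $f(x_n) \to f(c)$ trivially.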