Prove / Disprove / Complete the proof That $f$ is Infinitely Differentiable

derivatives · proof-writing · real-analysis · smooth-functions · solution-verification


THE NEW (MUCH SHORTER) POST


My original post was perhaps too long for most people, and understandably not many tried to work through it. In this new post my question is much shorter: please prove or disprove that the following function is infinitely differentiable on $\mathbb{R}$, or alternatively answer my original post (included below), which essentially means helping me complete my proposed proof. Even if you don't want to answer the original post because it's too long, it may be worth at least skimming it, because it could be useful.

Let $E$ be some closed set of real numbers. Then, we define $f$ by 3 cases:
$\quad$(I) If $x \in E$, $f(x) = 0$.
$\quad$(II) If $x \in (a,b)$, where $a,b \in E$ and $(a,b) \subseteq \mathbb{R} \setminus E$, define $$F_{a,b}(x) = \frac{\pi}{b-a}\left(x + \frac{3b-5a}{2}\right),$$
$\quad$and then $$f(x) = X_{a,b}\cos^{F_{a,b}(x)}\left(F_{a,b}(x)\right),$$
$\quad$where $X_{a,b}$ is some real constant depending on $a, b$ (probably different for every $(a,b)$), which you need to define appropriately in order to prove that $f$ is infinitely differentiable; alternatively, to prove that $f$ is not infinitely differentiable, you need to show that this holds for every choice of $X_{a,b}$.
$\quad$(III) If $E$ is bounded above and $M=\sup E$, then $f(x)=e^{-\frac{1}{(x-M)^2}}$ for $x > M$. Similarly, $f(x)=e^{-\frac{1}{(x-N)^2}}$ for $x < N$ if $E$ is bounded below and $N=\inf E$.
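(Not part of the problem, but possibly helpful for experimenting: a minimal numerical sketch of this definition on a single complementary interval, with the placeholder choice $X_{a,b}=1$, checking that $F_{a,b}$ maps $(a,b)$ onto $\left(\frac{3}{2}\pi, \frac{5}{2}\pi\right)$ and that $f$ is positive inside and tiny near the endpoints.)

```python
import numpy as np

# Sketch only: evaluate the proposed f on one interval (a, b) of R \ E,
# with the placeholder constant X_ab = 1 (the post defines X_ab later).
a, b = 0.0, 1.0
X_ab = 1.0

def F_ab(x):
    return np.pi / (b - a) * (x + (3 * b - 5 * a) / 2)

def f(x):
    Fx = F_ab(x)
    return X_ab * np.cos(Fx) ** Fx  # cos(F)^F; cos(F) > 0 on (a, b)

print(F_ab(a) / np.pi, F_ab(b) / np.pi)   # 1.5 and 2.5
xs = np.linspace(a + 1e-3, b - 1e-3, 5)
print(f(xs))                               # positive, small near a and b
```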


MY ORIGINAL POST: Baby Rudin Ex. 5.21 – Assessment of a Proposed Partial Solution and Help Completing It


This exercise is driving me crazy.
I'm honestly ashamed to tell you how long I've actually been stuck on this problem, not continuing my mathematics study until I solve it. At many points along the way I was very close to giving up, and at many points I thought I had finally managed to solve the problem, only to realize there was some mistake or missing piece in the proof. The proposed solution here is the closest I have managed to get. At this point I decided it's time to ask for help with the last missing part of the proof, and to get some people to critique this partial proof in general.

The Problem: Ex. 21, Chap. 5

Let $E$ be a closed subset of $\mathbb{R}$. We saw in Exercise 22, Chap. 4, that there is a real continuous function $f$ on $\mathbb{R}$ whose zero set is $E$. Is it possible, for each closed set $E$, to find such an $f$ which is differentiable on $\mathbb{R}$, or one which is $n$ times differentiable, or even one which has derivatives of all orders on $\mathbb{R}$?

My Idea (Informally)

I think such an infinitely differentiable function does exist. I want to give a visual explanation of how I thought of constructing this function before moving to the formal solution. We obviously set $f$ to be 0 at all points of $E$. On every open interval of $\mathbb{R} \setminus E$ whose endpoints are points of $E$, $f$ is a "wave" that smoothly approaches 0 at the endpoints, so that the derivative of every order at those endpoints is always 0. The shorter this open interval is, the smaller the maximum of this "wave". If $E$ is bounded above, $f$ is just some increasing function after $\sup E$ that smoothly approaches 0 at $\sup E$ (and similarly if $E$ is bounded below).
I drew an illustration of this idea:

A visual illustration of the function

The red parts are the points of the graph of $f$ over $E$, the blue parts are the points of the graph over $\mathbb{R} \setminus E$. $P$ is a limit point of $E$.

My Partial Solution

My proof uses two previous results from the book and two additional lemmas. I shall present them here (and prove the lemmas) before going to the main proof.

Ex. 5.9. Let $f$ be a continuous real function on $\mathbb{R}$, of which it is known that $f'(x)$ exists for all $x \neq 0$ and that $f'(x) \to 3$ as $x \to 0$. Does it follow that $f'(0)$ exists?
In the solution to this exercise we saw that $f'(0)$ does in fact exist and equals 3. More generally, for any real continuous function $f$ on $\mathbb{R}$ whose derivative is known to exist except at a specific point $a$, with $f'(x) \to D$ as $x \to a$, the same proof shows that $f'(a)$ exists and $f'(a)=D$.
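(For completeness, the key step of that argument, sketched via the mean value theorem: for $t \neq a$ close to $a$, there is some $\xi_t$ strictly between $a$ and $t$ with $$\frac{f(t)-f(a)}{t-a} = f'(\xi_t),$$ and $\xi_t \to a$ as $t \to a$, so the difference quotient tends to $D$; hence $f'(a)$ exists and equals $D$.)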

Theorem 4.15. If $f$ is a continuous mapping of a compact metric space $X$ into $\mathbb{R}^k$, then $f(X)$ is closed and bounded. Thus, $f$ is bounded.

Lemma 1. Let $f, g: \mathbb{R} \to \mathbb{R}$ be differentiable functions with $f(x) > 0$ for all $x$. Then,
$$\left(f(x)^{g(x)}\right)' = f(x)^{g(x)}\left[g'(x)\ln(f(x)) + g(x)\frac{f'(x)}{f(x)}\right]$$
Proof. Let $h(x)=f(x)^{g(x)}$, then $$\ln(h(x)) = \ln\left(f(x)^{g(x)}\right) = g(x)\ln(f(x)).$$ Differentiating both sides we get: $$\frac{h'(x)}{h(x)} = g'(x)\ln(f(x)) + g(x)\frac{f'(x)}{f(x)}.$$ Thus, $$h'(x) = f(x)^{g(x)}\left[g'(x)\ln(f(x)) + g(x)\frac{f'(x)}{f(x)}\right].$$
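(A quick symbolic sanity check of Lemma 1, not part of the proof, using sample functions with $f > 0$ so that $f^g$ and $\ln f$ are defined.)

```python
import sympy as sp

# Spot-check Lemma 1 with sample functions (f > 0 everywhere).
x = sp.symbols('x')
f = sp.exp(x) + 1
g = sp.sin(x)

h = f ** g
lhs = sp.diff(h, x)
rhs = h * (sp.diff(g, x) * sp.log(f) + g * sp.diff(f, x) / f)
print(sp.simplify(lhs - rhs))  # prints 0
```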

Lemma 2. Let $f,g : \mathbb{R} \to \mathbb{R}$ be infinitely differentiable functions. Then, for every $n \in \mathbb{N}$:
$\quad$(a) $f+g$ is infinitely differentiable and $$(f+g)^{(n)}(x)=f^{(n)}(x)+g^{(n)}(x).$$
$\quad$(b) $f\cdot g$ is infinitely differentiable and $$(f\cdot g)^{(n)}(x)=\sum_{i=1}^{2^n} f^{\left(\alpha_{n_i}\right)}(x)\cdot g^{\left(\beta_{n_i}\right)}(x),$$ $\quad$where $\{\alpha_{n_i}\}$, $\{\beta_{n_i}\}$ are sequences of non-negative integers.
$\quad$(c) $\frac{f}{g}$ is infinitely differentiable whenever $g(x) \neq 0$ and $$\left(\frac{f}{g}\right)^{(n)}(x)=\frac{F_n(x)}{g^{2^n}(x)},$$ $\quad$where $F_n$ is some infinitely differentiable function.
$\quad$(d) $f\circ g$ is infinitely differentiable and $$(f\circ g)^{(n)}(x)=\sum_{j=1}^{z_n}\prod_{i=1}^{m_{n_j}}t_{n_{j_i}},$$ $\quad$for some $z_n \in \mathbb{N}$ and $m_{n_1}, \dots, m_{n_{z_n}} \in \mathbb{N}$, where each term $t_{n_{j_i}}$ is either of the form $f^{(m)}(g(x))$ or $g^{(m)}(x)$, with $m$ a non-negative integer.
Proof. We use induction for all 4 of the sections of this lemma.
$\quad$(a) For $n=1$, $(f+g)'(x) = f'(x) + g'(x)$. Suppose that $(f+g)^{(k)}(x)=f^{(k)}(x)+g^{(k)}(x)$ for some $k \in \mathbb{N}$. Then, $$(f+g)^{(k+1)}(x)=\left(f^{(k)}(x)\right)' + \left(g^{(k)}(x)\right)' = f^{(k+1)}(x) + g^{(k+1)}(x).$$
$\quad$(b) For $n=1$, $(f \cdot g)'(x) = f'(x)g(x) + f(x)g'(x)$. Suppose that for some $k \in \mathbb{N}$, there are sequences $\{\alpha_{k_i}\}$, $\{\beta_{k_i}\}$ of non-negative integers such that $(f\cdot g)^{(k)}(x)=\sum_{i=1}^{2^k} f^{\left(\alpha_{k_i}\right)}(x)\cdot g^{\left(\beta_{k_i}\right)}(x)$. Then, $$(f\cdot g)^{(k+1)}(x)=\sum_{i=1}^{2^k} \left[f^{\left(\alpha_{k_i}+1\right)}(x)\cdot g^{\left(\beta_{k_i}\right)}(x) + f^{\left(\alpha_{k_i}\right)}(x)\cdot g^{\left(\beta_{k_i}+1\right)}(x)\right].$$ Now, let $$\alpha_{\left(k+1\right)_i} = \begin{cases}
\alpha_{k_i}+1, \quad 1 \leq i \leq 2^k\\
\alpha_{k_{\left(i-2^k\right)}}, \quad 2^k + 1 \leq i \leq 2^{k+1},
\end{cases}$$

$$\beta_{\left(k+1\right)_i} = \begin{cases}
\beta_{k_i}, \quad 1 \leq i \leq 2^k\\
\beta_{k_{\left(i-2^k\right)}+1}, \quad 2^k + 1 \leq i \leq 2^{k+1}.
\end{cases}$$

Thus, $$(f\cdot g)^{(k+1)}(x) = \sum_{i=1}^{2^{k+1}}f^{\left(\alpha_{(k+1)_i}\right)}(x)\cdot g^{\left(\beta_{(k+1)_i}\right)}(x).$$
$\quad$(c) For $n=1$, $\left(\frac{f}{g}\right)'(x) = \frac{f'(x)g(x)-f(x)g'(x)}{g^2(x)}$. Suppose that for some $k \in \mathbb{N}$, $\left(\frac{f}{g}\right)^{(k)}(x) = \frac{F_k(x)}{g^{2^k}(x)}$, where $F_k$ is some infinitely differentiable function. Then, if we put $F_{k+1}(x) = F_k'(x)g^{2^k}(x)-2^kg^{2^k-1}(x)g'(x)F_k(x)$, we get the desired result that $$\left(\frac{f}{g}\right)^{(k+1)}(x) = \frac{F_{k+1}(x)}{g^{2^{k+1}}(x)}.$$
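(A spot-check of this recursion, not part of the proof: verifying $F_{k+1} = F_k'\,g^{2^k} - 2^k g^{2^k-1}g'F_k$ symbolically for $k=1,2$ with sample functions, assuming $g$ nonvanishing.)

```python
import sympy as sp

# Spot-check the Lemma 2(c) recursion for k = 1, 2 with sample f, g (g != 0).
x = sp.symbols('x')
f = sp.sin(x)
g = sp.exp(x) + 2

Fk = sp.diff(f, x) * g - f * sp.diff(g, x)  # F_1 from the quotient rule
for k in (1, 2):
    Fk = sp.diff(Fk, x) * g**(2**k) - 2**k * g**(2**k - 1) * sp.diff(g, x) * Fk
    print(sp.simplify(Fk / g**(2**(k + 1)) - sp.diff(f / g, x, k + 1)))  # 0
```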
$\quad$(d) For $n=1$, $(f \circ g)'(x) = f'(g(x))g'(x)$. Suppose that for some $k \in \mathbb{N}$, $(f\circ g)^{(k)}(x)=\sum_{j=1}^{z_k}\prod_{i=1}^{m_{k_j}}t_{k_{j_i}}$ for some $z_k \in \mathbb{N}$ and $m_{k_1}, \dots, m_{k_{z_k}} \in \mathbb{N}$, where each term $t_{k_{j_i}}$ is either of the form $f^{(m)}(g(x))$ or $g^{(m)}(x)$, with $m$ a non-negative integer. Then, $(f\circ g)^{(k+1)}(x)=\sum_{j=1}^{z_k}\left(\prod_{s=1}^{m_{k_j}}t_{k_{j_s}}\right)'$. We then prove by induction that
$$\bullet \quad \left(\prod_{i=1}^{m_{k_j}}t_{k_{j_i}}\right)' = \sum_{i=1}^{m}\left(\frac{t_{k_{j_i}}'}{t_{k_{j_i}}}\prod_{s=1}^{m_{k_j}}t_{k_{j_s}}\right) + \prod_{i=1}^{m}t_{k_{j_i}}\left(\prod_{i=m+1}^{m_{k_j}}t_{k_{j_i}}\right)',$$ for every $m,j \in \mathbb{N}$ such that $j \leq z_k$ and $m < m_{k_j}$.
For $m=1$, $\left(\prod_{i=1}^{m_{k_j}}t_{k_{j_i}}\right)' = t_{k_{j_1}}'\prod_{i=2}^{m_{k_j}}t_{k_{j_i}} + t_{k_{j_1}}\left(\prod_{i=2}^{m_{k_j}}t_{k_{j_i}}\right)'$. Suppose that $\bullet$ is true for some natural $m < m_{k_j}-1$. Then,
$$\begin{align*}
\left(\prod_{i=1}^{m_{k_j}}t_{k_{j_i}}\right)'
& = \sum_{i=1}^{m}\left(\frac{t_{k_{j_i}}'}{t_{k_{j_i}}}\prod_{s=1}^{m_{k_j}}t_{k_{j_s}}\right) + \prod_{i=1}^{m}t_{k_{j_i}}\left[t_{k_{j_{m+1}}}'\left(\prod_{i=m+2}^{m_{k_j}}t_{k_{j_i}}\right) + t_{k_{j_{m+1}}}\left(\prod_{i=m+2}^{m_{k_j}}t_{k_{j_i}}\right)'\right]\\
& = \sum_{i=1}^{m+1}\left(\frac{t_{k_{j_i}}'}{t_{k_{j_i}}}\prod_{s=1}^{m_{k_j}}t_{k_{j_s}}\right) + \prod_{i=1}^{m+1}t_{k_{j_i}}\left(\prod_{i=m+2}^{m_{k_j}}t_{k_{j_i}}\right)'.
\end{align*}$$

Then, if we put $m=m_{k_j}-1$ we get that
$$\left(\prod_{i=1}^{m_{k_j}}t_{k_{j_i}}\right)' = \sum_{i=1}^{m_{k_j}}\left(\frac{t_{k_{j_i}}'}{t_{k_{j_i}}}\prod_{s=1}^{m_{k_j}}t_{k_{j_s}}\right).$$
Since $t_{k_{j_i}}'$ is either of the form $f^{(u)}(g(x))g'(x)$ or $g^{(u)}(x)$, $u \in \mathbb{N}$,
$$
(f \circ g)^{(k+1)}(x) = \sum_{j=1}^{z_k}\sum_{i=1}^{m_{k_j}}\left(\frac{t_{k_{j_i}}'}{t_{k_{j_i}}}\prod_{s=1}^{m_{k_j}}t_{k_{j_s}}\right)
$$

can clearly be arranged in our desired form.
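(A concrete instance of (d) that can be checked symbolically, in case it helps to see the shape of the claim: for $n=2$, $(f\circ g)''(x) = f''(g(x))g'(x)^2 + f'(g(x))g''(x)$, a sum of two products of terms of the stated form. A minimal sketch with sample functions:)

```python
import sympy as sp

# Instance of Lemma 2(d) for n = 2, with outer function sin
# (so f' = cos and f'' = -sin, written out by hand) and a sample inner g.
x = sp.symbols('x')
g = sp.exp(x) + x

lhs = sp.diff(sp.sin(g), x, 2)
rhs = -sp.sin(g) * sp.diff(g, x)**2 + sp.cos(g) * sp.diff(g, x, 2)
print(sp.simplify(lhs - rhs))  # prints 0
```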

The main proof.

Yes, such an infinitely differentiable function exists. We shall construct an example. Let $E \subseteq \mathbb{R}$ be closed.

Defining $f$
For the trivial case that $E=\emptyset$ take $f(x)=1$.
Otherwise, we define $f$ by 3 cases:

$\quad$(I) If $x \in E$, $f(x) = 0$.
$\quad$(II) If $x \in (a,b)$, where $a,b \in E$ and $(a,b) \subseteq \mathbb{R} \setminus E$, define $$F_{a,b}(x) = \frac{\pi}{b-a}\left(x + \frac{3b-5a}{2}\right),$$
$\quad$and then $$f(x) = X_{a,b}\cos^{F_{a,b}(x)}\left(F_{a,b}(x)\right),$$
$\quad$where $X_{a,b}$ is some real constant depending on $a, b$, which for now is irrelevant. We shall define $X_{a,b}$ later in the proof, when its value becomes relevant.
$\quad$(III) If $E$ is bounded above and $M=\sup E$, then $f(x)=e^{-\frac{1}{(x-M)^2}}$ for $x > M$. Similarly, $f(x)=e^{-\frac{1}{(x-N)^2}}$ for $x < N$ if $E$ is bounded below and $N=\inf E$.

Clearly $f$ is defined for every $x \in E$. $f$ is also defined for every $x \in \mathbb{R} \setminus E$ because such $x$ have some neighborhood in $\mathbb{R} \setminus E$, otherwise $x$ would be a limit point of $E$ and therefore contained in $E$.

$E$ is the zero-set of $f$
To show that $E=Z(f)$, we need to show that $x \not\in E$ implies $f(x) \neq 0$. For $x$ in case (II)
$$* \quad \frac{3}{2}\pi < F_{a,b}(x) < \frac{5}{2}\pi, $$
so $\cos F_{a,b}(x) > 0$ and hence $f(x)>0$ (recall that $X_{a,b}$ will be chosen positive). For $x$ in case (III), clearly $f(x)>0$ as well.

$f$ is infinitely differentiable
We treat points of $E$ and points of $\mathbb{R} \setminus E$ separately.

Points of $\mathbb{R} \setminus E$: $\quad$ For $x$ in case (II), $f(x)=X_{a,b}\cos^{F_{a,b}(x)}\left(F_{a,b}(x)\right)$. To compute $f'(x)$, we first note that
$$F_{a,b}'(x)=\frac{\pi}{b-a}$$ and, by lemma 1,
$$\left(\cos^x(x)\right)' = \cos^x(x)\left(\ln\left(\cos x\right)-x\tan x\right).$$
We now prove by induction that for every $n \in \mathbb{N}$, $f^{(n)}(x)$ exists and $f^{(n)}(x) = f(x)\cdot f_n(x)$, where $f_n$ is some infinitely differentiable function. For $n=1$, $$f'(x) = f(x)\left[\ln\left(\cos F_{a,b}(x)\right) - F_{a,b}(x)\tan F_{a,b}(x)\right]\frac{\pi}{b-a},$$
and we denote $f_1(x) = \frac{\pi}{b-a}\left[\ln\left(\cos F_{a,b}(x)\right) - F_{a,b}(x)\tan F_{a,b}(x)\right]$. From $*$ it follows that $\cos F_{a,b}(x) > 0$, hence $\ln\cos F_{a,b}(x)$ is well defined and infinitely differentiable by lemma 2(d). Then, additionally using lemmas 2(a),(b),(d), we see that $f_1$ is infinitely differentiable. Suppose that for some $k \in \mathbb{N}$, $f^{(k)}(x)=f(x)f_k(x)$, where $f_k$ is some infinitely differentiable function. Then,
$$\begin{align*}
f^{(k+1)}(x)
&= f'(x)f_k(x) + f(x)f_k'(x)\\
&= f(x)(f_1(x)f_k(x) + f_k'(x)).
\end{align*}$$

Denoting $f_{k+1}(x) = f_1(x)f_k(x)+f_k'(x)$, it's clear from lemmas 2(a),(b) that $f_{k+1}$ is infinitely differentiable.
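(As a sanity check of the base step, not needed for the proof: one can verify $f' = f\cdot f_1$ symbolically on a sample interval, say $(a,b)=(0,1)$ with $X_{a,b}=1$; the constant cancels in any case.)

```python
import sympy as sp

# Spot-check f' = f * f_1 on the sample interval (a, b) = (0, 1), X_ab = 1.
x = sp.symbols('x')
a, b = 0, 1
F = sp.pi / (b - a) * (x + sp.Rational(3 * b - 5 * a, 2))
f = sp.cos(F) ** F

f1 = sp.pi / (b - a) * (sp.log(sp.cos(F)) - F * sp.tan(F))
print(sp.simplify(sp.diff(f, x) - f * f1))  # prints 0
```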
$\quad$ For $x$ in case (III), suppose without loss of generality that $E$ is bounded above and $x > M$. Then, since $e^x$ and $-\frac{1}{(x-M)^2}$ are infinitely differentiable (for $x \neq M$) we can use lemma 2(d) to get that $f(x)$ is infinitely differentiable.

Points of $E$: We shall describe the area "to the left" and "to the right" of $x$. Given $d > 0$, we denote $l_d=(x-d,x)$ and $r_d=(x,x+d)$. Then, we claim that there is some $\varepsilon_0>0$, such that at least one of the following must be true:

$\quad$(1)$\quad$ $l_{\varepsilon_0} \subseteq E$.
$\quad$(2)$\quad$ $l_{\varepsilon_0} \subseteq \mathbb{R} \setminus E$.
$\quad$(3)$\quad$ For all $0<\varepsilon\leq \varepsilon_0$, there's some $t \in l_\varepsilon$ such that $t \in E$.

And the same goes for "the right side" of $x$. To see that at least one of these options must hold for some $\varepsilon_0$: if (3) fails for a given $\varepsilon_0$, then by definition there is some $0<\varepsilon\leq\varepsilon_0$ with $l_\varepsilon \cap E = \emptyset$, i.e. $l_\varepsilon \subseteq \mathbb{R}\setminus E$, so (2) holds with $\varepsilon$ in place of $\varepsilon_0$.

$\quad$ We shall prove that for every non-negative integer $n$, $f^{(n)}(t) \to 0$ as $t \to x$. Then we use it to prove by induction that $f^{(n)}(x)=0$ and therefore $f^{(n)}(x)$ exists. Here is this induction proof:

For $n=0$ we already know that $f(x)=0$. Suppose that for some non-negative integer $k$, $f^{(k)}(x)=0$. Then, since $f^{(k)}(t) \to 0$ as $t \to x$, $f^{(k)}$ is continuous at $x$; and since $f^{(k+1)}$ exists at the points of $\mathbb{R} \setminus E$, $f^{(k)}$ is continuous there as well. Then, since $f^{(k+1)}(t) \to 0$ as $t \to x$, it follows from Ex. 5.9 that $f^{(k+1)}(x)=0$.

$\quad$If (1) is true, then $f$ is constantly $0$ on $l_{\varepsilon_0}$, and therefore so is $f^{(n)}$ for every $n \in \mathbb{N}$. Thus, for every non-negative integer $n$, $f^{(n)}(t) \to 0$ as $t \to x$ for $t \in l_{\varepsilon_0}$. This argument can of course be applied to $r_{\varepsilon_0}$ as well.
$\quad$If (2) is true, then either $x=N=\inf E$ or $x=b$ for some $(a,b)$ as in case (II).
If $x=b$, denote $w(z)=X_{a,b}\cos^{F_{a,b}(z)}\left(F_{a,b}(z)\right)$ for $z \in [a,b]$ (note that $w=f$ on $[a,b]$). Then, since $w$ is continuous on $[a,b]$ (a composition of continuous functions is continuous), $\lim_{t \to x}f(t) = 0$ for $t \in l_{\varepsilon_0}$. Since $w^{(n)}(t) = w(t)\cdot f_n(t)$ and $w^{(n+1)}(t)$ exists for every non-negative integer $n$ and every $t \in [a,b]$, it follows that $w^{(n)}$ is continuous on $[a,b]$, therefore $f^{(n)}(t) \to w^{(n)}(x) = 0$ as $t \to x$ for $t \in l_{\varepsilon_0}$.
If $x=N$, then $f(t)=e^{-\frac{1}{(t-N)^2}}$ for $t \in l_{\varepsilon_0}$. We show that $f^{(n)}(t) = f(t)\cdot p_n(t)$, where $p_n(t)$ is some polynomial in $\frac{1}{t-N}$. We use induction: For $n=1$, $f'(t) = f(t)\cdot 2\left(\frac{1}{t-N}\right)^3$. Suppose that for some $k \in \mathbb{N}$, $f^{(k)}(t) = f(t)p_k(t)$, where $p_k(t) = \sum_{j=0}^{i}c_j\left(\frac{1}{t-N}\right)^j$, $i \in \mathbb{N}$ and $c_0, \dots, c_i \in \mathbb{R}$. Then,
$$\begin{align*}
f^{(k+1)}(t)
&= f(t)2\left(\frac{1}{t-N}\right)^3\sum_{j=0}^{i}c_j\left(\frac{1}{t-N}\right)^j - f(t)\sum_{j=0}^{i}jc_j\left(\frac{1}{t-N}\right)^{j+1}\\
&= f(t)\underbrace{\left[\sum_{j=0}^{i}2c_j\left(\frac{1}{t-N}\right)^{j+3} - \sum_{j=0}^{i}jc_j\left(\frac{1}{t-N}\right)^{j+1}\right]}_{\text{clearly a polynomial in } \frac{1}{t-N}}
\end{align*}$$

Now, if we prove that $\lim_{t \to N}\frac{f(t)}{|t-N|^m} = 0$, for every $m \in \mathbb{N}$, then clearly $f^{(n)}(t) \to 0$ as $t \to x$ for $t \in l_{\varepsilon_0}$.
$$
\lim_{t \to N}\frac{f(t)}{|t-N|^m} = \lim_{t \to N}\frac{e^{-\frac{1}{|t-N|^2}}}{|t-N|^m} = \lim_{h \to 0^+}\frac{e^{-\frac{1}{h^2}}}{h^m} = \lim_{n \to \infty}\frac{n^{\frac{m}{2}}}{e^n},
$$
where we substituted $h=|t-N|$ and then $n = \frac{1}{h^2}$; note that $n$ here is a continuous variable tending to $\infty$, so l'Hôpital's rule below applies.

If $m=2z$, for some $z \in \mathbb{N}$, we apply l'Hôpital's rule $z$ times and get that
$$\lim_{t \to N}\frac{f(t)}{|t-N|^m} = \lim_{n \to \infty}\frac{z!}{e^n} = 0.$$
If $m=2z-1$, for some $z \in \mathbb{N}$, we apply l'Hôpital's rule $z$ times and get that
$$\lim_{t \to N}\frac{f(t)}{|t-N|^m} =
\lim_{n \to \infty}\frac{\left(\frac{m}{2}\right)\left(\frac{m}{2}-1\right)\cdots\frac{1}{2}}{\sqrt{n}e^n} = 0.$$

In a very similar way we can prove this for $r_{\varepsilon_0}$ when either $x=a$ or $x=M=\sup E$.
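(One can also confirm the computation above symbolically; e.g. with $N=0$, the following checks that the first few derivatives of $e^{-1/t^2}$ all tend to $0$ as $t \to 0^+$.)

```python
import sympy as sp

# With N = 0: each derivative of e^{-1/t^2} is f(t) times a polynomial in 1/t,
# and tends to 0 as t -> 0+.
t = sp.symbols('t')
f = sp.exp(-1 / t**2)
for n in range(1, 5):
    print(n, sp.limit(sp.diff(f, t, n), t, 0, '+'))  # 0 each time
```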

The missing piece from the proof

The last thing left to prove, which I didn't manage to prove, is that $f^{(n)}(t) \to 0$ as $t \to x$ for points $x \in E$ for which (3) is true. I shall show how I tried to do it and what I did manage to show, since it could perhaps be useful:
$\quad$Suppose (3) is true (but (1) is false). We now define $X_{a,b}$. Let $n_{a,b} \in \mathbb{N}$ be the smallest integer such that $\frac{1}{n_{a,b}} < b-a$. Since $f_n$ is differentiable on $[a,b]$, $f_n$ is continuous on $[a,b]$, so by Theorem 4.15 there is some $M_n \in \mathbb{R}$ such that $\left|f_n(t)\right| \leq M_n$ for all $t \in [a,b]$ (notice that $f_n$ is independent of the value of $X_{a,b}$). Similarly, there is some $M_* \in \mathbb{R}$ such that $\left|\cos^{F_{a,b}(t)}\left(F_{a,b}(t)\right)\right| \leq M_*$ for all $t \in [a,b]$. Put $M_{a,b} = \max \{M_n \mid n \leq n_{a,b}\}$. Finally, we define
$$X_{a,b} = \frac{1}{n_{a,b}M_*M_{a,b}}.$$
We shall now use this definition of $X_{a,b}$ to show that for every $d > 0$ there is some $\delta > 0$ such that if $t \in l_\delta$ and $t \not\in E$, then $|f^{(n)}(t)|< d$. Given $d > 0$, by (3) there is some $x_d \in l_d$ such that $x_d \in E$. Put $\delta = x - x_d$. If $t \in l_\delta$ and $t \not\in E$, then $t \in (a,b)$ as in (II), where $b-a \leq \delta$. Thus
$$
|f^{(n)}(t)|=
X_{a,b}|f_n(t)|\left|\cos^{F_{a,b}(t)}\left(F_{a,b}(t)\right)\right|
\leq \frac{1}{n_{a,b}M_*M_{a,b}}M_*M_{a,b}
= \frac{1}{n_{a,b}}
< b-a \leq \delta
< d
.$$

For points $t$ for which (1) is true we know that $f^{(n)}(t) = 0$, so there's no problem there. But the problem is points for which only (3) is true. We don't know that the derivatives there are 0 (we only believe they're supposed to be 0), so how can we prove this? It seems a bit circular.

What I'm Asking For

I have a few questions / things I want to get in an answer to this question:

  1. General assessment / critique of the incomplete proof (ignoring the missing piece): I would like your opinion on every aspect of this proof: soundness, rigor, style, clarity, and anything else you can think of.
    And most importantly: is there some big, unfixable mistake in this proof?
  2. Do you have a proposal to complete the missing piece? If needed, feel free to change the definition of the coefficient $X_{a,b}$ as you see fit. If you think there's no way to fill this gap and this construction simply doesn't work, please don't suggest a completely different solution to this exercise; I can look for solutions online, that's not a problem. Instead, please prove that this construction can't work for any choice of $X_{a,b}$.
  3. How long should such a question take a first-time analysis student to solve? Does it make sense that a single exercise would take so much time and effort, even when we're talking about Rudin?

Best Answer

Unfortunately, I think your proof will not work with your choice of function $f.$

To see this, consider the case $E=\mathbb{R}\setminus(0,1)$, so $a=0$, $b=1$, and on $(0,1)$ your function becomes $$f(x)=\sin(\pi x)^{\pi x+ 3\pi/2}$$ (I set the constant $X_{a,b}=1$; $f$ is infinitely differentiable if and only if $X_{a,b}^{-1}f$ is infinitely differentiable). The function vanishes on $E$, so all the left derivatives at $0$ are $0$. Now assume towards a contradiction that $f$ is infinitely differentiable; then, by continuity, we must have $f^{(n)}(0)=0$ for all $n\in\mathbb{N}$. Using the Taylor expansion with the first five derivatives at $0$, we get $f(x)=o(|x|^5)$. Now observe that for some small $\delta>0$ and $x\in(0,\delta)$ we have the inequalities $$f(x)=\sin(\pi x)^{\pi x+ 3\pi/2}\geq (\pi x/2)^{\pi x+3\pi/2}\geq(\pi x/2)^{4.8},$$ which contradicts $f(x)=o(|x|^5)$.
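(A quick numerical illustration of this obstruction, if it helps: the ratio $f(x)/x^5$ would tend to $0$ if $f(x)=o(|x|^5)$, but it visibly blows up as $x \to 0^+$.)

```python
import numpy as np

# f(x) = sin(pi x)^(pi x + 3 pi / 2) vanishes only to order ~4.71 at 0,
# so f(x) / x^5 -> infinity rather than 0.
for x in (1e-2, 1e-4, 1e-6, 1e-8):
    fx = np.sin(np.pi * x) ** (np.pi * x + 1.5 * np.pi)
    print(x, fx / x**5)  # the ratio grows as x shrinks
```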

The same argument works for arbitrary $a$ and $b$: you have constructed $F_{a,b}$ so that $F_{a,b}(a)=3\pi/2$, and so your $f$ vanishes only to order $3\pi/2$ at $a$.

I have not looked at where exactly the error is in your proof that $f$ is infinitely differentiable, but I think your idea for $f$ can work. If you take $$f(x)=e^{-1/(x-a)^2}e^{-1/(x-b)^2}$$ on each interval $(a,b)$ of the complement $\mathbb{R}\setminus E$, you should get a smooth $f$, and the calculations should become a bit easier.
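(For what it's worth, a quick symbolic check of this suggestion on the sample interval $(a,b)=(0,1)$: the first few derivatives of $e^{-1/x^2}e^{-1/(x-1)^2}$ tend to $0$ at both endpoints, which is exactly what the gluing argument needs.)

```python
import sympy as sp

# Suggested replacement on (a, b) = (0, 1): derivatives vanish at both ends.
x = sp.symbols('x')
g = sp.exp(-1 / x**2) * sp.exp(-1 / (x - 1)**2)
for n in range(3):
    dn = sp.diff(g, x, n)
    print(n, sp.limit(dn, x, 0, '+'), sp.limit(dn, x, 1, '-'))  # all 0
```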
