Q1: The answer is no. If $g''(a)>0$ for some $a\in [0,1],$ then $g''>0$ in a neighborhood of $a$ by the continuity of $g''.$ Hence $g$ is strictly convex in that neighborhood. Similarly, if $g''(a)<0$ for some $a\in [0,1],$ then $g$ is strictly concave in that neighborhood. We're left with the case $g''\equiv 0.$ But this implies $g(x) = cx + d$ on $[0,1],$ hence $g$ is both convex and concave everywhere on $[0,1].$
Added later, in answer to the comment: It's actually possible for a $C^2$ function to have uncountably many inflection points. Suppose $K\subset [0,1]$ is uncountable, compact, and has no interior (the Cantor set is an example). Define
$$f(x)=\begin{cases}d(x,K)\sin (1/d(x,K)),&x\notin K\\ 0,& x\in K\end{cases}$$ Then $f$ is continuous, and $f$ takes on positive and negative values on any interval containing a point of $K.$ Define
$$g(x)=\int_0^x\int_0^t f(s)\,ds\,dt.$$
Then $g\in C^2[0,1]$ and $g''=f.$ It follows that every point of $K$ is an inflection point of $g.$
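This construction can be probed numerically. Below is a small sketch (all names are mine) that approximates $d(x,K)$ for the middle-thirds Cantor set by the distance to the endpoints of its level-$m$ intervals — every such endpoint belongs to the Cantor set — and checks that $f$ takes both signs on a tiny interval touching the point $0 \in K$:

```python
import numpy as np

def cantor_endpoints(level):
    """Endpoints of the level-`level` intervals of the middle-thirds
    Cantor set; every endpoint lies in the Cantor set itself."""
    intervals = [(0.0, 1.0)]
    for _ in range(level):
        nxt = []
        for a, b in intervals:
            third = (b - a) / 3.0
            nxt.append((a, a + third))
            nxt.append((b - third, b))
        intervals = nxt
    return np.sort(np.array(intervals).ravel())

def f(x, pts):
    """f(x) = d(x, K) * sin(1 / d(x, K)), with d(x, K) approximated by the
    distance to the finite endpoint set `pts` (exact whenever x lies in a
    gap removed by this level)."""
    i = int(np.searchsorted(pts, x))
    d = min(abs(x - pts[max(i - 1, 0)]), abs(x - pts[min(i, len(pts) - 1)]))
    return 0.0 if d == 0.0 else d * np.sin(1.0 / d)

pts = cantor_endpoints(15)
rng = np.random.default_rng(0)
# sample on a tiny interval whose left endpoint 0 belongs to the Cantor set
samples = [f(x, pts) for x in rng.uniform(0.0, 1e-3, 5000)]
print(min(samples) < 0 < max(samples))  # both signs occur near a point of K
```

For any sample point lying in a gap removed at level $\leq 15$, the approximated distance is exact, so the sign oscillation seen here is genuine and not an artifact of the truncation.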
The idea is the following: look at the definition of an extremum. A point $x_0$ is an extremum if it is either a local minimum or a local maximum.
In other words, in a small enough neighbourhood of $x_0$, the quantity $f(x) - f(x_0)$ keeps the same sign: if it is nonnegative, then $f(x_0)$ is a minimum (because that's the same as saying that $f(x) \geq f(x_0)$ on the neighbourhood); otherwise it is a maximum. If either of these happens, then $x_0$ is an extremum.
Therefore, to show that a point is (or is not) an extremum, one merely needs to study the quantity $f(x) - f(x_0)$ in a small enough neighbourhood of $x_0$. Essentially, we must study the sign of $f(x) - f(x_0)$: if it changes sign, then $x_0$ is not an extremum; if it doesn't, then depending on which sign it takes we have a local minimum or maximum.
And so, on to the study of $f(x) - f(x_0)$.
You have mentioned that you know Taylor's theorem. Therefore, under the given conditions on the derivatives (all derivatives of order below $n$ vanish at $x_0$), if I write the expansion $f(x) = f(x_0) + \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n + R_n(x,x_0)$ you should be comfortable with each of the terms, and acknowledge the fact that $\lim_{x \to x_0} \frac{R_n(x,x_0)}{(x-x_0)^n} = 0$.
Therefore, $f(x) - f(x_0) = \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n + R_n(x,x_0)$. The left hand side is what we want to study : with this equality we may now study the sign changes in the right hand side!
For convenience, let us perform the following rewrite of the right hand side:
$$
\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n + R_n(x,x_0) = (x-x_0)^n\left(\frac{f^{(n)}(x_0)}{n!}+\frac{R_n(x,x_0)}{(x-x_0)^n}\right)
$$
The sign of a product is the product of the signs! Therefore, we are reduced to studying the sign of each of the two factors on the right hand side.
For $(x-x_0)^n$, the sign depends on the parity of $n$: if $n$ is even, then this is positive for all $x \neq x_0$. If $n$ is odd, then it is positive if and only if $x > x_0$. To finish the analysis, note that any neighbourhood of $x_0$ contains points on both sides of $x_0$, so the conclusion is: if $n$ is odd, this factor changes sign at $x_0$; if $n$ is even, it does not.
For the other factor, we note that $\lim_{x \to x_0} \frac{R_n(x,x_0)}{(x-x_0)^n}=0$ by Taylor's theorem. Adding in the $f^{(n)}$ part (it's just a constant depending on $x_0$) we get:
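A quick numerical illustration of this parity observation (the base point and offsets are arbitrary):

```python
x0 = 2.0  # an arbitrary base point
for n in (2, 3):
    left = (x0 - 0.1 - x0) ** n   # a point just to the left: (negative)^n
    right = (x0 + 0.1 - x0) ** n  # a point just to the right: (positive)^n
    print(f"n = {n}: sign change across x0? {(left < 0) != (right < 0)}")
    # n = 2 prints False (no sign change); n = 3 prints True
```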
$$
\lim_{x \to x_0} \left(\frac{f^{(n)}(x_0)}{n!} + \frac{R_n(x,x_0)}{(x-x_0)^n}\right) = \frac{f^{(n)}(x_0)}{n!}
$$
So we have a limit. Which is excellent, and you'll see why.
In your head, imagine a wild sequence, but still one converging to a positive number, say $0.01$. Can you prove rigorously that it must be positive from some point on?
The answer would roughly be this: if the sequence converges to $0.01$, then we can make it as close to $0.01$ as we want by going far enough along the sequence! And once the terms are very close to $0.01$ (say within $0.005$ of it), they must be positive, because we have dragged them to this side of $0$ by going far enough along.
The same thing happens here: note that $\frac{f^{(n)}(x_0)}{n!}$ is not zero, so it has a definite sign. The limit ensures that the quantity whose limit is being taken also gets that same sign once $x$ is close enough to $x_0$.
The proof of this fact I leave you to decipher from either the other answer or from the explanation above. Or fine, take $\epsilon = \frac 12\left|\frac{f^{(n)}(x_0)}{n!}\right|$ and see what you get in the definition of the limit.
To conclude, the second factor we wanted to study takes on the same sign as $f^{(n)}(x_0)$ (since $n!$ is always positive) in a small enough neighbourhood of $x_0$. So it won't change sign!
And now we can finish:
If $n$ is odd, then the first factor changes sign but the second doesn't: so in total there is a sign change at $x_0$, and $x_0$ is not an extremum.
If $n$ is even, then neither factor changes sign, so $f(x) - f(x_0)$ doesn't change sign, giving that $x_0$ is an extremum. Of course, $f(x) - f(x_0)$ then has the same sign as $f^{(n)}(x_0)$: so if $f^{(n)}(x_0)$ is positive, this is positive and we have a local minimum. Else, we have a local maximum!
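For polynomials around $x_0 = 0$, the whole test can be sketched in a few lines, using the fact that $f^{(n)}(0) = n!\,a_n$ for $f(x) = \sum_k a_k x^k$ (the function name is mine):

```python
def higher_derivative_test(coeffs):
    """Higher-order derivative test at x0 = 0 for the polynomial
    f(x) = coeffs[0] + coeffs[1]*x + coeffs[2]*x^2 + ...
    Uses f^(n)(0) = n! * coeffs[n], which shares the sign of coeffs[n]."""
    for n in range(1, len(coeffs)):
        if coeffs[n] != 0:  # first nonvanishing derivative is the n-th
            if n % 2 == 1:
                return "not an extremum"
            return "local minimum" if coeffs[n] > 0 else "local maximum"
    return "inconclusive"  # every derivative vanishes (f is constant)

print(higher_derivative_test([0, 0, 0, 1]))     # x^3      -> not an extremum
print(higher_derivative_test([0, 0, 0, 0, 1]))  # x^4      -> local minimum
print(higher_derivative_test([5, 0, -1]))       # 5 - x^2  -> local maximum
```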
Best Answer
(1) For a counterexample to the converse of (a), just take $f(x)=x^3$.
(2) The example given shows that it doesn't make sense to speak of $f'$ "changing sign" at the local minimum. The cases where it makes sense to speak of a sign change are those where $f'$ has constant sign on left- and right-sided deleted neighborhoods.
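For (2), a standard example of a local minimum at which $f'$ does not have constant sign on either side (possibly the example referenced) is $f(x) = x^2\left(2 + \sin(1/x)\right)$ with $f(0) = 0$; a quick numerical check:

```python
import math

def f(x):
    # x^2 * (2 + sin(1/x)), extended by f(0) = 0; since f(x) >= x^2 > 0
    # for x != 0, the point 0 is a strict local minimum
    return 0.0 if x == 0.0 else x * x * (2 + math.sin(1 / x))

def fprime(x):
    # derivative for x != 0; the -cos(1/x) term oscillates and dominates near 0
    return 2 * x * (2 + math.sin(1 / x)) - math.cos(1 / x)

xs = [1e-4 * (k + 1) / 1000 for k in range(1000)]  # right-sided deleted neighbourhood
assert all(f(x) > f(0.0) for x in xs)              # 0 is indeed a minimum
derivs = [fprime(x) for x in xs]
print(min(derivs) < 0 < max(derivs))  # f' takes both signs arbitrarily close to 0
```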