No, your point is right. It's a bit mysterious why Mac Lane writes $F\mu c=\tau c$, as clearly the needed condition is $F\mu=\tau$, a condition which implies in particular that $FT_0=S$ and $FT_1=T$, using Mac Lane's notation, at least in the first edition. Mac Lane even writes the correct condition at the end of the paragraph.
Note that you reproduced some of the construction slightly incorrectly. $\mu$ is the function that assigns $(x,\uparrow)$ to each $x\in \mathcal C$, not to each $(x,\delta)$, because $\mu$ is a natural transformation with domain $\mathcal C$. Thus the claim you starred should also have been that $F(\mu(x))=\tau x$, not $F(\mu(x,\delta))=\tau x$.
Your proof is fine as far as it goes, namely in establishing the bijection but not yet naturality. I don't know whether it is intentional, but you don't actually show surjectivity; you only state what you would need to do. Rather than proving the bijection via injectivity and surjectivity, I would recommend simply writing down the inverse explicitly, i.e. treating a bijection as an isomorphism in $\mathbf{Set}$. This is typically more useful and leads to pleasantly calculational proofs.
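For instance, in the $\mathbf{Set}$ analogue of the exercise, "writing the inverse" amounts to exhibiting currying and uncurrying as mutually inverse maps. A minimal Python sketch (the names `curry`/`uncurry` are mine, not Mac Lane's):

```python
def curry(f):
    """Send f : A x B -> C to its transpose A -> C^B."""
    return lambda a: lambda b: f(a, b)

def uncurry(g):
    """The explicit inverse: send g : A -> C^B back to A x B -> C."""
    return lambda a, b: g(a)(b)

# Round-trip checks on sample morphisms in Set:
f = lambda a, b: a + 2 * b
g = curry(f)
assert uncurry(curry(f))(3, 4) == f(3, 4)
assert curry(uncurry(g))(3)(4) == g(3)(4)
```

The point is that the proof of "bijection" is then just the two round-trip equations, each a one-line calculation.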
I'm pretty sure Mac Lane introduces the notation $\mathcal{C}(X,Y)$ generically for any category $\mathcal{C}$ to stand for the hom-set. I'm also fairly certain that Mac Lane defines what the arrows of $\mathbf{Cat}$ are, namely functors. For broader context, though I don't think it comes up in Mac Lane: $\mathbf{Cat}$, particularly when defined to consist of all locally small categories and not just small categories, is an archetypal example of a 2-category, and so often $\mathbf{Cat}(\mathcal C,\mathcal D)$ will denote the hom-category of the 2-category $\mathbf{Cat}$. Again, that's not what is happening here.
A left adjoint is the dual concept to a right adjoint, but a functor that has a right adjoint is a left adjoint (and vice versa), so a definition of right adjoint is also a definition of left adjoint. In any case, you don't need to worry about this right now.
Mac Lane defined the action of hom-functors on arrows, i.e. $\mathcal C(f,B)(h) = h\circ f$ and $\mathcal C(A,g)(h)=g\circ h$. I'm pretty sure he defined the action of $\times$ as a (bi)functor, though you could likely guess it correctly anyway. Presumably, he defined the action of $C^B$ in each argument (though this is somewhat redundant). The relevant functors are just compositions of these. You have mostly gotten it, except that you should write $T_1(F)(a,b)$, not $T_1(F(a,b))$. $T_1$ (and similarly $T_2$) is a function that takes a functor $F$ and outputs a functor; $F(a,b)$ is an object, so $T_1(F(a,b))$ would imply that $T_1$ takes an object. You know these are the right choices because they are precisely the definitions of the actions of the relevant functors on arrows.
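In $\mathbf{Set}$ these arrow actions of the hom-functors are just pre- and postcomposition, which can be spelled out directly. A hedged Python sketch (the function names are my own):

```python
def compose(g, f):
    """Composition g . f of functions."""
    return lambda x: g(f(x))

def hom_contra(f):
    """C(f, B): the contravariant action, h |-> h . f (precomposition)."""
    return lambda h: compose(h, f)

def hom_cov(g):
    """C(A, g): the covariant action, h |-> g . h (postcomposition)."""
    return lambda h: compose(g, h)

f = lambda x: x + 1
g = lambda y: 10 * y
h = lambda x: x * x
assert hom_contra(f)(h)(3) == h(f(3))  # (h . f)(3) == 16
assert hom_cov(g)(h)(3) == g(h(3))     # (g . h)(3) == 90
```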
The point of the exercise is presumably to start getting you familiar with proving things like this and to start spelling out the properties of $C^B$. The point of the theorem is that it shows that $\mathbf{Cat}$ is cartesian closed (well, technically you also need the existence of a terminal object), which is indeed basically the statement that $-\times B$ is a left adjoint for all $B$. Adjoint functors are one of the most important and ubiquitous concepts in category theory, and knowing that something is a left/right adjoint immediately entails a lot of other nice properties. This will be elaborated on throughout the book, as almost all results relate, implicitly or explicitly, to adjoint functors. One particularly important aspect is that left/right adjoints are determined up to unique isomorphism; more specifically, given the action on objects of a functor that is left/right adjoint to a given functor, you can calculate what its action on arrows must be. This means that adjoint functors have a kind of definitional power. What an exponent is, i.e. what the notation $(-)^B$ means in general, is exactly that $(-)^B$ is right adjoint to $(-)\times B$. Indeed, a cartesian closed category is exactly a category equipped with three functors whose only relationships are given by adjunctions.
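As a small illustration of this "definitional power" in $\mathbf{Set}$: given only the object action of $(-)^B$ and the evaluation map, the adjunction forces the arrow action of $(-)^B$ to be the transpose of $g \circ \mathrm{ev}$, which works out to postcomposition. A Python sketch under those assumptions (all names mine):

```python
def ev(k, b):
    """Evaluation map ev : C^B x B -> C."""
    return k(b)

def exp_on_arrows(g):
    """Arrow action of (-)^B on g : C -> C', forced by the adjunction
    to be the transpose of g . ev; pointwise this is k |-> g . k."""
    return lambda k: lambda b: g(ev(k, b))

g = lambda c: c + 100   # an arrow g : C -> C'
k = lambda b: 2 * b     # a point of C^B
assert exp_on_arrows(g)(k)(5) == g(k(5))  # both equal 110
```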
Finally, I should say that I don't actually recommend Categories for the Working Mathematician, especially as an introduction. It is not very well organized or systematic, and, as the title suggests, it is not aimed at a "student". It also omits or only sketchily covers content that I think is very important, though admittedly this is a flaw in virtually all introductions currently. There are many freely available or otherwise reasonably readily available introductions. I recommend Awodey's Category Theory and Barr and Wells' ESSLLI notes. But I really recommend just reading multiple sources and getting multiple perspectives. If one book seems not to be working for you (though you should expect a good amount of struggle regardless of the book), then try another and return to the first later. I think it is helpful to read Mac Lane at some point, but it is probably better read after going through some other introduction. The main gap of virtually all of these resources is a discussion of (co)ends which is well handled by Fosco Loregian's notes, but those assume familiarity with basic category theory.
Best Answer
Let's show naturality in all three arguments. We can verify this directly from the definitions. I will use equations instead of commutative diagrams (both approaches are valid; I present this one here since it is less well known but has some advantages of its own).
Let me denote the projection onto the $i$th factor by $p_i$, regardless of which product is used (this will always be clear from the context).
Then $\tau : (a \times b) \times c \to a \times (b \times c)$ is defined by the three equations $$\begin{align*} p_1 \tau &= p_1 p_1 \\ p_1 p_2 \tau &= p_2 p_1 \\ p_2 p_2 \tau &= p_2\end{align*}$$ This notation is terse, but unambiguous nonetheless. For example, $p_1 p_2 \tau = p_2 p_1$ means that
$$(a \times b) \times c \xrightarrow{\tau} a \times (b \times c) \xrightarrow{p_2} b \times c \xrightarrow{p_1} b$$
is equal to
$$(a \times b) \times c \xrightarrow{p_1} a \times b \xrightarrow{p_2} b.$$
The three equations define $\tau$ completely by the universal property (= definition) of products in a category: the first defines $p_1 \tau$, while the second and third together define $p_2 \tau$; hence we obtain $\tau$.
An even more concise definition of $\tau$ uses generalized elements: the equations above then translate to the usual formula $\tau((x,y),z)=(x,(y,z))$, which makes everything here trivial, including the construction of $\tau$ itself. For the sake of learning, though, let's ignore this path for now; I just wanted to mention it.
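Since the generalized-element formula is exactly the familiar one in $\mathbf{Set}$, the three defining equations can be sanity-checked pointwise there. A minimal Python sketch (names are mine):

```python
def tau(p):
    """Associator in Set: ((x, y), z) |-> (x, (y, z))."""
    (x, y), z = p
    return (x, (y, z))

def tau_inv(p):
    """Explicit inverse: (x, (y, z)) |-> ((x, y), z)."""
    x, (y, z) = p
    return ((x, y), z)

p1 = lambda t: t[0]
p2 = lambda t: t[1]

p = ((1, 2), 3)
# The three defining equations, checked pointwise:
assert p1(tau(p)) == p1(p1(p))
assert p1(p2(tau(p))) == p2(p1(p))
assert p2(p2(tau(p))) == p2(p)
# tau is an isomorphism with explicit inverse:
assert tau_inv(tau(p)) == p
```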
If $f : a \to a'$, $g : b \to b'$, $h : c \to c'$ are morphisms, then $$(f \times g) \times h : (a \times b) \times c \to (a' \times b') \times c'$$ is defined by the three equations $$\begin{align*} p_1 p_1 ((f \times g) \times h) & = f p_1 p_1\\ p_2 p_1 ((f \times g) \times h) & = g p_2 p_1\\ p_2 ((f \times g) \times h) & = h p_2 \end{align*}$$
The definition of $f \times (g \times h)$ is completely analogous and also consists of three equations, which I omit here. The isomorphism $\tau'$ is defined in the same way as $\tau$, just with $a',b',c'$ instead of $a,b,c$.
The claimed naturality is just the equation
$$(f \times (g \times h)) \circ \tau = \tau' \circ ((f \times g) \times h)$$ of morphisms $(a \times b) \times c \to a' \times (b' \times c')$.
As before it suffices to post-compose the morphisms with $p_1$ and $p_1 p_2$ and $p_2 p_2$ to prove their equality (again, by the universal property, i.e. definition of products).
So we have to verify three equations. And guess what, they of course follow from our definitions / equations before! It is just a matter of writing it down. The computation is so straightforward that it could even be done (and formalized) by computer algebra software these days.
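Indeed, in $\mathbf{Set}$ the whole naturality square can be checked pointwise by machine, using the generalized-element formula for $\tau$. A small Python sketch (all names mine):

```python
def tau(p):
    """Associator in Set: ((x, y), z) |-> (x, (y, z))."""
    (x, y), z = p
    return (x, (y, z))

def prod(f, g):
    """Product of morphisms: (f x g)(x, y) = (f(x), g(y))."""
    return lambda p: (f(p[0]), g(p[1]))

f = lambda x: x + 1
g = lambda y: 2 * y
h = lambda z: z - 3

# Naturality: (f x (g x h)) . tau == tau' . ((f x g) x h),
# where tau' is the same formula at the primed objects.
lhs = lambda p: prod(f, prod(g, h))(tau(p))
rhs = lambda p: tau(prod(prod(f, g), h)(p))

p = ((10, 20), 30)
assert lhs(p) == rhs(p) == (11, (40, 27))
```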
Let us denote the defining equations of $\tau$, $(f \times g) \times h$, and $f \times (g \times h)$ by $\mathrm{D1}$, $\mathrm{D2}$, $\mathrm{D3}$, respectively, and the defining equations of $\tau'$ by $\mathrm{D1'}$.
First equation ($p_1$):
$$\begin{align*} p_1 \, (f \times (g \times h)) \, \tau & \stackrel{\mathrm{D3}}{=} f p_1 \tau \\ & \stackrel{\mathrm{D1}}{=} f p_1 p_1 \\ & \stackrel{\mathrm{D2}}{=} p_1 p_1 ((f \times g) \times h) \\ & \stackrel{\mathrm{D1'}}{=} p_1 \tau' ((f \times g) \times h) \end{align*}$$
Second equation ($p_1 p_2$):
$$\begin{align*} p_1 p_2 (f \times (g \times h)) \, \tau & \stackrel{\mathrm{D3}}{=} g p_1 p_2 \tau \\ & \stackrel{\mathrm{D1}}{=} g p_2 p_1 \\ & \stackrel{\mathrm{D2}}{=} p_2 p_1 ((f \times g) \times h) \\ & \stackrel{\mathrm{D1'}}{=} p_1 p_2 \tau' ((f \times g) \times h) \end{align*}$$
Third equation ($p_2 p_2$):
$$\begin{align*} p_2 p_2 (f \times (g \times h)) \, \tau & \stackrel{\mathrm{D3}}{=} h p_2 p_2 \tau \\ & \stackrel{\mathrm{D1}}{=} h p_2 \\ & \stackrel{\mathrm{D2}}{=} p_2 ((f \times g) \times h) \\ & \stackrel{\mathrm{D1'}}{=} p_2 p_2 \tau' ((f \times g) \times h) \end{align*}$$