As @Did already pointed out, it is much easier to show that
$$a W_t - \frac{1}{2} a^2 t \to - \infty \qquad \text{almost surely as $t \to \infty$.}$$
Recall that the process $$B_t := \begin{cases} t W_{\frac{1}{t}}, & t>0, \\ 0, & t=0 \end{cases}$$ defines a Brownian motion; in particular
$$\lim_{t \to 0} t W_{\frac{1}{t}} = \lim_{t \to 0} B_t = 0.$$
This implies
$$\lim_{t \to \infty} \frac{W_t}{t}= \lim_{s \to 0} s W_{\frac{1}{s}} = 0.$$
Hence,
$$a W_t - \frac{1}{2} a^2 t = t \underbrace{\left( a \frac{W_t}{t} -\frac{a^2}{2} \right)}_{\to - \frac{a^2}{2}} \to - \infty$$
almost surely as $t \to \infty$.
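This limit is easy to see numerically as well. Below is a quick simulation sketch (not part of the proof); the choices $a = 1$, the horizon $T$ and the step size are arbitrary:

```python
import numpy as np

# Simulate one Brownian path on a grid and watch a*W_t - a^2*t/2 drift
# towards -infinity, while W_t / t tends to 0. Parameters are arbitrary.
rng = np.random.default_rng(0)

a, T, n = 1.0, 10_000.0, 1_000_000
dt = T / n
t = np.arange(1, n + 1) * dt
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=n))  # W_t on the grid

X = a * W - 0.5 * a**2 * t  # exponent of the exponential martingale

print(X[-1])        # far below zero at time T
print(W[-1] / T)    # W_T / T is close to 0
```

With $T = 10^4$ the drift term is $-T/2 = -5000$ while $W_T$ has standard deviation $\sqrt{T} = 100$, so the terminal value is negative by a wide margin on essentially every sample path.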
Itô's formula shows that
$$M_t = 1+ \int_0^t f(s) M_s \, dW_s \tag{1}$$
and this implies, in particular, that $(M_t)_{t \geq 0}$ is a local martingale. (Note that $(M_t)_{t \geq 0}$ has continuous sample paths, and therefore the stochastic integral on the right-hand side is well-defined.) On the other hand, it follows from the very definition that $M_t \geq 0$ for each $t \geq 0$. Since any non-negative local martingale is a supermartingale (see e.g. this question for details), we conclude that $(M_t)_{t \geq 0}$ is a supermartingale. Thus,
$$\mathbb{E}(M_t) \leq \mathbb{E}(M_0) = 1,$$
which implies
$$\mathbb{E} \exp \left( \int_0^t f(s) \, dW_s \right) \leq \exp \left( \frac{1}{2} \int_0^t f(s)^2 \, ds \right)$$
for each $t \geq 0$. Replacing $f$ by $2f$ we find in particular that
$$\mathbb{E} \left| \exp \left( \int_0^t f(s) \, dW_s \right) \right|^2 = \mathbb{E}\exp \left( \int_0^t 2f(s) \, dW_s \right) \leq \exp \left( 2 \int_0^t f(s)^2 \, ds \right) < \infty,$$
and so
$$\mathbb{E}(M_t^2) \leq \exp \left( \int_0^t f(s)^2 \, ds \right).$$
Using this estimate and the fact that $f$ is deterministic, we find
$$\mathbb{E} \left( \int_0^t f(s)^2 M_s^2 \, ds \right) \leq \int_0^t f(s)^2 \exp \left( \int_0^s f(r)^2 \, dr \right) ds < \infty,$$
so the stochastic integral $\int_0^t f(s) M_s \, dW_s$ is a true martingale, and now $(1)$ shows that $(M_t)_{t \geq 0}$ is a martingale.
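As a numerical sanity check (not part of the argument), one can verify $\mathbb{E}(M_t) = 1$ by Monte Carlo for a concrete choice of deterministic $f$; taking the constant function $f \equiv 1$, the martingale has the closed form $M_t = \exp(W_t - t/2)$:

```python
import numpy as np

# Monte Carlo check of E[M_t] = 1, assuming f(s) = 1 for all s, so that
# M_t = exp(W_t - t/2) in closed form. The sample mean should be near 1.
rng = np.random.default_rng(1)

t = 1.0
W_t = rng.normal(0.0, np.sqrt(t), size=200_000)
M_t = np.exp(W_t - 0.5 * t)

print(M_t.mean())  # close to 1
```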
Best Answer
Let $\{{\mathcal F}_t\}_{t \ge 0}$ denote the Brownian filtration generated by $W$, and suppose that the Brownian motion $B$ is independent of $W$. The definition of conditional expectation implies that $$M_t={\mathbb E}[f(W_t+B_{1-t})^2 | W_t]= {\mathbb E}[f(W_t+B_{1-t})^2 |{\mathcal F}_t] \,.$$ Therefore, for $s<t$ we have $$ {\mathbb E}[M_t |{\mathcal F}_s] ={\mathbb E}[f(W_t+B_{1-t})^2 |{\mathcal F}_s] ={\mathbb E}[f(W_s+(W_t-W_s)+B_{1-t})^2|{\mathcal F}_s] \,.$$ The Gaussian variable $W_t-W_s+B_{1-t}$ is independent of ${\mathcal F}_s$, and it has the same $N(0,1-s)$ distribution as $B_{1-s}$, which is also independent of ${\mathcal F}_s$. Thus $$ {\mathbb E}[M_t |{\mathcal F}_s] = {\mathbb E}[f(W_s+ B_{1-s})^2 |{\mathcal F}_s] =M_s\,.$$
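The key step above can be checked numerically for a concrete $f$ (a sketch, not part of the proof): taking $f(x) = x$ gives the closed form $M_t = W_t^2 + (1-t)$, and averaging over the two independent Gaussian increments should reproduce $M_s = W_s^2 + (1-s)$:

```python
import numpy as np

# Sanity check of the last display, assuming f(x) = x, so that
# M_t = E[(W_t + B_{1-t})^2 | F_t] = W_t^2 + (1 - t) in closed form.
# Fix a value of W_s, average f(W_s + (W_t - W_s) + B_{1-t})^2 over the
# independent increments, and compare with M_s = W_s^2 + (1 - s).
rng = np.random.default_rng(2)

s, t, w_s = 0.3, 0.7, 0.8
n = 500_000
incr = rng.normal(0.0, np.sqrt(t - s), size=n)  # W_t - W_s
B = rng.normal(0.0, np.sqrt(1.0 - t), size=n)   # B_{1-t}

mc = np.mean((w_s + incr + B) ** 2)  # E[M_t | F_s] by Monte Carlo
exact = w_s**2 + (1.0 - s)           # M_s for f(x) = x

print(mc, exact)  # the two numbers agree to Monte Carlo accuracy
```

Note that the sum of the two simulated increments indeed has variance $(t-s) + (1-t) = 1-s$, matching the distribution of $B_{1-s}$.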