The stochastic differential equation satisfied by $y_t=\mathrm{e}^{\theta t}x_t$ shows that
$$
\mathrm{e}^{\theta t}x_t = x_0 + \int_{0}^{t} \theta\, \mathrm{e}^{\theta s}\mu(s)\,\mathrm{d}s +\sigma \int_{0}^{t} \mathrm{e}^{\theta s}\, \mathrm{d}W_s,
$$
hence
$$
E(x_t) = x_0 \mathrm{e}^{-\theta t} + \int_{0}^t\theta\, \mathrm{e}^{\theta (s-t)}\mu(s)\,\mathrm{d}s.
$$
A more direct way to compute the expectation uses the fact that the function $u$ defined by $u(t)=E(x_t)$ is the unique solution of the ordinary differential equation
$$
u'(t)=\theta\cdot(\mu(t)-u(t)),\quad u(0)=x_0.
$$
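As a quick sanity check of the mean formula (a sketch with hypothetical parameters, taking $\mu(t)\equiv\mu$ constant so that $E(x_t)=x_0\mathrm{e}^{-\theta t}+\mu(1-\mathrm{e}^{-\theta t})$), one can compare an Euler–Maruyama simulation of the SDE $\mathrm{d}x_t=\theta(\mu-x_t)\,\mathrm{d}t+\sigma\,\mathrm{d}W_t$ against the closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for dx_t = theta*(mu - x_t) dt + sigma dW_t
theta, mu, sigma, x0 = 2.0, 1.5, 0.3, 0.0
T, n_steps, n_paths = 1.0, 1000, 20000
dt = T / n_steps

# Euler-Maruyama simulation of n_paths independent trajectories
x = np.full(n_paths, x0)
for _ in range(n_steps):
    x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

# Closed-form mean for constant mu: E(x_T) = x0*e^{-theta T} + mu*(1 - e^{-theta T})
mean_exact = x0 * np.exp(-theta * T) + mu * (1 - np.exp(-theta * T))
print(x.mean(), mean_exact)  # should agree up to Monte Carlo / discretization error
```

The empirical mean matches the solution of $u'(t)=\theta(\mu-u(t))$, $u(0)=x_0$, up to Monte Carlo error.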
@Nate: I think your argument is fine. I didn't know Fernique's theorem; my argument for that part was the following.
Fix $T>0$ (as you did, eventually with $T+1$). Applying Jensen's inequality, we get:
$$\exp\left(\frac{1}{2}\int_S^{S+\epsilon} X_t^2\,dt\right)=\exp\left(\frac{1}{\epsilon}\int_S^{S+\epsilon} \frac{\epsilon}{2}X_t^2\,dt\right)\leq \frac{1}{\epsilon}\int_S^{S+\epsilon} \exp\left(\frac{\epsilon}{2}X_t^2\right)\,dt \qquad \text{a.s.}$$
Now, taking the expectation and using Tonelli's theorem, we have to study
$$\frac{1}{\epsilon}\int_S^{S+\epsilon} E\left[\exp\left(\frac{\epsilon}{2}X_t^2\right)\right]\,dt.$$
$X_t \sim N(\mu_t=X_0e^{-t}\,,\,\,\sigma_t^2=\frac{1-e^{-2t}}{2})$ so
$$E\left[\exp\left(\frac{\epsilon}{2}X_t^2\right)\right]=E\left[\exp\left(\frac{\epsilon}{2}(\mu_t+\sigma_tZ)^2\right)\right]=$$
$$=e^{\frac{\epsilon}{2}\mu_t^2}\int_{\mathbb{R}} \frac{1}{\sqrt{2 \pi}}\, \exp\left(-\frac{x^2}{2}(1-\sigma_t^2 \epsilon)+\epsilon \mu_t \sigma_t x\right)dx. $$
Setting $\lambda_t=1-\epsilon \sigma_t^2=\frac{1}{2}[2-\epsilon(1-e^{-2t})]$: if $\lambda_t>0$ (for example when $\epsilon < 1$), the last integral converges and the whole expectation equals $$\exp\left(\frac {\epsilon}{2}\mu_t^2\right)\,\exp\left(\frac{\epsilon^2}{2 \lambda_t}\mu_t^2 \sigma_t^2\right)\frac{1}{\sqrt{\lambda_t}}.$$
Finally, all the functions involved are continuous, and since $\epsilon < 1$, $\lambda_t$ stays bounded away from $0$, so the expression above is integrable in $t$ over $[S,S+\epsilon]$.
My first idea was to use $T$ instead of $\epsilon$, but the moment generating function of the chi-squared distribution is not defined away from $0$, only in a neighbourhood of it.
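The closed form above can be checked numerically at one fixed $t$. In the sketch below, the values of $\epsilon$, $\mu_t$, and $\sigma_t^2$ are hypothetical stand-ins (chosen with $\sigma_t^2<\tfrac12$ and $\epsilon<1$, so $\lambda_t>0$); the expectation is computed by brute-force quadrature on a fine grid:

```python
import numpy as np

# Hypothetical stand-ins for eps, mu_t and sigma_t^2 at a fixed t (eps < 1, var < 1/2)
eps, mu, var = 0.5, 0.8, 0.4
sigma = np.sqrt(var)
lam = 1 - eps * var               # lambda_t = 1 - eps * sigma_t^2 > 0

# Numerical value of E[exp(eps/2 * X^2)] for X ~ N(mu, var) on a fine grid
x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]
density = np.exp(-(x - mu) ** 2 / (2 * var)) / (sigma * np.sqrt(2 * np.pi))
numeric = np.sum(np.exp(eps / 2 * x**2) * density) * dx

# Closed form derived above: exp(eps*mu^2/2) * exp(eps^2*mu^2*var/(2*lam)) / sqrt(lam)
closed = np.exp(eps * mu**2 / 2) * np.exp(eps**2 * mu**2 * var / (2 * lam)) / np.sqrt(lam)
print(numeric, closed)  # the two values should agree
```

The grid sum and the closed form agree to high precision, confirming the Gaussian integral computation.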
Best Answer
You are right; the Ornstein-Uhlenbeck process is a Markov process but not a martingale. It is simply not true that every Markov process is a martingale (or vice versa).
An easier counterexample is the following: let $(B_t)_{t \geq 0}$ be a Brownian motion and set $$X_t := B_t +a \cdot t, \qquad t \geq 0$$ for some $a \in \mathbb{R}$, $a \neq 0$. Then $(X_t)_{t \geq 0}$ is not a martingale since, as $(B_t)_{t \geq 0}$ is a martingale,
$$\mathbb{E}(X_t \mid \mathcal{F}_s) = B_s +a \cdot t \neq X_s.$$
On the other hand,
$$\begin{align*} \mathbb{E}(f(X_t) \mid \mathcal{F}_s) &= \mathbb{E}(f(B_t-B_s+B_s+at) \mid \mathcal{F}_s) \\ &= \mathbb{E}(f(B_{t-s}+y+at)) \big|_{y=B_s} \\ &= \mathbb{E}(f(B_{t-s}+y+a(t-s))) \big|_{y=X_s}, \end{align*}$$
and this shows that $(X_t)_{t \geq 0}$ is a Markov process.
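The failure of the martingale property can also be seen numerically: for a martingale one would have $\mathbb{E}(X_t - X_s) = 0$, whereas here $\mathbb{E}(X_t - X_s) = a(t-s)$. A minimal Monte Carlo sketch (the drift $a$ and times $s < t$ below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Drifted Brownian motion X_t = B_t + a*t with a hypothetical drift a != 0
a, s, t, n_paths = 0.7, 0.5, 1.0, 200000

# Sample B_s and B_t using the independent-increments property
B_s = np.sqrt(s) * rng.standard_normal(n_paths)
B_t = B_s + np.sqrt(t - s) * rng.standard_normal(n_paths)
X_s, X_t = B_s + a * s, B_t + a * t

# For a martingale E(X_t - X_s) = 0; here it equals a*(t - s) != 0
drift_gap = (X_t - X_s).mean()
print(drift_gap, a * (t - s))
```

The empirical mean of $X_t - X_s$ is close to $a(t-s) = 0.35$, visibly away from $0$.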