Hint: Let $\varepsilon>0$, $\delta>0$. We have
$$\begin{align*} \mathbb{P} &\left( \left| \frac{1}{B_{\varepsilon}} \cdot \int_0^{\varepsilon} H_s \, dB_s - H_0 \right|>\delta \right) \\ &= \mathbb{P} \left( \left| \frac{1}{B_{\varepsilon}} \cdot \int_0^{\varepsilon} (H_s-H_0) \, dB_s \right|>\delta \right) \\
&\leq \mathbb{P} \left( \left| \frac{1}{B_{\varepsilon}} \cdot \int_0^{\varepsilon} (H_s-H_0) \, dB_s \right|>\delta, \left| \frac{\sqrt{\varepsilon}}{B_{\varepsilon}} \right| \leq K \right)+ \mathbb{P} \left( \left| \frac{\sqrt{\varepsilon}}{B_{\varepsilon}} \right| > K \right) \\
&\leq \mathbb{P} \left( \left| \frac{1}{\sqrt{\varepsilon}} \cdot \int_0^{\varepsilon} (H_s-H_0) \, dB_s \right|>\frac{\delta}{K} \right)+ \mathbb{P} \left( \frac{|B_{\varepsilon}|}{\sqrt{\varepsilon}} < \frac{1}{K} \right)\\
&=: I_1+I_2 \end{align*}$$
for any $K>0$. Since $\frac{B_{\varepsilon}}{\sqrt{\varepsilon}} \sim N(0,1)$, for any prescribed $\eta>0$ we can choose $K>0$ (independent of $\varepsilon$) such that
$$I_2 \leq \frac{\eta}{2}.$$
For the first term $I_1$, apply Markov's inequality and the Itô isometry to show that it converges to zero as $\varepsilon \to 0$, using the continuity of $H$ at $0$.
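The estimate for $I_1$ can be sketched as follows (assuming for simplicity that $H$ is bounded; in general one localises first). By Markov's inequality (applied to the square) and the Itô isometry,
$$I_1 \leq \frac{K^2}{\delta^2} \, \mathbb{E} \left[ \left( \frac{1}{\sqrt{\varepsilon}} \int_0^{\varepsilon} (H_s-H_0) \, dB_s \right)^2 \right] = \frac{K^2}{\delta^2 \, \varepsilon} \int_0^{\varepsilon} \mathbb{E}\left[ (H_s-H_0)^2 \right] ds \xrightarrow[\varepsilon \to 0]{} 0,$$
where the last step uses the continuity of $H$ at $0$ together with dominated convergence.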
Remark: A detailed proof can be found in Dean Isaacson, "Stochastic Integrals and Derivatives" (1969).
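The hint can also be illustrated numerically. Here is a minimal Monte Carlo sketch; the choice $H_s = \cos(B_s)$ (so $H_0 = 1$), the step count, $\varepsilon$, and the sample size are all illustrative, not part of the argument:

```python
import math
import random

random.seed(1)

def ratio_sample(eps, n_steps=64):
    r"""One sample of (1/B_eps) * \int_0^eps H_s dB_s with H_s = cos(B_s),
    approximated by a left-endpoint (Ito) Riemann sum."""
    dt = eps / n_steps
    B, integral = 0.0, 0.0
    for _ in range(n_steps):
        dB = random.gauss(0.0, math.sqrt(dt))
        integral += math.cos(B) * dB   # integrand evaluated at the left endpoint
        B += dB
    return integral / B

eps = 1e-4
samples = [ratio_sample(eps) for _ in range(10_000)]
frac_far = sum(abs(r - 1.0) > 0.1 for r in samples) / len(samples)
print(frac_far)  # only a tiny fraction of paths ends up far from H_0 = cos(0) = 1
```

For small $\varepsilon$ the empirical distribution of the ratio concentrates near $H_0$, matching the claimed convergence in probability.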
Well, in the form stated above, none of the statements are true, because you're only assuming $f$ to be progressive and not predictable, and you're not assuming that the integrator $X$ has continuous sample paths.
I'd say that point (4) is neither true nor false but undefined, as the stochastic integral is not necessarily well-defined for integrands which are only progressive and not predictable.
As regards the other three points, as a counterexample, take e.g. $N$ to be a standard Poisson process and let $f(t) = N_t$, $X_t = N_t - t$ and let $(\mathcal{F}_t)$ be the filtration induced by $N$. Then $f$ is locally bounded (by e.g. the sequence of stopping times corresponding to the jump times of $N$), bounded on compacts (because it has cadlag sample paths) and is progressive (because it is cadlag and adapted). Furthermore, the integral is well-defined since $X$ has sample paths of finite variation, so the integral can be defined as a pathwise Lebesgue integral. It holds that
$$
\begin{align*}
Y_t = \int_0^t f(s) \, dX_s &= \int_0^t (N_{s-} + \Delta N_s) \, dX_s \\
&= \int_0^t N_{s-} \, dX_s + \sum_{0<s\le t} (\Delta N_s)^2 \\
&= \int_0^t N_{s-} \, dX_s + N_t.
\end{align*}
$$
This serves as a counterexample for points (1)-(3): even though $X$ is a locally $L^2$-bounded martingale, $Y$ is not even a local martingale. (In the computation above, $\int_0^t \Delta N_s \, dX_s = \sum_{0<s\leq t} (\Delta N_s)^2$ because $\Delta N_s \neq 0$ only at countably many times, so the $ds$-part of $dX_s$ does not contribute; and $\sum_{0<s\leq t} (\Delta N_s)^2 = N_t$ since all jumps of $N$ have size $1$.) The problem is that $f$ is not predictable.
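The failure of the martingale property can be checked numerically. A minimal sketch (the rate, horizon, and sample size are illustrative): both integrals are computed pathwise, and since they start at $0$, a true martingale would have mean $0$ at $t=1$.

```python
import random

random.seed(42)

def sample(rate=1.0, T=1.0):
    r"""One path: returns (\int_0^T N_s dX_s, \int_0^T N_{s-} dX_s) for
    X_t = N_t - t, computed pathwise as Lebesgue-Stieltjes integrals."""
    # Jump times of a rate-1 Poisson process on [0, T].
    jumps = []
    t = random.expovariate(rate)
    while t <= T:
        jumps.append(t)
        t += random.expovariate(rate)
    occupation = sum(T - s for s in jumps)      # \int_0^T N_s ds
    n = len(jumps)
    y_prog = sum(range(1, n + 1)) - occupation  # integrand N_s   (progressive)
    y_pred = sum(range(0, n)) - occupation      # integrand N_{s-} (predictable)
    return y_prog, y_pred

paths = 200_000
acc = [0.0, 0.0]
for _ in range(paths):
    a, b = sample()
    acc[0] += a
    acc[1] += b
mean_prog, mean_pred = acc[0] / paths, acc[1] / paths
print(mean_prog, mean_pred)  # ≈ 1 and ≈ 0
```

The progressive integrand gives $\mathbb{E}(Y_1) \approx 1 = \mathbb{E}(N_1)$, while the predictable version $\int_0^t N_{s-} \, dX_s$ has mean $\approx 0$, consistent with it being a martingale.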
If $f$ were assumed predictable, the answers would be:
(1): True. Intuitively, this is because the integral of a locally bounded predictable process with respect to a local martingale is a local martingale, and if $f$ is sufficiently rough, the integral process will not have the integrability required of a true martingale, so a true martingale cannot be expected.
(2): True. Intuitively, this is because the integral process is a local martingale, and by localising so that $f$ is bounded and $X$ is $L^2$-bounded, one obtains $L^2$ boundedness of the integral process.
(3): True. This is almost a defining property of the stochastic integral (depending on the method of construction), but certainly true in any case.
(4): True, also almost by construction, depending on the method of construction.
Best Answer
No, it's not exactly obvious. Let us prove the following theorem: if $K$ is a progressively measurable process with $\mathbb{E} \left( \int_0^t K_r^2 \, dr \right) < \infty$ for all $t \geq 0$, then $M_t := \int_0^t K_r \, dB_r$ satisfies $\langle M,N \rangle_t = \int_0^t K_r \, d\langle B,N \rangle_r$ for every $L^2$-bounded martingale $N$.
Proof: Throughout, $(N_t)_{t \geq 0}$ is an $L^2$-bounded martingale. First we consider the particular case that $K$ is a simple process of the form $$K(s) = \sum_{j=0}^{m-1} \varphi_j 1_{(s_j,s_{j+1}]}(s) \tag{1}$$ where $0<s_0 < \ldots < s_m$ and $\varphi_j \in L^2(\mathcal{F}_{s_j})$; we write $m$ for the number of subintervals to avoid a clash with the martingale $N$. We have to show that $M_t = \int_0^t K_r \, dB_r$ satisfies $$\mathbb{E}((M_t-M_s) (N_t-N_s) \mid \mathcal{F}_s) = \mathbb{E} \left( \int_s^t K_r \, d\langle B,N \rangle_r \mid \mathcal{F}_s \right) \tag{2}$$ for any fixed $s \leq t$. Without loss of generality, we may assume that $s_m = t$ and that there exists $k \in \{0,\ldots,m\}$ such that $s_k = s$ (otherwise we refine the partition accordingly). Writing $$N_t-N_s = \sum_{i=k}^{m-1} (N_{s_{i+1}}-N_{s_i}) \quad \text{and} \quad M_t-M_s = \sum_{j=k}^{m-1} (M_{s_{j+1}}-M_{s_j})$$ we find
$$\mathbb{E}((M_t-M_s) (N_t-N_s) \mid \mathcal{F}_s) = \sum_{j=k}^{m-1} \sum_{i=k}^{m-1} \mathbb{E}((M_{s_{j+1}}-M_{s_j}) (N_{s_{i+1}}-N_{s_i}) \mid \mathcal{F}_s).$$
Since both $(M_t)_{t \geq 0}$ and $(N_t)_{t \geq 0}$ are martingales, the tower property shows that the terms on the right-hand side vanish for $i \neq j$: e.g. for $i < j$, conditioning on $\mathcal{F}_{s_j}$ gives $$\mathbb{E}((M_{s_{j+1}}-M_{s_j})(N_{s_{i+1}}-N_{s_i}) \mid \mathcal{F}_{s_j}) = (N_{s_{i+1}}-N_{s_i}) \, \mathbb{E}(M_{s_{j+1}}-M_{s_j} \mid \mathcal{F}_{s_j}) = 0.$$ Hence $$\mathbb{E}((M_t-M_s) (N_t-N_s) \mid \mathcal{F}_s) = \sum_{j=k}^{m-1} \mathbb{E}(\varphi_j (B_{s_{j+1}}-B_{s_j}) (N_{s_{j+1}}-N_{s_j}) \mid \mathcal{F}_s).$$
Using once more the tower property, we get
$$\begin{align*} \mathbb{E}((M_t-M_s) (N_t-N_s) \mid \mathcal{F}_s) &= \sum_{j=k}^{m-1} \mathbb{E} \bigg[ \varphi_j \, \mathbb{E}((B_{s_{j+1}}-B_{s_j}) (N_{s_{j+1}}-N_{s_j}) \mid \mathcal{F}_{s_j}) \mid \mathcal{F}_s \bigg] \\ &= \sum_{j=k}^{m-1} \mathbb{E} \bigg[ \varphi_j \, \mathbb{E}(\langle B,N \rangle_{s_{j+1}}-\langle B,N \rangle_{s_j} \mid \mathcal{F}_{s_j}) \mid \mathcal{F}_s \bigg]\\ &= \mathbb{E} \left( \sum_{j=k}^{m-1} \varphi_j (\langle B,N \rangle_{s_{j+1}}-\langle B,N \rangle_{s_j}) \mid \mathcal{F}_s \right) \\ &= \mathbb{E} \left( \int_s^t K_r \, d\langle B,N \rangle_r \mid \mathcal{F}_s \right). \end{align*}$$
This proves the assertion for the simple process $K$. For the general case we choose a sequence of simple processes $(K_n)_{n \in \mathbb{N}}$ of the form $(1)$ such that $$\mathbb{E} \left( \int_0^t (K_n(s)-K(s))^2 \, ds \right) \to 0.$$ By the construction of the stochastic integral, this implies, in particular,
$$ \int_0^t K_n(r) \, dB_r \to \int_0^t K(r) \, dB_r \quad \text{in $L^2(\mathbb{P})$} \tag{3}$$
On the other hand, it follows from the Kunita-Watanabe and Cauchy-Schwarz inequalities that
$$\int_0^t K_n(r) \, d\langle B,N \rangle_r \to \int_0^t K(r) \, d\langle B,N \rangle_r \quad \text{in $L^1(\mathbb{P})$} \tag{4}$$
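In detail, a sketch of the estimate: the Kunita-Watanabe inequality followed by the Cauchy-Schwarz inequality gives
$$\mathbb{E} \left| \int_0^t (K_n(r)-K(r)) \, d\langle B,N \rangle_r \right| \leq \mathbb{E} \left[ \left( \int_0^t (K_n(r)-K(r))^2 \, dr \right)^{1/2} \langle N \rangle_t^{1/2} \right] \leq \left( \mathbb{E} \int_0^t (K_n(r)-K(r))^2 \, dr \right)^{1/2} \left( \mathbb{E} \langle N \rangle_t \right)^{1/2} \xrightarrow[n \to \infty]{} 0,$$
using $\langle B \rangle_r = r$ and the fact that $\mathbb{E} \langle N \rangle_t < \infty$ because $N$ is $L^2$-bounded.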
Thus, $M_t = \int_0^t K(r) \, dB_r$ satisfies
$$\begin{align*} \mathbb{E} \left( (M_t-M_s) (N_t-N_s) \mid \mathcal{F}_s \right) &\stackrel{(3)}{=} \lim_{n \to \infty} \mathbb{E} \left[ \left( \int_0^t K_n(r) \, dB_r - \int_0^s K_n(r) \, dB_r \right) (N_t-N_s) \mid \mathcal{F}_s \right] \\ &\stackrel{(2)}{=} \lim_{n \to \infty} \mathbb{E} \left( \int_s^t K_n(r) \, d\langle B,N \rangle_r \mid \mathcal{F}_s \right) \\ &\stackrel{(4)}{=} \mathbb{E} \left( \int_s^t K(r) \, d\langle B,N \rangle_r \mid \mathcal{F}_s \right). \end{align*}$$
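As an illustrative Monte Carlo sanity check of $(2)$ (a sketch; the integrand, partition, and sample size are arbitrary choices, not part of the proof): take $s=0$, $t=1$, $N = B$ (so $\langle B,N \rangle_r = r$), and the simple integrand $\varphi_j = B_{s_j}^2$ on a uniform partition, and compare both sides of the identity.

```python
import math
import random

random.seed(0)

# Check E[M_1 * B_1] = E[\int_0^1 K_r d<B,B>_r] for the simple process
# K(r) = B_{s_j}^2 on (s_j, s_{j+1}], uniform partition of [0, 1].
m = 20                 # number of subintervals
dt = 1.0 / m
n_paths = 200_000

lhs_acc = 0.0          # accumulates M_1 * B_1
rhs_acc = 0.0          # accumulates \sum_j B_{s_j}^2 * dt

for _ in range(n_paths):
    B = 0.0            # Brownian path at the current grid point
    M = 0.0            # stochastic integral of the simple process
    rhs = 0.0
    for _ in range(m):
        phi = B * B                        # phi_j = B_{s_j}^2 is F_{s_j}-measurable
        dB = random.gauss(0.0, math.sqrt(dt))
        M += phi * dB                      # M_1 = sum_j phi_j (B_{s_{j+1}} - B_{s_j})
        rhs += phi * dt
        B += dB
    lhs_acc += M * B
    rhs_acc += rhs

lhs = lhs_acc / n_paths                    # estimates E[M_1 * B_1]
rhs = rhs_acc / n_paths                    # estimates E[\int_0^1 K_r dr]
print(lhs, rhs)  # both ≈ sum_j s_j * dt = 0.475
```

Both estimates should agree (up to Monte Carlo error) with the exact value $\sum_{j=0}^{m-1} s_j \, \Delta t = 0.475$, as predicted by the simple-process case of the theorem.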