I think I have the result you are looking for. It is called "Föllmer's lemma" (Theorem 2.44, Chapter II, Section 5 in the book "Semimartingale Theory and Stochastic Calculus" by Sheng-Wu He, Jia-Gang Wang and Jia-An Yan), and here is how it goes:
(of course you are given a stochastic basis with no particular assumptions regarding the regularity of the filtration)
Let $(X_t)_{t\in \mathbb{R}^+}$ be a supermartingale (resp. martingale) and let $D$ be a dense subset of $\mathbb{R}^+$.
Then there exists a process $(\bar{X}_t)_{t\in \mathbb{R}^+}$ such that the following four points hold:
1. For almost all $\omega$ and every $t\in \mathbb{R}^+$, $\bar{X}_t(\omega)=\lim_{(s\in D,\,s \searrow \searrow t)}X_s(\omega)$, and $(\bar{X}_t)_{t\in \mathbb{R}^+}$ is right continuous.
2. For almost all $\omega$ and every $t>0$, the left limit
$\bar{X}_{t-}(\omega)=\lim_{(s\in \mathbb{R}^+,\,s \nearrow \nearrow t)}\bar{X}_s(\omega)$ exists and is finite. Moreover,
$\bar{X}_{t-}(\omega)=\lim_{(s\in D,\,s \nearrow \nearrow t)}X_s(\omega).$
3. For all $t\in\mathbb{R}^+$:
$X_t\geq \mathbb{E}[\bar{X}_t|\mathcal{F}_t]$ (resp. equality in the martingale case).
4. $(\bar{X}_t)_{t\in \mathbb{R}^+}$ is an $\mathcal{F}_+$-supermartingale (resp. martingale), where $\mathcal{F}_+$ is the right-continuous filtration generated by $\mathcal{F}$.
As you can notice, there is no completion of the filtration at time $0$ by the negligible sets.
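As a toy illustration of point 1 (my own sketch, not from the book), one can regularize a path by taking right limits along the dyadic rationals $D$: the path $X_s=\mathbf 1\{s>a\}$ misses its right limit at $a$, while the right-limit version $\bar X_s=\mathbf 1\{s\geq a\}$ is cadlag.

```python
from fractions import Fraction

# Toy illustration of point 1 (a sketch, not from the book): regularize a path
# by taking right limits along the dyadic rationals D. The path
# X(s) = 1{s > a} is not right continuous at a, but its right-limit version
# X_bar(s) = 1{s >= a} is cadlag.

a = Fraction(1, 3)

def X(s):
    return 1 if s > a else 0

def X_bar(t, depth=40):
    # smallest dyadic rational s = m / 2^depth with s > t; as depth grows,
    # s decreases strictly to t, approximating lim_{s in D, s >> t} X(s)
    k = 2 ** depth
    m = int(t * k) + 1
    return X(Fraction(m, k))

assert X(a) == 0        # original path misses its right limit at a
assert X_bar(a) == 1    # regularized path takes the right-limit value at a
assert X_bar(Fraction(1, 4)) == 0 and X_bar(Fraction(1, 2)) == 1
```

Exact rationals are used so that the strict approach $s \searrow\searrow t$ is not blurred by floating-point rounding.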
Well, in the form stated above, none of the statements is true, because you are only assuming $f$ to be progressive, not predictable, and you are not assuming that the integrator $X$ has continuous sample paths.
I'd say that point (4) is neither true nor false but undefined, as the stochastic integral is not necessarily well-defined for integrands which are only progressive and not predictable.
As regards the other three points, as a counterexample, take e.g. $N$ to be a standard Poisson process and let $f(t) = N_t$, $X_t = N_t - t$ and let $(\mathcal{F}_t)$ be the filtration induced by $N$. Then $f$ is locally bounded (by e.g. the sequence of stopping times corresponding to the jump times of $N$), bounded on compacts (because it has cadlag sample paths) and is progressive (because it is cadlag and adapted). Furthermore, the integral is well-defined since $X$ has sample paths of finite variation, so the integral can be defined as a pathwise Lebesgue integral. It holds that
$$
\begin{aligned}
Y_t = \int_0^t f(s)\,dX_s &= \int_0^t (N_{s-} + \Delta N_s)\,dX_s \\
&= \int_0^t N_{s-}\,dX_s + \sum_{0<s\le t}(\Delta N_s)^2 \\
&= \int_0^t N_{s-}\,dX_s + N_t.
\end{aligned}
$$
This functions as a counterexample for points (1)-(3) because even though $X$ is a locally $L^2$-bounded martingale, $Y$ is not even a local martingale: $\int_0^t N_{s-}\,dX_s$ is a local martingale, while $N$ is increasing. The problem is that $f$ is not predictable. See also this question for more on this.
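The failure can also be seen numerically. In a Monte Carlo sketch of my own (not part of the argument above), the pathwise integral is $Y_1=\sum_{T_i\le 1}N_{T_i}-\int_0^1 N_s\,ds$, where $T_i$ are the jump times and $N_{T_i}=i$ is the post-jump value; its sample mean comes out near $\mathsf E N_1 = 1$ rather than $Y_0 = 0$, which a martingale would require.

```python
import random

# Monte Carlo sketch (an illustration, not part of the argument above):
# with f(s) = N_s and X_t = N_t - t, the pathwise integral is
#   Y_1 = int_0^1 N_s d(N_s - s) = sum_{T_i <= 1} N_{T_i} - int_0^1 N_s ds,
# where N_{T_i} = i is the post-jump value. Its mean is E[N_1] = 1, not
# Y_0 = 0, so Y cannot be a martingale.

def sample_Y1(rng):
    t, n = 0.0, 0          # current time and current value of N
    jump_sum = 0.0         # sum of N_{T_i} over jump times T_i <= 1
    area = 0.0             # int_0^1 N_s ds, accumulated piecewise
    while True:
        w = rng.expovariate(1.0)      # exponential interarrival time
        if t + w > 1.0:
            area += n * (1.0 - t)
            return jump_sum - area
        area += n * w
        t += w
        n += 1                         # N jumps to n at time T_i = t
        jump_sum += n                  # cadlag value N_{T_i} = i

rng = random.Random(0)
est = sum(sample_Y1(rng) for _ in range(200_000)) / 200_000
print(round(est, 2))  # close to 1
```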
If $f$ were assumed predictable, the answers would be:
(1): True. Intuitively, this is because the integral of a predictable process with respect to a local martingale is a local martingale, and if $f$ is sufficiently rough, the integral process need not be integrable, so a true martingale cannot be expected.
(2): True. Intuitively, this is because the integral process is a local martingale, and by localising so that $f$ is bounded and $X$ is $L^2$-bounded, one obtains $L^2$ boundedness of the integral process.
(3): True. This is almost a defining property of the stochastic integral (depending on the method of construction), but certainly true in any case.
(4): True, also almost by construction, depending on the method of construction.
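Continuing the Monte Carlo sketch from the counterexample (again my own illustration): replacing $f(s)=N_s$ by the predictable left limit $N_{s-}$ restores the martingale property, at least in mean. Pathwise, $\int_0^1 N_{s-}\,dX_s=\sum_{T_i\le 1}(i-1)-\int_0^1 N_s\,ds$, whose sample mean is near $0$.

```python
import random

# Illustrative Monte Carlo sketch (in the setup of the counterexample above):
# with the predictable integrand N_{s-} instead of N_s, the pathwise integral
#   M_1 = int_0^1 N_{s-} d(N_s - s) = sum_{T_i <= 1} (i - 1) - int_0^1 N_s ds
# has mean 0, consistent with the integral being a (local) martingale.

def sample_M1(rng):
    t, n = 0.0, 0          # current time and current value of N
    jump_sum = 0.0         # sum of the left limits N_{T_i-} over T_i <= 1
    area = 0.0             # int_0^1 N_s ds, accumulated piecewise
    while True:
        w = rng.expovariate(1.0)      # exponential interarrival time
        if t + w > 1.0:
            area += n * (1.0 - t)
            return jump_sum - area
        area += n * w
        t += w
        jump_sum += n                  # left limit N_{T_i-} = n before the jump
        n += 1

rng = random.Random(0)
est = sum(sample_M1(rng) for _ in range(200_000)) / 200_000
print(round(est, 3))  # close to 0
```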
Best Answer
I decided to rewrite the answer almost completely, since @Byron's hint allows me to make it clearer (I hope). First of all, thanks for the question - it really illuminates the notions of martingale and conditional expectation.
Second, some motivation: inspired by the comment of @TheBridge, I first tried to understand what $X_t(\omega)$ means. Well, $$ X:(t,\omega)\in[0,1]\times [0,1]\mapsto X_{t}(\omega)\in\mathbb R, $$ so for each moment $t\in[0,1]$ the random variable $X_t$ is a function of $\omega$ on $[0,1]$ which is $\mathscr F_t$-measurable. Somehow this function should depend only on the values of $f(\omega)$ for $\omega\in [0,t]$ and be the best approximation of the latter function. In particular, $\mathsf EX_t = \mathsf Ef<\infty$ for any $t$; the finiteness of the expectation holds because $f\in L^1([0,1])$.
Let me put the answer in two steps: first, using the definition of the conditional expectation, we guess the shape of $X_t$; then we prove that it is indeed what we need. Before that, let's prove one important fact about the filtration. Let $\mathsf P$ denote the Lebesgue measure and let $$ \mathscr N = \{F\in \mathscr B([0,1]):\mathsf P(F) = 0\} $$ be the class of all null sets, so that $\mathscr F_t = \sigma(\mathscr B([0,t])\cup \mathscr N)$.
Claim: for any $F\in \mathscr F_t$ it holds that $$ \mathsf P(F\cap (t,1]) \in \{0,1-t\}.\tag{1} $$ Proof: first, let $\mathscr F^\prime_t$ denote the collection of all elements of $\mathscr F_t$ satisfying $(1)$. Then clearly $$ \mathscr B([0,t])\cup \mathscr N\subset \mathscr F^\prime_t, $$ and for such sets the probability in $(1)$ is always zero. To finish the proof we only need to show that if $F,(F_n)_{n\geq 0}\in \mathscr F^\prime_t$ then $F^c,\bigcup_n F_n\in \mathscr F^\prime_t$, which is an easy task. As a result, $\mathscr F^\prime_t$ is a $\sigma$-algebra containing $\mathscr B([0,t])\cup \mathscr N$, hence $\mathscr F^\prime_t = \mathscr F_t$ and all elements of the latter satisfy $(1)$.
By the definition of the conditional expectation, $X_t$ has to be $\mathscr F_t$-measurable, and for any $F\in\mathscr F_t$ we should have $$ \int\limits_F (f-X_t)d\mathsf P = 0.\tag{2} $$ Since $\mathscr B([0,t])\subset\mathscr F_t$, this gives us a hint that $X_t(\omega) = f(\omega)$ $\mathsf P$-a.e. on $[0,t]$. As for $\omega\in (t,1]$: from our claim it follows that $X_t(\omega)$ is constant $\mathsf P$-a.e. on $(t,1]$. Indeed, if it were not, there would exist $r\in \mathbb R$ such that $$ \mathsf P(X_t^{-1}((-\infty,r))\cap (t,1])>0\text{ and }\mathsf P(X_t^{-1}([r,\infty))\cap (t,1])>0, $$ which contradicts the claim, since both $X_t^{-1}((-\infty,r))$ and $X_t^{-1}([r,\infty))$ are in $\mathscr F_t$ and their intersections with $(t,1]$ partition it. We can find this constant from the condition $\mathsf EX_t = \mathsf Ef$, so the result is: $$ X_t(\omega) = f(\omega)1\{\omega\leq t\}+\frac{1}{1-t}\left(\int\limits_t^1f(\omega)d\omega\right)\cdot 1\{\omega >t\} $$ for all $0\leq t<1$.
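As a quick numerical sanity check of this closed form (my own sketch; the test function $f(\omega)=\omega^2$ and the time $t=1/2$ are hypothetical choices, not from the answer), one can verify $\mathsf EX_t=\mathsf Ef$ and the defining property $(2)$ on a sample set from $\mathscr F_t$:

```python
# Numerical sanity check (a sketch): take the hypothetical choices
# f(w) = w^2 and t = 1/2, build X_t from the closed form above, and verify
# E[X_t] = E[f] as well as the defining property (2) on the sample set
# F = [0, 0.3] U (t, 1], which belongs to F_t.

t = 0.5
f = lambda w: w * w

def integrate(g, a, b, n=100_000):
    # midpoint Riemann sum; accurate enough for these integrands
    h = (b - a) / n
    return h * sum(g(a + (i + 0.5) * h) for i in range(n))

c = integrate(f, t, 1.0) / (1.0 - t)       # constant value of X_t on (t, 1]
X_t = lambda w: f(w) if w <= t else c

# total expectation is preserved: E[X_t] = E[f]
assert abs(integrate(X_t, 0.0, 1.0) - integrate(f, 0.0, 1.0)) < 1e-6

# defining property (2) on F = [0, 0.3] U (t, 1]
lhs = integrate(lambda w: f(w) - X_t(w), 0.0, 0.3) \
    + integrate(lambda w: f(w) - X_t(w), t, 1.0)
assert abs(lhs) < 1e-6

print(round(c, 4))  # (1/(1-t)) * int_t^1 w^2 dw = 7/12 ≈ 0.5833
```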
To finish the answer, let us verify that $X_t$ is indeed $\mathscr F_t$-measurable and that $(2)$ holds.
First, the $\mathscr F_t$-measurability of $\{\omega:\omega\leq t\} = [0,t]$ and of its complement $(t,1]$ is clear. Let us consider $f(\omega)1\{\omega\leq t\}$. For $A\in \mathscr B(\mathbb R)$ we have: $$ \{\omega:f(\omega)1\{\omega\leq t\}\in A\} = \begin{cases} f^{-1}(A)\cap[0,t], &\text{ if }0\notin A, \\ \left(f^{-1}(A)\cap[0,t]\right)\cup(t,1], &\text{ if }0\in A, \end{cases} $$ and so $X_t$ is $\mathscr F_t$-measurable as a linear combination of measurable functions.
To check $(2)$, we pick any $F\in \mathscr F_t$ and compute $$ \int\limits_F (f(\omega)-X_t(\omega))d\omega = \int\limits_{F\cap (t,1]}f(\omega)d\omega - \frac{1}{1-t}\mathsf P(F\cap (t,1])\cdot\int\limits_t^1 f(\omega)d\omega = 0, $$ as follows from the claim: $\mathsf P(F\cap(t,1])$ is either $0$, in which case both terms vanish, or $1-t$, in which case $F\cap(t,1]$ coincides with $(t,1]$ up to a null set and the two terms cancel.