No, it is not true that a process $W$ satisfying properties (1), (3) and (4) has to be a Brownian motion. We can construct a counter-example as follows.
This construction is rather contrived, and I don't know of any simple example.
Start with a standard Brownian motion $W$. The idea is to apply a small bump to its distribution while retaining the required properties. I will do this by first reducing it to the discrete-time case. So, choose a finite sequence of times $0 = t_0 < t_1 < \cdots < t_n$. Then define a piecewise linear process $X$ by $X_{t_k} = W_{t_k}$ ($k = 0,1,\dots,n$), with $X$ linearly interpolated across each of the intervals $[t_{k-1},t_k]$ and constant over $[t_n,\infty)$.
Then, $Y = W - X$ is a continuous process independent of $X$. In fact, $Y$ is just a sequence of Brownian bridges across the intervals $[t_{k-1},t_k]$, and is a standard Brownian motion on $[t_n,\infty)$. Also, by linear interpolation, for any time $t \ge 0$, $X_t$ is a linear combination of at most two of the random variables $X_{t_1},\dots,X_{t_n}$. The increments of $W$,
$$
W_t-W_s = X_t-X_s + Y_t-Y_s,
$$
are then a linear combination of at most four of the random variables $X_{t_1},\dots,X_{t_n}$ plus an independent term. So, choosing $n \ge 5$, if it is possible to replace $(X_{t_1},\dots,X_{t_n})$ by any other $\mathbb{R}^n$-valued random variable without changing the joint distribution of any four elements, then the distributions of the increments $W_t - W_s$ will be left unchanged. So, properties (1), (3), (4) will still be satisfied, but the new process for $W$ will not be a standard Brownian motion. It is possible to change the distribution in this way:
Let $X = (X_1,X_2,\dots,X_n)$ be an $\mathbb{R}^n$-valued random variable with a continuous and strictly positive probability density $p_X\colon\mathbb{R}^n\to\mathbb{R}$. Then, there exists a random variable $Y = (Y_1,Y_2,\dots,Y_n)$ with a different distribution than $X$, but for which the projection onto any $n-1$ elements has the same distribution as for $X$.
That is, for any $k_1,k_2,\dots,k_{n-1}$ in $\{1,\dots,n\}$, $(Y_{k_1},Y_{k_2},\dots,Y_{k_{n-1}})$ has the same distribution as $(X_{k_1},X_{k_2},\dots,X_{k_{n-1}})$.
We can construct the probability density pY of Y by applying a bump to the probability distribution of X,
$$
p_Y(x)=p_X(x)+\epsilon f(x_1)f(x_2)\cdots f(x_n).
$$
Here, $\epsilon$ is a fixed real number and $f\colon\mathbb{R}\to\mathbb{R}$ is a continuous function of compact support and zero integral, $\int_{-\infty}^\infty f(x)\,dx=0$. Then, $\int_{-\infty}^\infty p_Y(x)\,dx_k=\int_{-\infty}^\infty p_X(x)\,dx_k$ for each $k$. So, the integral of $p_Y$ over $\mathbb{R}^n$ is 1 and, by choosing $\epsilon$ small enough (which is possible since $p_X$ is continuous and strictly positive on the compact support of the bump), $p_Y$ will be positive. Then it is a valid probability density function. Finally, as the integral along the $k$th direction (for any $k$) agrees for $p_X$ and $p_Y$, the projections of $X$ and $Y$ onto $\mathbb{R}^{n-1}$ along the $k$th direction have the same distribution.
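This bump construction can be checked numerically. The following is a sketch with illustrative choices of mine (a standard normal density for $p_X$ and a particular odd, compactly supported $f$), taking $n = 2$, the smallest interesting case, where the claim is that the one-dimensional marginals are preserved:

```python
import numpy as np

# Illustrative choices (not from the answer): standard normal density for p_X
# and a continuous bump f with compact support [-1, 1] and zero integral.
x = np.linspace(-6, 6, 1201)
dx = x[1] - x[0]

def f(u):
    # odd and supported on [-1, 1], so its integral vanishes
    return np.where(np.abs(u) < 1, np.sin(np.pi * u) * (1 - u**2), 0.0)

phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # 1-d standard normal density

# n = 2: joint densities evaluated on the grid
pX = np.outer(phi, phi)
eps = 0.01                                     # small enough to keep p_Y >= 0
pY = pX + eps * np.outer(f(x), f(x))

assert pY.min() >= 0                           # p_Y is a valid density...
assert abs(pY.sum() * dx * dx - 1) < 1e-3      # ...with total mass 1
# the 1-dimensional marginals agree, although the joints differ
marg_X = pX.sum(axis=1) * dx
marg_Y = pY.sum(axis=1) * dx
assert np.max(np.abs(marg_X - marg_Y)) < 1e-12
assert np.max(np.abs(pX - pY)) > 1e-4          # but p_X != p_Y
```

The key point the check exercises is that the bump factorizes and each factor integrates to zero, so integrating out either coordinate kills it.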
You cannot define a Lévy process by the individual distributions of its increments, except in the trivial case of a deterministic process $X_t - X_0 = bt$ with constant $b$. In fact, you can't identify it by the $n$-dimensional marginals for any $n$.
1) Let $X$ be a nondeterministic Lévy process with $X_0 = 0$ and let $n$ be any positive integer. Then, there is a cadlag process $Y$ with a different distribution to $X$, but such that $(Y_{t_1},Y_{t_2},\dots,Y_{t_n})$ has the same distribution as $(X_{t_1},X_{t_2},\dots,X_{t_n})$ for all times $t_1,t_2,\dots,t_n$.
Taking $n = 2$ will give a process whose increments have the same distribution as for $X$.
The idea (as in my answer to this related question) is to reduce it to the finite-time case. So, fix a set of times $0 = t_0 < t_1 < t_2 < \cdots < t_m$ for some $m > 1$.
We can look at the distribution of $X$ conditioned on the $\mathbb{R}^m$-valued random variable $U \equiv (X_{t_1},X_{t_2},\dots,X_{t_m})$. By the Markov property, it will consist of a set of independent processes on the intervals $[t_{k-1},t_k]$ and $[t_m,\infty)$, where the distribution of $\{X_t\}_{t\in[t_{k-1},t_k]}$ only depends on $(X_{t_{k-1}},X_{t_k})$ and the distribution of $\{X_t\}_{t\in[t_m,\infty)}$ only depends on $X_{t_m}$. By the disintegration theorem, the process $X$ can be built by first constructing the random variable $U$, then constructing $X$ to have the correct probabilities conditional on $U$. Doing this, the distribution of $X$ at any one time only depends on the values of at most two elements of $U$ (corresponding to $X_{t_{k-1}}$ and $X_{t_k}$). The distribution of $X$ at any set of $n$ times depends on the values of at most $2n$ elements of $U$.
Choosing $m > 2n$, the idea is to replace $U$ by a differently distributed $\mathbb{R}^m$-valued random variable for which any $2n$ elements still have the same distribution as for $U$. We can apply a small bump to the distribution of $U$ in such a way that the $(m-1)$-dimensional marginals are unchanged. To do this, we can use the following.
2) Let $U$ be an $\mathbb{R}^m$-valued random variable with probability measure $\mu$. Suppose that there exist (non-trivial) measures $\mu_1,\mu_2,\dots,\mu_m$ on the reals such that $\mu_1(A_1)\mu_2(A_2)\cdots\mu_m(A_m) \le \mu(A_1\times A_2\times\cdots\times A_m)$ for all Borel subsets $A_1,A_2,\dots,A_m \subseteq \mathbb{R}$.
Then, there is an $\mathbb{R}^m$-valued random variable $V$ with a different distribution to $U$, but with the same $(m-1)$-dimensional marginal distributions.
By 'non-trivial' I mean that $\mu_k$ is a non-zero measure and does not consist of a single atom.
By changing the distribution of $U$ in this way, we construct a new cadlag process with a different distribution to $X$, but with the same $n$-dimensional marginals.
Proving (2) is easy enough. As the $\mu_k$ are non-trivial, there will be measurable functions $f_k$ on the reals, uniformly bounded by 1, such that $\mu_k(f_k) = 0$ and $\mu_k(|f_k|) > 0$. Replacing $\mu_k$ by the signed measure $f_k\cdot\mu_k$, we can assume that $\mu_k(\mathbb{R}) = 0$; note that $|f_k\cdot\mu_k| \le \mu_k$, so the product inequality above still holds.
Then
$$
\mu_V = \mu + \mu_1\times\mu_2\times\cdots\times\mu_m
$$
is a probability measure different from $\mu$: it is nonnegative since $|\mu_1\times\mu_2\times\cdots\times\mu_m| \le \mu$ by the hypothesis of (2), it has total mass one since each $\mu_k(\mathbb{R}) = 0$, and it differs from $\mu$ since the product measure is non-zero. Choosing $V$ with this distribution gives
$$
{\mathbb E}[f(V)]=\mu_V(f)=\mu(f)={\mathbb E}[f(U)]
$$
for any function $f\colon\mathbb{R}^m\to\mathbb{R}_+$ independent of one of the dimensions, since integrating the product measure over that dimension gives a factor of $\mu_k(\mathbb{R}) = 0$. So, $V$ has the same $(m-1)$-dimensional marginals as $U$.
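Claim (2) can be exercised concretely in a small discrete toy instance (my own illustrative choice, not part of the proof): take $m = 3$, $U$ uniform on $\{0,1\}^3$, $\mu_k = \tfrac12(\delta_0 + \delta_1)$, and $f_k(0) = 1$, $f_k(1) = -1$, so that $f_k\cdot\mu_k$ has zero total mass.

```python
import itertools
import numpy as np

# U uniform on {0,1}^3; the product bound holds with mu_k({0}) = mu_k({1}) = 1/2.
pU = np.full((2, 2, 2), 1 / 8)

# bump = nu_1 x nu_2 x nu_3 with nu_k = f_k . mu_k, damped by 1/2 so that
# the new probabilities stay strictly positive
bump = np.zeros((2, 2, 2))
for a, b, c in itertools.product((0, 1), repeat=3):
    bump[a, b, c] = 0.5 * (1 / 8) * (-1) ** (a + b + c)

pV = pU + bump

assert np.all(pV > 0) and abs(pV.sum() - 1) < 1e-12   # valid distribution
# every 2-dimensional marginal is unchanged...
for axis in range(3):
    assert np.allclose(pU.sum(axis=axis), pV.sum(axis=axis))
# ...but the joint distributions differ
assert not np.allclose(pU, pV)
```

Summing the bump over any single coordinate pairs each $+$ term with a $-$ term, which is exactly the zero-total-mass argument in the proof.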
To apply (2) to $U = (X_{t_1},X_{t_2},\dots,X_{t_m})$, consider the following cases.
$X$ is continuous. In this case, $X$ is just a Brownian motion (up to multiplication by a constant and addition of a constant drift). So, $U$ is joint-normal with a nondegenerate covariance matrix. Its probability density is continuous and strictly positive so, in (2), we can take $\mu_k$ to be a multiple of the uniform measure on $[0,1]$.
$X$ is a Poisson process. In this case, we can take $\mu_k$ to be a multiple of the (discrete) uniform distribution on $\{2k, 2k+1\}$ and, as $X$ can take any increasing nonnegative integer-valued path on the times $t_k$, this satisfies the hypothesis of (2).
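The hypothesis of (2) for the Poisson case can be verified numerically via independent increments. The rate and times below are illustrative choices of mine; the check confirms that every path through the sets $\{2k, 2k+1\}$ has positive probability, so suitably scaled uniform measures on those sets satisfy the product lower bound on singletons (and hence, by additivity, on all sets):

```python
from math import exp, factorial
from itertools import product

def pois(k, lam):
    # Poisson(lam) pmf at k
    return exp(-lam) * lam**k / factorial(k)

# rate-1 Poisson process at times t_1 < t_2 < t_3 (illustrative values)
t = [1.0, 2.0, 3.0]

def joint(a1, a2, a3):
    # P(N_t1 = a1, N_t2 = a2, N_t3 = a3) via independent increments
    if not (0 <= a1 <= a2 <= a3):
        return 0.0
    return (pois(a1, t[0]) * pois(a2 - a1, t[1] - t[0])
            * pois(a3 - a2, t[2] - t[1]))

# the support sets {2k, 2k+1} from the answer, for k = 1, 2, 3
S = [(2, 3), (4, 5), (6, 7)]
probs = [joint(*abc) for abc in product(*S)]
p_min = min(probs)
assert p_min > 0   # every path through the sets has positive probability

# so mu_k = c * (counting measure on S[k]) with c = p_min**(1/3) satisfies
# mu_1(A1) mu_2(A2) mu_3(A3) <= mu(A1 x A2 x A3) on singletons
c = p_min ** (1 / 3)
assert all(c**3 <= joint(*abc) + 1e-15 for abc in product(*S))
```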
If $X$ is any non-continuous Lévy process, case 2 can be used to change the distribution of its jump times without affecting the $n$-dimensional marginals: let $\nu$ be its jump measure, and let $A$ be a Borel set such that $\nu(A)$ is finite and nonzero. Then, $X$ decomposes as the sum of its jumps in $A$ (which occur according to a Poisson process of rate $\nu(A)$) and an independent Lévy process. In this way, we can reduce to the case where $X$ is a Lévy process whose jumps occur at a finite rate, with arrival times given by a Poisson process.
In that case, let $N_t$ be the Poisson process counting the number of jumps in the interval $[0,t]$. Also, let $Z_k$ be the $k$th jump of $X$. Then, $N$ and the $Z_k$ are all independent and,
$$
X_t=\sum_{k=1}^{N_t}Z_k.
$$
As above, the Poisson process $N$ can be replaced by a differently distributed cadlag process which has the same $n$-dimensional marginals. This will not affect the $n$-dimensional marginals of $X$ but, as its jump times no longer occur according to a Poisson process, $X$ will no longer be a Lévy process.
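The compound-Poisson representation $X_t = \sum_{k=1}^{N_t} Z_k$ can be sketched as follows (the rate and the normal jump distribution are illustrative choices of mine; the jump times are sampled as sorted uniforms, which is the standard conditional description of Poisson arrival times given $N_T$):

```python
import numpy as np

rng = np.random.default_rng(0)

def compound_poisson_path(T, rate, jump_sampler, rng):
    """Sample jump times and cumulative jumps of X_t = sum_{k<=N_t} Z_k on [0, T]."""
    n_jumps = rng.poisson(rate * T)
    # given N_T = n, the jump times are distributed as sorted uniforms on [0, T]
    times = np.sort(rng.uniform(0, T, size=n_jumps))
    jumps = jump_sampler(n_jumps)                     # the Z_k
    return times, np.cumsum(jumps)

def X(t, times, cum_jumps):
    # value of the path at time t: sum of the jumps that occurred by t
    k = np.searchsorted(times, t, side="right")
    return 0.0 if k == 0 else cum_jumps[k - 1]

# illustrative choice: rate-2 arrivals, standard normal jumps
times, cum = compound_poisson_path(10.0, 2.0, lambda n: rng.normal(size=n), rng)
assert X(0.0, times, cum) == 0.0                      # X_0 = 0
# the path is constant between consecutive jump times
if len(times) >= 2:
    mid = 0.5 * (times[0] + times[1])
    assert X(mid, times, cum) == X(times[0], times, cum)
```

The counterexample above then amounts to resampling `times` from a non-Poisson process with the same $n$-dimensional marginals while keeping the `jumps` unchanged.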
Best Answer
It is not hard to find such an example. Let $P$ be Wiener measure on the space $\Omega = C([0,\infty))$ of continuous functions $t\mapsto \omega(t)$. Then the process $\omega(t)$ satisfies all three conditions of a Brownian motion.
Now let's define a new process $W(t)$ that is "almost" equal to $\omega(t)$, but where we deliberately wreck the sample path continuity.
Take any random time $T\colon\Omega\to [0,\infty)$ that has a continuous distribution on $(\Omega, P)$, and let $W(t,\omega)=\omega(t)$ when $t\not=T(\omega)$, but $W(t,\omega)=\omega(t)+1$ otherwise. For each fixed $t$ we have $P(T=t)=0$, so $W(t)=\omega(t)$ almost surely and the finite-dimensional distributions are unchanged. The process $W(t)$ therefore still satisfies 1 and 3, but sample path continuity fails exactly at the time point $T(\omega)$ for each $\omega$.
There are many such random times $T$; for example, you could use $T(\omega):=\inf\{t>0: \omega(t)=1\}$, i.e. the hitting time of 1.
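A discrete-grid sketch of this modification (my own illustration, not part of the argument): perturb a random-walk approximation of $\omega$ at a single random index. I use the time at which the path attains its maximum rather than the hitting time of 1, so that the random time always exists on a finite grid; like the hitting time, it has a continuous distribution for the limiting Brownian path.

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.001
# random-walk approximation of a Brownian path on [0, 20]
omega = np.concatenate(([0.0], np.cumsum(rng.normal(scale=np.sqrt(dt), size=20_000))))

T_idx = int(np.argmax(omega))        # the single random time point
W = omega.copy()
W[T_idx] += 1.0                      # wreck continuity there

# W differs from omega at exactly one grid point, so for any finite set of
# indices fixed in advance the two paths agree with probability one (in the
# Brownian limit, since the random time has a continuous distribution)
assert np.count_nonzero(W != omega) == 1
```

This is only heuristic on a grid, of course: the measure-zero argument is exact for the continuous-time process, where $\{T = t\}$ is a null set for each fixed $t$.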