Does $\overset{*}{=}$ hold?
The short answer is that $E[T_3 \mid T_1+T_2\le 1,\, T_1+T_2+T_3>1]\overset{*}{=}E[T_3 \mid T_3 > 1-T_1-T_2]$ is true but not helpful in the way you'd like.
You might think that, because $T_3 > 0$ by definition, the new conditioning on the right-hand side, $T_3 > 1-T_1-T_2$, automatically takes care of $T_1$ and $T_2$.
Unfortunately, while the equal sign $\overset{*}{=}$ is true as a symbolic manipulation, when it comes to actual calculation you still have to make the implicit constraint $T_1 + T_2 < 1$ explicit.
The next equal sign $E[T_3 \mid T_3>1-T_1-T_2]\overset{\text{memoryless}}{=}1-T_1-T_2+E[T_3]$ is sloppy at best, and in most readings it is simply false.
If we are talking about $t_1$ and $t_2$ as two constants, then $E[T_3 \mid T_3 > 1 -t_1 - t_2] = 1 - t_1 - t_2 + E[T_3]$ is true.
Alternatively, if we take the expectation with respect to $T_3$ only, then
$E_{T_3}[T_3 \mid T_3>1-T_1-T_2] = 1 - T_1 - T_2 + E[T_3]$ is also true, but the right-hand side is still a function of the random variables $T_1$ and $T_2$. One still has to take the outer expectation $E_{T_1,T_2}[\text{blah}]$, under the appropriate conditioning.
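Both statements are easy to check by simulation. The sketch below (plain Python; the rate $\lambda = 2$ is an arbitrary choice for the check, not part of the original problem) verifies the constant-cutoff identity $E[T_3 \mid T_3 > c] = c + E[T_3]$, and then shows that averaging $1 - T_1 - T_2 + E[T_3]$ over only those samples satisfying the full conditioning reproduces the direct estimate of $E[T_3 \mid N(1)=2]$:

```python
import random

random.seed(0)
lam = 2.0            # arbitrary rate for the check
n = 400_000

# (i) constant cutoff: memorylessness gives E[T3 | T3 > c] = c + 1/lam
c = 0.4
tail = [t for t in (random.expovariate(lam) for _ in range(n)) if t > c]
mean_tail = sum(tail) / len(tail)        # near c + 1/lam

# (ii) random cutoff 1 - T1 - T2: the inner identity still holds, but the
# outer average over (T1, T2) must respect the full conditioning event
direct = inner = 0.0
cnt = 0
for _ in range(n):
    t1, t2, t3 = (random.expovariate(lam) for _ in range(3))
    if t1 + t2 <= 1 < t1 + t2 + t3:      # the event N(1) = 2
        direct += t3                     # E[T3 | N(1) = 2], estimated directly
        inner += 1 - t1 - t2 + 1 / lam   # E_{T3}[T3 | T3 > 1 - S2], then averaged
        cnt += 1
print(mean_tail, direct / cnt, inner / cnt)
```

The point is that the outer average in (ii) is taken over $(T_1, T_2)$ restricted to the conditioning event, not over their unconditional distribution.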
How do we make these two results consistent?
The decomposition $E(S_{4} \mid N(1)=2) = \ldots =A+B+C$ is correct and one indeed can "solve it directly". It's just that one has to actually do some calculations for $B$ (or quote additional known results like what's done for $A$ and $C$).
The $A = \frac23$ and $C = \frac1{\lambda}$ are correct, and your hunch that "$T_3$ is not independent of $N(1) = 2$" is also correct.
Following your notation $S_2 = T_1 + T_2$, I shall use the dummy variable $s$ for $S_2$ and $t$ for $T_3$ in their respective marginal densities $f_{S_2}(s)$ and $f_{T_3}(t)$. We know that $f_{S_2}$ is a Gamma density with shape parameter $2$ (and rate $\lambda$), and $f_{T_3}$ is exponential with rate $\lambda$.
For both generality and clarity, I shall use $\tau$ to denote the given cutoff time length, as in here we have $\tau = 1$ and we are conditioning on $N(\tau) = 2$.
$$B := E\bigl( T_3 ~\big|~ N(\tau) = 2 \bigr) = \frac{ \iint_{\Omega} t \cdot f_{S_2,T_3}(s,t) \,\mathrm{d}t \,\mathrm{d}s }{ \Pr\{ N(\tau) = 2 \} } $$
where the joint density $f_{S_2,T_3}$ is the product of the marginals $f_{S_2}$ and $f_{T_3}$, because all $T_i$ are mutually independent. The two-dimensional integral is over the region $\Omega = \{ (t,s): 0 < s < \tau,~t > \tau - s \}$, which is exactly the event $\{N(\tau) = 2\}$. Lastly, the denominator $\Pr\{ N(\tau) = 2 \}$ is just the familiar probability mass function of the discrete Poisson distribution
$$\Pr\{ N(\tau) = k \} = e^{ -\lambda \tau } \frac{ (\lambda \tau)^k }{ k! } $$
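This pmf can be cross-checked by simulating the process with exponential inter-arrival times (a rough Monte Carlo sketch; the values $\lambda = 2$, $\tau = 1$ are arbitrary choices for the check):

```python
import math
import random

random.seed(1)
lam, tau, n = 2.0, 1.0, 200_000
hits = 0
for _ in range(n):
    s, k = 0.0, 0
    while True:                  # generate arrivals until we pass tau
        s += random.expovariate(lam)
        if s > tau:
            break
        k += 1
    hits += (k == 2)
pmf = math.exp(-lam * tau) * (lam * tau) ** 2 / 2
print(hits / n, pmf)             # the two numbers should be close
```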
Evaluating the integral is routine work (though there is no shortcut that avoids the actual calculation).
\begin{align}
\iint_{\Omega} t \cdot f_{S_2,T_3}(s,t) \,\mathrm{d}t \,\mathrm{d}s &= \int_{s = 0}^{ \tau } \int_{t = \tau - s}^{\infty} t \cdot f_{S_2}(s) \cdot f_{T_3}(t)\,\mathrm{d}t \,\mathrm{d}s \\
&= \int_{s = 0}^{ \tau } \int_{t = \tau - s}^{\infty} t \cdot \frac{ \lambda }{ \Gamma(2) } (\lambda s)^{2-1} e^{-\lambda s} \cdot \lambda e^{-\lambda t} \,\mathrm{d}t \,\mathrm{d}s \\
&= \int_{s = 0}^{ \tau } \frac{ \lambda }{ \Gamma(2) } \lambda s e^{-\lambda s} \left[ \int_{t = \tau - s}^{\infty} t \cdot \lambda e^{-\lambda t} \,\mathrm{d}t \right] \,\mathrm{d}s \\
&= \int_{s = 0}^{ \tau } \frac{ \lambda }{ \Gamma(2) } \lambda s e^{-\lambda s} \left[ \left( \tau - s + \frac1{ \lambda} \right) e^{-\lambda (\tau - s)} \right] \,\mathrm{d}s \\
&= e^{ -\lambda \tau } \int_{s = 0}^{ \tau } \frac{ \lambda }{ \Gamma(2) } \lambda s \left( \bigl(\tau + \frac1{ \lambda} \bigr) - s\right)\,\mathrm{d}s \\
&= e^{ -\lambda \tau } \left( \bigl(\tau + \frac1{ \lambda} \bigr) \frac{ (\lambda \tau)^2 }2 - \frac1{\lambda} \frac{ (\lambda \tau)^3 }3 \right) \qquad \text{then, with } \tau = 1 \\
& = e^{-\lambda} \frac{ \lambda^2}2 \left( 1 + \frac1{ \lambda} - \frac23 \right)
\end{align}
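As a sanity check on the algebra, the double integral can be evaluated by brute force with a midpoint rule and compared against the closed form two lines above (a numerical sketch; $\lambda = 2$, $\tau = 1$ are arbitrary, and the infinite $t$-range is truncated where the exponential tail is negligible):

```python
import math

lam, tau = 2.0, 1.0                      # arbitrary values for the check
ns, nt, t_len = 500, 1000, 25.0 / lam    # grid sizes; truncated t-range

total = 0.0
ds = tau / ns
dt = t_len / nt
for i in range(ns):
    s = (i + 0.5) * ds
    f_s = lam * (lam * s) * math.exp(-lam * s)     # Gamma(2, lam) density
    inner = 0.0
    for j in range(nt):
        t = (tau - s) + (j + 0.5) * dt             # integrate t from tau - s
        inner += t * lam * math.exp(-lam * t) * dt
    total += f_s * inner * ds

closed = math.exp(-lam * tau) * (
    (tau + 1 / lam) * (lam * tau) ** 2 / 2 - (lam * tau) ** 3 / (3 * lam)
)
print(total, closed)                      # should agree to ~4 decimal places
```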
Put things back together
$$B = \frac{ \iint_{\Omega} t \cdot f_{S_2,T_3}(s,t) \,\mathrm{d}t \,\mathrm{d}s }{ \Pr\{ N(\tau) = 2 \} } = \frac{ e^{-\lambda} \frac{ \lambda^2}2 \left( 1 + \frac1{ \lambda} - \frac23 \right) }{ e^{-\lambda} \frac{ \lambda^2}2 } = 1 + \frac1{ \lambda} - \frac23 \\
E\bigl( S_{4} \mid N(1)=2 \bigr) = A + B + C = \frac23 + \left( 1 + \frac1{ \lambda} - \frac23 \right) + \frac1{ \lambda} = 1 + \frac2{ \lambda}$$
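A direct Monte Carlo check of the final number (a sketch with the arbitrary choice $\lambda = 2$, so the target is $1 + 2/\lambda = 2$): sample the four inter-arrival times, keep only the realizations with exactly two arrivals by time $1$, and average $S_4$:

```python
import random

random.seed(2)
lam, n = 2.0, 400_000
tot, cnt = 0.0, 0
for _ in range(n):
    w = [random.expovariate(lam) for _ in range(4)]   # T1..T4 (inter-arrival times)
    if w[0] + w[1] <= 1 < w[0] + w[1] + w[2]:         # exactly two arrivals by t = 1
        tot += sum(w)                                  # S4 = T1 + T2 + T3 + T4
        cnt += 1
print(tot / cnt)                                       # near 1 + 2/lam
```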
So you see, the decomposition into $S_2 + T_3 + T_4$ is not a "natural cutting". Conditioning on $N(1) = 2$ we have a $T_3$ that really should be split into two parts (as remarked by Julien Berestycki in his answer), one part goes together nicely with $S_2$ and the other with $T_4$.
Best Answer
Some general comments:
First of all, let's clear up some apparent confusion. The process $(T_n)_{n \geq 1}$ you (incorrectly) refer to as a Poisson process is really the sequence of arrival times (or jump times), $T_n$ being the time of the $n$-th arrival for $n\geq 1$. A Poisson process is a counting process indexed in continuous time, $(N_t)_{t\geq 0}$, and takes non-negative integer values. The process $(T_n)$ is instead indexed in discrete time ($n=1,2,\dotsc$) and takes continuous values in $(0,\infty)$. They are related by the formula $\mathbb{P}(N_t \geq n)=\mathbb{P}(T_n \leq t)$, or in English: the probability of having at least $n$ arrivals by time $t$ equals the probability that the time of the $n$-th arrival is at most $t$.
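The relation $\mathbb{P}(N_t \geq n)=\mathbb{P}(T_n \leq t)$ can be illustrated by estimating both sides from the same exponential inter-arrival mechanism (a Monte Carlo sketch; the parameter values are arbitrary):

```python
import random

random.seed(3)
lam, t, n, reps = 1.5, 2.0, 3, 200_000

def n_arrivals(lam, t):
    """Count Poisson-process arrivals in [0, t] by summing exponentials."""
    s, k = 0.0, 0
    while True:
        s += random.expovariate(lam)
        if s > t:
            return k
        k += 1

p_count = sum(n_arrivals(lam, t) >= n for _ in range(reps)) / reps
p_jump = sum(
    sum(random.expovariate(lam) for _ in range(n)) <= t for _ in range(reps)
) / reps
print(p_count, p_jump)     # the two probabilities should agree
```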
Now, you correctly identify the distribution of $T_n \sim \Gamma(n,\lambda)$. Recall, the inter-arrival times of a Poisson process form an IID sequence of exponential RVs with rate $\lambda$, call it $(W_i)_{i \geq 1}$. The arrival times are related by $T_n=W_1+\dotsc+W_n$. The inter-arrival times are independent of each other by the memoryless property of exponential distribution. The arrival times are not—the arrival time of the $(n+1)$-th arrival depends on how long it took to see the $n$-th, $T_{n+1}=W_{n+1}+T_n$, no?
So, your claim that $T_1+\dotso+T_n$ is $\Gamma(n(n+1)/2,\lambda)$ is false, since that would require the $(T_n)$ to be independent. Note that $T_{n}$ and $T_{n+m}-T_{n}$ are independent for all $n,m$, and the latter is distributed like $T_{m}$. So the joint distribution of $(T_n, T_{n+m})$ has density $f_{T_n}(t) \cdot f_{T_m}(s-t)\,\mathbb{1}_{0<t<s}$.
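The dependence is easy to see numerically: the sample covariance of $T_1 = W_1$ and $T_2 = W_1 + W_2$ comes out close to $\operatorname{Var}(W_1) = 1/\lambda^2 > 0$, not $0$ (a quick sketch with the arbitrary choice $\lambda = 1$):

```python
import random

random.seed(4)
lam, reps = 1.0, 200_000
t1s, t2s = [], []
for _ in range(reps):
    w1, w2 = random.expovariate(lam), random.expovariate(lam)
    t1s.append(w1)            # T1 = W1
    t2s.append(w1 + w2)       # T2 = W1 + W2
m1 = sum(t1s) / reps
m2 = sum(t2s) / reps
cov = sum((a - m1) * (b - m2) for a, b in zip(t1s, t2s)) / reps
print(cov)    # near Var(W1) = 1/lam**2, so T1 and T2 are NOT independent
```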
Sketch of solution:
Consider the process at time $t=1$. First note that $\{T_5 \leq 1 < T_6\}=\{N_1=5\}$. That is, given $T_5 \leq 1$, we know that $N_1 \geq 5$, i.e. at least $5$ arrivals have occurred by time $1$. Further, given $T_6>1$, we know that the $6$-th arrival has not occurred yet, i.e. $N_1<6$. Since $N_t$ is integer-valued, there is only one choice: we are really conditioning on the event $N_1=5$. A useful and well-known fact (e.g. on Wikipedia) is that, conditional on the number of arrivals of a Poisson process in a time interval, the jump times are distributed as the order statistics of IID uniform RVs on that interval. Now, just follow your nose :)
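To see the order-statistics fact in action (a Monte Carlo sketch with the arbitrary choice $\lambda = 2$, using $N_1 = 2$ for speed): conditional on $N_1 = 2$, the two jump times should have means $1/3$ and $2/3$, the means of the order statistics of two uniforms on $[0,1]$:

```python
import random

random.seed(5)
lam, reps = 2.0, 300_000
s1_tot, s2_tot, cnt = 0.0, 0.0, 0
for _ in range(reps):
    w1, w2, w3 = (random.expovariate(lam) for _ in range(3))
    s1, s2, s3 = w1, w1 + w2, w1 + w2 + w3   # first three jump times
    if s2 <= 1 < s3:                          # condition on N_1 = 2
        s1_tot += s1
        s2_tot += s2
        cnt += 1
print(s1_tot / cnt, s2_tot / cnt)   # near 1/3 and 2/3, independent of lam
```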