Let $\sigma = \{P,(c_\alpha)_{\alpha<\omega_1},f\}$, and let $\sigma' = \{P,(c_\alpha)_{\alpha<\omega_1},(d_n)_{n<\omega}\}$. Here $P$ is a unary relation symbol, $f$ is a unary function symbol, and the $c_\alpha$ and $d_n$ are constant symbols. Then $\sigma\cap \sigma' = \{P,(c_\alpha)_{\alpha<\omega_1}\}$. Now define:
\begin{align*}\Psi &= \{P(c_\alpha)\land P(c_\beta)\land c_\alpha\neq c_\beta\mid \alpha<\beta<\omega_1\}\cup \{\exists x_1\dots\exists x_n(\bigwedge_{i=1}^n \lnot P(x_i)\land \bigwedge_{i\neq j} x_i\neq x_j)\mid n<\omega\}\\
\varphi &: \forall x\, (P(x)\rightarrow \lnot P(f(x)))\land \forall x\,\forall y\, (f(x) = f(y)\rightarrow x = y)\\
\varphi' &: \forall x\, (\lnot P(x)\rightarrow \bigvee_{n<\omega} (x = d_n))
\end{align*}
A model of $\Psi$ has $\omega_1$-many distinct elements named by constants and satisfying $P$ (as well as possibly other elements satisfying $P$), and infinitely many elements satisfying $\lnot P$. I claim that any two such models, say $M$ and $N$, are $\mathcal{L}_{\omega_1,\omega}(\sigma\cap \sigma')$-equivalent. Since any sentence of $\mathcal{L}_{\omega_1,\omega}$ only mentions countably many symbols, it suffices to show that for any countable signature $\sigma^*\subseteq (\sigma\cap \sigma')$, the reducts $M|_{\sigma^*}$ and $N|_{\sigma^*}$ are $\mathcal{L}_{\omega_1,\omega}(\sigma^*)$-equivalent. Now $M|_{\sigma^*}$ and $N|_{\sigma^*}$ consist of countably many distinct elements named by constants and satisfying $P$, infinitely many other elements satisfying $P$, and infinitely many elements satisfying $\lnot P$. By the infinite Ehrenfeucht–Fraïssé game, $M|_{\sigma^*}$ and $N|_{\sigma^*}$ are $\mathcal{L}_{\infty,\omega}(\sigma^*)$-equivalent.
A model of $\Psi\cup \{\varphi\}$ is a model $M$ of $\Psi$ together with an injective function $f\colon M\to M$ which maps $P$ into $\lnot P$. This is satisfiable, e.g. by taking $P$ to be $\omega_1$, with $c_\alpha = \alpha$, taking $\lnot P$ to be a disjoint set $X$ of size $\aleph_1$, and taking $f = g\cup g^{-1}$, where $g$ is a bijection $X\to \omega_1$.
A model of $\Psi\cup \{\varphi'\}$ is a model $M$ of $\Psi$ such that every element of $\lnot P$ is named by some constant $d_n$. This is satisfiable, e.g. by taking $P$ to be $\omega_1$, with $c_\alpha = \alpha$, taking $\lnot P$ to be a disjoint countably infinite set, each element of which is the interpretation of one of the constants $d_n$.
But $\Psi\cup \{\varphi,\varphi'\}$ is not satisfiable: $\varphi$ forces $\lnot P$ to be uncountable (the injective $f$ maps the $\omega_1$-many elements named by the $c_\alpha$ into $\lnot P$), while $\varphi'$ forces $\lnot P$ to be countable (every element of $\lnot P$ must be one of the countably many $d_n$).
You have to distinguish between what individual sentences can do and what theories can do. Theories, being arbitrary sets of sentences, are able to perform arbitrarily large conjunctions in the following sense:
For any set of theories $\{T_i: i\in I\}$, the theory $T:=\bigcup_{i\in I}T_i$ "behaves like the conjunction of the $T_i$s" in the following sense: the models of $T$ are exactly those structures satisfying each $T_i$.
On the other hand, even if each $T_i$ consists just of a single sentence $T_i=\{\varphi_i\}$, there may be no single sentence $\psi$ whose models are exactly the structures satisfying every $\varphi_i$. For a concrete example, there is no single first-order sentence true in exactly the infinite structures, even though for each $n$ there is a sentence true in exactly the structures of size $\ge n$.
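To make this concrete (notation mine), the sentences witnessing "at least $n$ elements" can be written out explicitly:

```latex
\varphi_n := \exists x_1 \dots \exists x_n \bigwedge_{1 \le i < j \le n} x_i \neq x_j
```

A structure is infinite if and only if it satisfies every $\varphi_n$, so the theory $\{\varphi_n \mid n<\omega\}$ axiomatizes the infinite structures; but by compactness, no single first-order sentence is equivalent to this infinite conjunction.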
The intuition behind Scott sentences is that they give recipes for back-and-forth arguments. An excellent exposition is given in Marker's book Model Theory: An Introduction, Section 2.4. The section starts with Cantor's back-and-forth proof that any two countable dense linear orders without endpoints are isomorphic, introduces Ehrenfeucht–Fraïssé games, and builds up to Scott's isomorphism theorem.
Your recent questions indicate that you're interested in the concept of $\omega$-homogeneity, which is also closely related to the back-and-forth idea. I think you'll learn a lot more from reading Marker's Section 2.4 thoroughly than anything I could write in this answer. But I'll try to give the basic idea.
In Knight's presentation, we start with "complete" formulas $\varphi_{\overline{a}}(\overline{x})$, which you should think of as completely describing the behavior of the tuple $\overline{a}$. More on how to get these formulas below. But first, let's imagine we have structures $M$ and $N$ which satisfy all the same sentences $\rho_{\overline{a}}$. We want to show $M\cong N$ by a back-and-forth argument.
OK, we want to build up an isomorphism between the two structures, element by element. Pick some $a_1\in M$. $\rho_\varnothing$ gives a list of complete formulas in one variable and says "every element satisfies one of the formulas in this list, and every formula in the list is satisfied by some element". So $a_1$ satisfies some complete formula $\varphi_1(x)$ on the list, and that same complete formula is satisfied by some element $b_1\in N$. We start our isomorphism by mapping $a_1\mapsto b_1$.
Next, we pick an element $b_2\in N$. The sentence $\rho_{b_1}$ gives a list of complete formulas in two variables which extend $\varphi_1(x)$ and says "if $x$ satisfies $\varphi_1(x)$, then every pair $xy$ extending $x$ satisfies one of the formulas in this list, and every formula in the list is satisfied by some pair extending $x$." So the pair $b_1,b_2$ satisfies some complete formula $\varphi_2(x,y)$ on the list, and there is some $a_2\in M$ such that the pair $a_1,a_2$ satisfies that same complete formula. We extend our isomorphism by mapping $a_2\mapsto b_2$.
Continuing this way (back and forth) to handle all of the elements in some enumerations of the countable structures $M$ and $N$, we arrive at an isomorphism.
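The engine behind this argument is the same back-and-forth used in Cantor's theorem on countable dense linear orders (the starting point of Marker's Section 2.4). Here is a minimal Python sketch of that simpler case, assuming the two orders are given as finite prefixes of enumerations of dense unbounded orders of rationals; the function names are mine:

```python
from fractions import Fraction

def extend(partial, a, side):
    """Find a partner for the new element `a` (which lives on `side`:
    0 = left structure, 1 = right structure) so that adding the pair
    keeps the partial map order-preserving. Density and unboundedness
    of the rationals guarantee a witness always exists."""
    if not partial:
        return Fraction(0)
    other = 1 - side
    pairs = sorted(partial, key=lambda p: p[side])
    below = [p for p in pairs if p[side] < a]
    above = [p for p in pairs if p[side] > a]
    if not below:                        # a lies below everything matched so far
        return pairs[0][other] - 1
    if not above:                        # a lies above everything matched so far
        return pairs[-1][other] + 1
    lo, hi = below[-1][other], above[0][other]
    return (lo + hi) / 2                 # strictly between, by density

def back_and_forth(enum_a, enum_b):
    """Cantor's back-and-forth: alternately place the next unmatched
    element of each enumeration, building an order-preserving partial
    map that covers every enumerated element on both sides."""
    partial = []                         # list of matched (a, b) pairs
    for step in range(2 * max(len(enum_a), len(enum_b))):
        if step % 2 == 0 and step // 2 < len(enum_a):    # "forth": place next a
            a = enum_a[step // 2]
            if all(p[0] != a for p in partial):
                partial.append((a, extend(partial, a, 0)))
        elif step % 2 == 1 and step // 2 < len(enum_b):  # "back": hit next b
            b = enum_b[step // 2]
            if all(p[1] != b for p in partial):
                partial.append((extend(partial, b, 1), b))
    return partial
```

Running `back_and_forth` on two lists of rationals returns an order-preserving finite partial map hitting every listed element on both sides; in the genuinely infinite setting, the union over all stages of this process is the isomorphism.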
Now the question remains how to come up with the "complete" formulas $\varphi_{\overline{a}}$, for $\overline{a}$ in $M$. As a first approximation, we can take the conjunction of all atomic and negated atomic formulas satisfied by $\overline{a}$. This captures the quantifier-free type of $\overline{a}$. But knowing the quantifier-free type of $\overline{a}$ does not tell us the possible quantifier-free types of extensions $\overline{a}b$ of the tuple. So we consider formulas of the form $$\Big(\forall y\, \bigvee_i \psi_i(\overline{x},y)\Big) \land \Big(\bigwedge_i \exists y\, \psi_i(\overline{x},y)\Big)$$ where the $\psi_i$ are complete quantifier-free formulas. Let's call these formulas $1$-extension formulas. Now we can take the conjunction of all $1$-extension formulas satisfied by a tuple. That tells us about all of the complete quantifier-free types of extensions of the tuple, but not yet about all of the complete $1$-extension types of extensions of the tuple!
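As a concrete instance (my example), work in the language of orders $\{<\}$ and look at a single element $a$ of $(\mathbb{Q},<)$. The complete quantifier-free formulas in $x,y$ extending $x$ just record the position of $y$ relative to $x$, so the $1$-extension formula of $a$ is:

```latex
\forall y\,\big( y < x \lor y = x \lor x < y \big)
\;\land\; \exists y\,(y < x) \;\land\; \exists y\,(y = x) \;\land\; \exists y\,(x < y)
```

Every element of $(\mathbb{Q},<)$ satisfies this formula. By contrast, in $(\mathbb{N},<)$ the element $0$ fails the conjunct $\exists y\,(y < x)$, so already at level $1$ it is distinguished from every other element.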
In other words, $1$-extension formulas let us start with a partial isomorphism between finite tuples and extend it by one more element, but no further. That's not enough to build a total isomorphism by back-and-forth.
So this leads us to define $2$-extension formulas, which look just like $1$-extension formulas, but now the $\psi_i$ are conjunctions of $1$-extension formulas. Satisfying the same $2$-extension formulas means that we can extend a back-and-forth argument by two steps, but no more. Continuing, we can define $n$-extension formulas for all finite $n$. Satisfying the same $n$-extension formulas for all finite $n$ ensures we can continue a back-and-forth argument any finite number of steps, but not necessarily countably many steps.
Scott's idea was to iterate this construction through the countable ordinals, defining $\alpha$-extension formulas for all countable ordinals $\alpha$. Now if we assume that $M$ is a countable structure, it has only countably many tuples, so it won't take all $\aleph_1$-many countable ordinals to distinguish all the tuples in $M$. In other words, there is some countable ordinal $\alpha$ (the Scott rank of $M$) such that when trying to understand the behavior of tuples from $M$, it suffices to consider all the $\alpha$-extension formulas satisfied by the tuple. The countable conjunction of all of these is the "complete formula" referred to in the explanation of the Scott sentence above, and these formulas allow back-and-forth arguments to go "all the way up".
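For reference, here is one standard way to write the recursion (the notation is mine, following the usual Scott analysis): $\varphi^\alpha_{\overline{a}}$ denotes the $\alpha$-extension formula of the tuple $\overline{a}$ from $M$.

```latex
\begin{aligned}
\varphi^0_{\overline{a}}(\overline{x}) &= \bigwedge \{\theta(\overline{x}) \mid \theta \text{ atomic or negated atomic},\ M \models \theta(\overline{a})\}\\
\varphi^{\alpha+1}_{\overline{a}}(\overline{x}) &= \varphi^\alpha_{\overline{a}}(\overline{x}) \;\land\; \forall y \bigvee_{b\in M} \varphi^\alpha_{\overline{a}b}(\overline{x},y) \;\land\; \bigwedge_{b\in M} \exists y\, \varphi^\alpha_{\overline{a}b}(\overline{x},y)\\
\varphi^\lambda_{\overline{a}}(\overline{x}) &= \bigwedge_{\alpha<\lambda} \varphi^\alpha_{\overline{a}}(\overline{x}) \qquad (\lambda \text{ a limit ordinal})
\end{aligned}
```

Since $M$ is countable, every conjunction and disjunction here is countable, so each $\varphi^\alpha_{\overline{a}}$ with $\alpha$ countable is a formula of $\mathcal{L}_{\omega_1,\omega}$.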