The calculation is no longer tedious if you split it up into four cases. Since the Jacobi expression is trilinear, we only need to check it on triples of the following four forms:
$(s_1,0),(s_2,0),(s_3,0)$; or $(s_1,0),(s_2,0),(0,x_3)$; or $(s_1,0),(0,x_2),(0,x_3)$; or $(0,x_1),(0,x_2),(0,x_3)$. The cases themselves are immediate: they follow, respectively, from the facts that $S$ is a Lie algebra, that $\theta$ is a Lie algebra homomorphism, that the $\theta(s_i)$ are derivations of $I$, and that $I$ is a Lie algebra.
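As a concrete sanity check of this construction, here is a minimal numerical sketch (the specific instance is our own choice, not from the text): take $S$ one-dimensional and abelian, $I = \Bbb R^2$ abelian, and $\theta(s) = sA$ for a fixed matrix $A$, so that each $\theta(s)$ is trivially a derivation of $I$ and $\theta$ is trivially a homomorphism. The Jacobi identity for the bracket on $S \oplus I$ can then be verified on random triples:

```python
import numpy as np

# Hypothetical concrete instance: S = R (abelian), I = R^2 (abelian ideal),
# theta(s) = s * A for a fixed 2x2 matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))

def bracket(u, v):
    """Bracket on S + I: [(s1,x1),(s2,x2)] = ([s1,s2], theta(s1)x2 - theta(s2)x1 + [x1,x2]).
    Here both [.,.] on S and [.,.] on I vanish."""
    s1, x1 = u
    s2, x2 = v
    return (0.0, s1 * (A @ x2) - s2 * (A @ x1))

def jacobi(u, v, w):
    """Cyclic sum [u,[v,w]] + [v,[w,u]] + [w,[u,v]]."""
    s, x = bracket(u, bracket(v, w))
    for a, b, c in [(v, w, u), (w, u, v)]:
        t, y = bracket(a, bracket(b, c))
        s, x = s + t, x + y
    return s, x

# The cyclic sum vanishes on every sampled triple.
for _ in range(100):
    triple = [(rng.standard_normal(), rng.standard_normal(2)) for _ in range(3)]
    s, x = jacobi(*triple)
    assert abs(s) < 1e-9 and np.allclose(x, 0)
print("Jacobi identity holds on all sampled triples")
```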
(1) Yes, this is correct. One can cut the amount of work by $2/3$ if one observes that the Jacobi identity is skew-symmetric in $i, j, k$. If we denote the full alternation over these indices by $\textrm{Alt}$, the Jacobi identity is $$0 = \textrm{Alt} \,[x_i, [x_j, x_k]] ,$$
which gives
$$0 = \textrm{Alt} \,(a^l_{ij} a^m_{kl}) ;$$
optionally, we can write out the full alternation (and use antisymmetry of the lower indices of structure constants) to recover your formula.
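To make this identity tangible, here is a small check (our own example, not from the question): for $\mathfrak{so}(3)$ in a suitable basis the structure constants are the Levi-Civita symbol, $a^l_{ij} = \epsilon_{ijl}$, and the alternated contraction vanishes for every choice of free indices:

```python
import itertools

# Structure constants of so(3): a^l_{ij} = epsilon_{ijl},
# i.e. [x_i, x_j] = sum_l eps_{ijl} x_l.  (Example of our choosing.)
eps = {}
for i, j, k in itertools.product(range(3), repeat=3):
    eps[i, j, k] = 0.0
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def a(l, i, j):
    return eps[i, j, l]  # a^l_{ij}

# The six permutations of three slots, with their signs.
PERMS = [((0, 1, 2), +1), ((1, 2, 0), +1), ((2, 0, 1), +1),
         ((0, 2, 1), -1), ((2, 1, 0), -1), ((1, 0, 2), -1)]

def alt(i, j, k, m):
    """Full alternation over (i, j, k) of sum_l a^l_{ij} a^m_{kl}."""
    idx = (i, j, k)
    return sum(sign * sum(a(l, idx[p], idx[q]) * a(m, idx[r], l)
                          for l in range(3))
               for (p, q, r), sign in PERMS)

# The Jacobi identity forces the alternation to vanish identically.
assert all(abs(alt(i, j, k, m)) < 1e-12
           for i, j, k, m in itertools.product(range(3), repeat=4))
print("Alt(a^l_{ij} a^m_{kl}) = 0 for so(3)")
```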
(2) This is the strongest result possible, in the sense that (again, given the antisymmetry in the lower indices of the structure constants) this condition is equivalent to the Jacobi identity for any (equivalently, every) choice of basis $(x_i)$.
(3) There isn't exactly an analogue of those statements---in both of those cases, one fixes the upper index $l$ of $a^l_{ij}$ and interprets the entries $a^l_{ij}$ as components of a matrix $A$, but in the Jacobi identity condition one contracts with this index, so it cannot be fixed in any sense.
Here are two ways we might see this a little more concretely, however:
(1) Using antisymmetry, we can rearrange the Jacobi identity as
$$[x, [y, z]] = [[x, y], z] + [y, [x, z]] .$$ Now, if we define for $x \in L$ the operator $\textrm{ad}_x : L \to L$, $y \mapsto [x, y]$, we can rewrite this as
$$\textrm{ad}_x[y, z] = [\textrm{ad}_x(y), z] + [y, \textrm{ad}_x(z)] .$$ In other words, the Jacobi identity is just a Leibniz (product) rule for the operator $\textrm{ad}_x$.
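A quick way to see this rule in action (our own illustration) is in $\mathfrak{gl}(3, \Bbb R)$, where the bracket is the matrix commutator and the Jacobi identity follows from associativity:

```python
import numpy as np

# Spot-check of the Leibniz form of the Jacobi identity in gl(3, R),
# where [a, b] = ab - ba is the matrix commutator.
rng = np.random.default_rng(1)

def bracket(a, b):
    return a @ b - b @ a

x, y, z = (rng.standard_normal((3, 3)) for _ in range(3))

# ad_x [y, z] = [ad_x y, z] + [y, ad_x z]
lhs = bracket(x, bracket(y, z))
rhs = bracket(bracket(x, y), z) + bracket(y, bracket(x, z))
assert np.allclose(lhs, rhs)
print("ad_x is a derivation of the commutator bracket")
```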
(2) In the special case $L = \mathfrak{so}(3, \Bbb R) \cong \mathfrak{su}(2)$, we may identify the Lie bracket with the usual cross product $\times$ on $\Bbb R^3$. Then, the Jacobi identity translates to the possibly familiar identity
$${\bf x} \times ({\bf y} \times {\bf z}) + {\bf y} \times ({\bf z} \times {\bf x}) + {\bf z} \times ({\bf x} \times {\bf y}) = 0.$$
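This cross-product identity is easy to confirm numerically (a sketch of our own, using NumPy's `cross`):

```python
import numpy as np

# The cyclic cross-product identity on R^3, checked on random vectors.
rng = np.random.default_rng(2)
x, y, z = (rng.standard_normal(3) for _ in range(3))

jac = (np.cross(x, np.cross(y, z))
       + np.cross(y, np.cross(z, x))
       + np.cross(z, np.cross(x, y)))
assert np.allclose(jac, 0)
print("cross-product Jacobi identity verified")
```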
Best Answer
You should start with the multiplication in $L.$ If $[x,y]=0$ then $L$ is abelian and we are done. Otherwise we have $[x,y]=\alpha x + \beta y$ with not both coefficients zero. Let's assume $\alpha\neq 0.$ Then $L$ gets a new basis $x'=-\alpha^{-1}y$, $y'=x+\beta \alpha^{-1}y$, and we get $$ [x',y']=[-\alpha^{-1}y,\,x+\beta \alpha^{-1}y]=-\alpha^{-1}[y,x]=\alpha^{-1}(\alpha x+\beta y)=x+\beta \alpha^{-1}y=y'. $$ Thus we may assume that $[x,y]=y.$ This should make your calculation a lot easier.
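The change of basis can be verified symbolically; here is a minimal sketch (the coordinate encoding is ours) where elements of $L$ are coefficient pairs with respect to the basis $(x, y)$ and the bracket is determined by $[x,y] = \alpha x + \beta y$:

```python
import sympy as sp

# Symbolic check of the change of basis: elements of L are written as
# coordinate pairs (c1, c2) with respect to (x, y), and the bracket is
# determined by bilinearity, antisymmetry, and [x, y] = alpha*x + beta*y.
alpha, beta = sp.symbols('alpha beta', nonzero=True)

def bracket(u, v):
    # [u, v] = (u1*v2 - u2*v1) * [x, y]
    c = u[0] * v[1] - u[1] * v[0]
    return (c * alpha, c * beta)

x_new = (0, -1 / alpha)        # x' = -alpha^{-1} y
y_new = (1, beta / alpha)      # y' = x + beta*alpha^{-1} y

# [x', y'] = y' in the new basis.
lhs = bracket(x_new, y_new)
assert all(sp.simplify(l - r) == 0 for l, r in zip(lhs, y_new))
print("[x', y'] = y'")
```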