In the $2$-dimensional case, we have either $[x,y]=0$ or $[x,y]=ax+by$ with $a$, $b$ not both zero.
If $a=0$, then a simple change of variables gives what you are looking for; if $a\neq 0$, divide both sides by $a$, so that it becomes $$[x,y/a]=x+by/a.$$
Now set $z=x+by/a$. Then, since $[y,y]=0$,
$$[z-by/a,\,y/a]=[z,y/a]=z,$$ and changing $y/a$ to $u$ gives $[z,u]=z$.
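As a sanity check, this change of variables can be verified numerically. The sketch below (my own illustration, not part of the argument) represents elements as coordinate vectors in the basis $(x,y)$ and implements the bracket by bilinearity and antisymmetry from $[x,y]=ax+by$:

```python
import numpy as np

def bracket(v, w, a, b):
    """Bracket on the 2-dim algebra with [x, y] = a*x + b*y.

    Elements are coordinate vectors (alpha, beta) = alpha*x + beta*y.
    By bilinearity and antisymmetry, [v, w] = (v_x*w_y - v_y*w_x)*[x, y].
    """
    coeff = v[0] * w[1] - v[1] * w[0]
    return coeff * np.array([a, b])

a, b = 2.0, -3.0            # any a != 0 works here
z = np.array([1.0, b / a])  # z = x + (b/a)*y
u = np.array([0.0, 1 / a])  # u = y/a

assert np.allclose(bracket(z, u, a, b), z)  # [z, u] = z, as claimed
```

The specific values of $a$ and $b$ are arbitrary; any choice with $a\neq 0$ gives the same conclusion.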
(1) Yes, this is correct. One can cut the amount of work by $2/3$ if one observes that the Jacobi identity is skew-symmetric in $i, j, k$. If we denote the full alternation over these indices by $\textrm{Alt}$, the Jacobi identity is $$0 = \textrm{Alt} \,[x_i, [x_j, x_k]] ,$$
which gives
$$0 = \textrm{Alt} (a^l_{ij} a^m_{kl}) ;$$
optionally, we can write out the full alternation (and use antisymmetry of the lower indices of structure constants) to recover your formula.
(2) This is the strongest result possible, in the sense that (again, given the antisymmetry in the lower indices of the structure constants) this is equivalent to the Jacobi identity for any (equivalently, every) choice $(x_i)$ of basis.
(3) There isn't exactly an analogue of those statements---in both of those cases, one fixes the upper index $l$ of $a^l_{ij}$ and interprets the entries $a^l_{ij}$ as components of a matrix $A$, but in the Jacobi identity condition one contracts with this index, so it cannot be fixed in any sense.
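To see the $\textrm{Alt}$ condition in action, here is a small numerical sketch (my own illustration): take the structure constants of $\mathfrak{so}(3)$, namely $a^l_{ij} = \epsilon_{ijl}$ (the Levi-Civita symbol), and check that the full alternation of $a^l_{ij} a^m_{kl}$ over $(i,j,k)$ vanishes:

```python
import itertools
import numpy as np

def parity(p):
    """Sign of a permutation p of 0..n-1, via inversion count."""
    sign = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                sign = -sign
    return sign

# Levi-Civita symbol eps[i, j, l] = epsilon_{ijl}
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

# Structure constants of so(3): a[l, i, j] = epsilon_{ijl}
a = np.einsum('ijl->lij', eps)

# T[m, i, j, k] = sum_l a^l_{ij} a^m_{kl}
T = np.einsum('lij,mkl->mijk', a, a)

# Full alternation over the three indices (i, j, k)
alt_T = sum(parity(p) * np.transpose(T, (0,) + tuple(ax + 1 for ax in p))
            for p in itertools.permutations(range(3)))
assert np.allclose(alt_T, 0)  # Alt(a^l_{ij} a^m_{kl}) = 0
```

Since the lower indices of $a^l_{ij}$ are antisymmetric, the alternation is just twice the cyclic sum, so this is exactly the Jacobi identity.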
Here are two ways we might see this a little more concretely, however:
(1) Using antisymmetry, we can rearrange the Jacobi identity as
$$[x, [y, z]] = [[x, y], z] + [y, [x, z]] .$$ Now, if we define for $x \in L$ the operator $\textrm{ad}_x : L \to L$, $y \mapsto [x, y]$, we can rewrite this as
$$\textrm{ad}_x[y, z] = [\textrm{ad}_x(y), z] + [y, \textrm{ad}_x(z)] .$$ In other words, the Jacobi identity is just a Leibniz (product) rule for the operator $\textrm{ad}_x$.
(2) In the special case $L = \mathfrak{so}(3, \Bbb R) \cong \mathfrak{su}(2)$, we may identify the Lie bracket with the usual cross product $\times$ on $\Bbb R^3$. Then, the Jacobi identity translates to the possibly familiar identity
$${\bf x} \times ({\bf y} \times {\bf z}) + {\bf y} \times ({\bf z} \times {\bf x}) + {\bf z} \times ({\bf x} \times {\bf y}) = 0.$$
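Both points can be checked numerically. Here is a short sketch (my own illustration) using numpy's cross product as the bracket on $\Bbb R^3$:

```python
import numpy as np

rng = np.random.default_rng(0)
x, y, z = (rng.standard_normal(3) for _ in range(3))

# (2) Jacobi identity for the cross product on R^3
jacobi = (np.cross(x, np.cross(y, z))
          + np.cross(y, np.cross(z, x))
          + np.cross(z, np.cross(x, y)))
assert np.allclose(jacobi, 0)

# (1) Equivalently, ad_x = x x (-) satisfies the Leibniz rule
ad_x = lambda v: np.cross(x, v)
assert np.allclose(ad_x(np.cross(y, z)),
                   np.cross(ad_x(y), z) + np.cross(y, ad_x(z)))
```

Random vectors suffice here because both sides are polynomial identities in the coordinates.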
Best Answer
Given a vector space basis $(e_1,\ldots ,e_n)$ and Lie brackets $$ [e_i,e_j]=\sum_{r=1}^n c_{ij}^r e_r, $$ the Jacobi identity is equivalent to the system of polynomial equations $$ \sum_{r=1}^n (c_{ij}^r c_{kr}^s+c_{jk}^r c_{ir}^s+c_{ki}^r c_{jr}^s)=0 $$ for all $1\le i<j<k\le n,\; 1\le s\le n$. In general, we have to verify all these equations. Of course, there are many results where you already know that the Jacobi identity must hold, e.g., if you construct your Lie algebra as a semidirect product of two other Lie algebras, or if your Lie brackets are given by the commutator $[A,B]=AB-BA$ of matrices, which always satisfies the Jacobi identity.
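As an illustration of what "verify all these equations" looks like in practice (my own sketch, not part of the answer), here is the brute-force check for $\mathfrak{sl}(2)$ with basis $(e, f, h)$ and brackets $[e,f]=h$, $[h,e]=2e$, $[h,f]=-2f$:

```python
import itertools

n = 3
# Structure constants of sl(2) in the basis (e, f, h) = (x_1, x_2, x_3):
# [e, f] = h, [h, e] = 2e, [h, f] = -2f
c = [[[0] * n for _ in range(n)] for _ in range(n)]  # c[i][j][r] = c_{ij}^r
c[0][1][2], c[1][0][2] = 1, -1
c[2][0][0], c[0][2][0] = 2, -2
c[2][1][1], c[1][2][1] = -2, 2

# Check sum_r (c_{ij}^r c_{kr}^s + c_{jk}^r c_{ir}^s + c_{ki}^r c_{jr}^s) = 0
# for all 1 <= i < j < k <= n and 1 <= s <= n
for i, j, k in itertools.combinations(range(n), 3):
    for s in range(n):
        total = sum(c[i][j][r] * c[k][r][s]
                    + c[j][k][r] * c[i][r][s]
                    + c[k][i][r] * c[j][r][s] for r in range(n))
        assert total == 0
```

For $n=3$ there is only one triple $i<j<k$, so this amounts to three scalar equations; the number of equations grows like $\binom{n}{3}\cdot n$ in general.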