Define the vector $y=Ls-\theta$. Then, using this vector and the Frobenius product $A:B={\rm tr}(A^TB)$, rewrite the function and find its differential
$$\eqalign{
f &= D:yy^T + \lambda L:L \cr
df &= D:2\,{\rm sym}(dy\,y^T) + 2\,\lambda L:dL \cr
&= 2\,D:dy\,y^T + 2\,\lambda L:dL \cr
&= 2\,Dy:dy + 2\,\lambda L:dL \cr
&= 2\,Dy:dL\,s + 2\,\lambda L:dL \cr
&= 2\,(Dys^T + \lambda L):dL \cr
}$$
Since $df=\frac{\partial f}{\partial L}:dL,\,$ the gradient is
$$\eqalign{
\frac{\partial f}{\partial L} &= 2\,(Dys^T + \lambda L) \cr
}$$
Setting the gradient to zero yields
$$\eqalign{
\lambda L &= D(\theta-Ls)s^T \cr
D\theta s^T &= \lambda L+DLss^T \cr
}$$
At this point, we can't solve for $L$ directly because it is sandwiched between two matrices. So we use the Kronecker-vec trick to solve for $L$ in vectorized form.
Applying vec to both sides of the above equation yields
$$\eqalign{
(\lambda I + ss^T\otimes D)\,{\rm vec}(L) &= {\rm vec}(D\theta s^T)\cr
}$$
This is an ordinary matrix-vector equation which can be solved for ${\rm vec}(L)$. The matrix $L$ can be recovered by un-stacking the columns of the solution vector, since ${\rm vec}$ stacks the columns of its argument.
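As a quick numerical sanity check, the sketch below (NumPy; random placeholder data, with $D$ taken symmetric positive semidefinite since the ${\rm sym}$ step in the differential uses the symmetry of $D$) solves the vectorized equation and confirms that the resulting $L$ makes the gradient vanish:

```python
import numpy as np

rng = np.random.default_rng(0)
K, lam = 4, 0.5
D = rng.standard_normal((K, K))
D = D @ D.T                      # symmetric PSD, as the derivation assumes
theta = rng.standard_normal((K, 1))
s = rng.standard_normal((K, 1))

# Solve (lam*I + ss^T (x) D) vec(L) = vec(D theta s^T).
# With column-stacking vec, vec(A X B) = kron(B^T, A) vec(X),
# so vec(D L ss^T) = kron(ss^T, D) vec(L)  (ss^T is symmetric).
A = lam * np.eye(K * K) + np.kron(s @ s.T, D)
b = (D @ theta @ s.T).flatten(order="F")      # column-major vec
L = np.linalg.solve(A, b).reshape((K, K), order="F")

# The gradient 2*(D y s^T + lam*L), with y = L s - theta, should vanish.
y = L @ s - theta
grad = 2 * (D @ y @ s.T + lam * L)
print(np.abs(grad).max())   # ~ 0 up to floating point
```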
Short answer: the derivative of $Q\otimes Q\otimes Q\otimes Q$ with respect to $Q$ is a mess, at first sight...
Let's start simple. Let $Q$ be a $K\times K$ matrix with entries $Q_{ij}$ and let $E^{ab}$ be the $K\times K$ matrix with all $0$ entries, except the entry $(a,b)$ which is $1$; in other words, $(E^{ab})_{ij} = \delta_a^i\delta_b^j$.
Then I claim that
$$
\frac{\partial(Q\otimes Q)}{\partial Q_{ij}} = E^{ij}\otimes Q+Q\otimes E^{ij} .
$$
I leave it to you to see why, because trying to write out the involved matrices will probably crash the entire Stack Exchange network...
Jokes aside, this is really immediate to see: just write $Q\otimes Q$ as in the first formula of the definition and think about which entries are affected by $Q_{ij}$. There is the entire $(i,j)$th block, which gives you $E^{ij}\otimes Q$, but there is also the $(i,j)$th entry in each block, which gives you $Q\otimes E^{ij}$.
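You can also confirm the claim numerically with a finite-difference check (a NumPy sketch; `K`, `i`, `j` and the random `Q` are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
K = 3
Q = rng.standard_normal((K, K))
i, j, eps = 1, 2, 1e-6

E = np.zeros((K, K))
E[i, j] = 1.0               # the single-entry matrix E^{ij}

# Finite-difference derivative of Q (x) Q with respect to Q_ij
Qp = Q + eps * E
fd = (np.kron(Qp, Qp) - np.kron(Q, Q)) / eps

# Claimed analytic derivative: E^{ij} (x) Q + Q (x) E^{ij}
analytic = np.kron(E, Q) + np.kron(Q, E)
print(np.abs(fd - analytic).max())   # small, on the order of eps
```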
Now, if $A$ and $B$ are matrices which are functions of $Q$, by the same reasoning you get
$$
\frac{\partial(A\otimes B)}{\partial Q_{ij}} = \frac{\partial A}{\partial Q_{ij}}\otimes B + A\otimes \frac{\partial B}{\partial Q_{ij}} .
$$
So you can iterate for instance
$$
\begin{split}
\frac{\partial (Q\otimes Q\otimes Q)}{\partial Q_{ij}}
&= \frac{\partial(Q\otimes Q)}{\partial Q_{ij}}\otimes Q
+ (Q\otimes Q)\otimes \frac{\partial Q}{\partial Q_{ij}} \\
&= (E^{ij}\otimes Q+Q\otimes E^{ij})\otimes Q + (Q\otimes Q)\otimes E^{ij} \\
&= E^{ij}\otimes Q\otimes Q + Q\otimes E^{ij}\otimes Q + Q\otimes Q\otimes E^{ij}.
\end{split}
$$
Now you can prove by induction that
$$
\frac{\partial \bigl(\bigotimes_{n=1}^N Q\bigr)}{\partial Q_{ij}}
= \sum_{n=1}^N \left(\bigotimes_{h=1}^{n-1} Q\right) \otimes E^{ij} \otimes \left(\bigotimes_{h=n+1}^{N} Q\right).
$$
Written more concisely,
$$
\frac{\partial Q^{\otimes N}}{\partial Q_{ij}}
= \sum_{n=1}^N Q^{\otimes (n-1)}\otimes E^{ij} \otimes Q^{\otimes (N-n)} .
$$
Best Answer
Yes, I think this is true. It of course depends on how you order your basis of $\Bbb{R}^n \otimes \Bbb{R}^n$ to identify it with $\Bbb{R}^{n^2}$, and likewise, how you choose a basis of $\Bbb{R}^{n^2} \otimes \Bbb{R}^n$ to identify it with $n^2$-by-$n$ matrices.
I will assume that if $e_1,\dots,e_n$ is a basis of $\Bbb{R}^n$, then you are ordering your basis of $\Bbb{R}^n \otimes \Bbb{R}^n$ as $e_1 \otimes e_1$, $e_1 \otimes e_2$, $\dots$, $e_1 \otimes e_n$, $e_2 \otimes e_1, \dots, e_n \otimes e_1, \dots, e_n \otimes e_n$. So when you write $D(f \otimes g)$ as a matrix, I will assume the rows are ordered according to this basis.
So the entry of $D(f \otimes g)$ in the row indexed by $e_i \otimes e_j$ and column indexed by $e_k$ will be $$ \frac{\partial f_i}{\partial x_k} g_j + f_i \frac{\partial g_j}{\partial x_k} $$ Every entry of the matrix looks like this, so $D(f \otimes g)$ is the sum of two matrices. Look for now just at the left term $\frac{\partial f_i}{\partial x_k} g_j$. You can see these are the entries of $D(f) \otimes g$. Similarly, the right term gives the entries of $f \otimes D(g)$.
As an example when $n=2$, let $f(x,y) = (f_1(x,y), f_2(x,y))$, and similarly for $g$. Then
$$ D(f) \otimes g = \left( \begin{array}{cc} \frac{\partial f_1}{\partial x} g_1 & \frac{\partial f_1}{\partial y} g_1 \\ \frac{\partial f_1}{\partial x} g_2 & \frac{\partial f_1}{\partial y} g_2 \\ \frac{\partial f_2}{\partial x} g_1 & \frac{\partial f_2}{\partial y} g_1 \\ \frac{\partial f_2}{\partial x} g_2 & \frac{\partial f_2}{\partial y} g_2 \end{array} \right) $$
And similarly, $$ f \otimes D(g) = \left( \begin{array}{cc} f_1 \frac{\partial g_1}{\partial x} & f_1 \frac{\partial g_1}{\partial y} \\ f_1 \frac{\partial g_2}{\partial x} & f_1 \frac{\partial g_2}{\partial y} \\ f_2 \frac{\partial g_1}{\partial x} & f_2 \frac{\partial g_1}{\partial y} \\ f_2 \frac{\partial g_2}{\partial x} & f_2 \frac{\partial g_2}{\partial y} \end{array} \right) $$
If you add them together, you get the matrix for $D(f \otimes g)$ if you order the coordinates so that $f \otimes g = (f_1g_1, f_1g_2, f_2g_1, f_2g_2)$.
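The product rule $D(f\otimes g)=D(f)\otimes g+f\otimes D(g)$ can be spot-checked numerically. In the sketch below (NumPy), `f` and `g` are hypothetical example maps on $\Bbb{R}^2$ chosen only for illustration; with column vectors for $f$ and $g$, `np.kron` reproduces exactly the $4\times 2$ block layouts shown above:

```python
import numpy as np

# Hypothetical example maps R^2 -> R^2, chosen for illustration
def f(v):
    x, y = v
    return np.array([x * y, x + y])

def g(v):
    x, y = v
    return np.array([x**2, y])

def jac(h, v, eps=1e-6):
    """Central finite-difference Jacobian of h at v."""
    cols = []
    for k in range(len(v)):
        e = np.zeros(len(v))
        e[k] = eps
        cols.append((h(v + e) - h(v - e)) / (2 * eps))
    return np.column_stack(cols)

v = np.array([0.7, -1.3])
fg = lambda u: np.kron(f(u), g(u))   # f (x) g = (f1g1, f1g2, f2g1, f2g2)

lhs = jac(fg, v)                     # D(f (x) g), a 4 x 2 matrix
rhs = (np.kron(jac(f, v), g(v).reshape(-1, 1))     # D(f) (x) g
       + np.kron(f(v).reshape(-1, 1), jac(g, v)))  # f (x) D(g)
print(np.abs(lhs - rhs).max())       # ~ 0 up to finite-difference error
```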