I've seen in the literature the notation $C$ with various additional decorations for contraction maps, the amount of decoration depending on the context. See, e.g., A. Gray, Tubes, p. 56, where these maps are used for somewhat special tensors, so the notation is simpler.
In general, there is a whole family of uniquely defined maps
$$
C^{(r,s)}_{p,q} \colon \otimes^{r}_{s} V \to \otimes^{r-1}_{s-1} V
$$
which are collectively called tensor contractions ($1 \le p \le r, 1 \le q \le s$).
These maps are uniquely characterized by making the following diagrams commutative:
$$
\require{AMScd}
\begin{CD}
\times^{r}_{s} V @> {P^{(r,s)}_{p,q}} >> \times^{r-1}_{s-1} V\\
@V{\otimes^{r}_{s}}VV @VV{\otimes^{r-1}_{s-1}}V \\
\otimes^{r}_{s} V @>{C^{(r,s)}_{p,q}}>> \otimes^{r-1}_{s-1} V
\end{CD}
$$
Explanations are in order.
Recall that the tensor products $\otimes^{r}_{s} V$ are equipped with the universal maps
$$
\otimes^{r}_{s} \colon \times^{r}_{s} V \to \otimes^{r}_{s} V
$$
where $\times^{r}_{s} V := ( \times^r V) \times (\times^s V^*)$.
Besides that, there is a canonical pairing $P$ between a vector space $V$ and its dual:
$$
P \colon V \times V^* \to \mathbb{R} \colon (v, \omega) \mapsto \omega(v)
$$
Notice that the map $P$ is bilinear and extends to a family of maps
$$
P^{(r,s)}_{p,q} \colon \times^{r}_{s} V \to \times^{r-1}_{s-1} V
$$
by the formula:
$$
P^{(r,s)}_{p,q} (v_1, \dots, v_p, \dots, v_r, \omega_1, \dots, \omega_q, \dots, \omega_s) = \omega_q (v_p) (v_1, \dots, \widehat{v_p}, \dots, v_r, \omega_1, \dots, \widehat{\omega_q}, \dots, \omega_s)
$$
where a hat means omission, and the scalar $\omega_q(v_p)$ is understood to multiply any one slot of the remaining tuple (after applying $\otimes^{r-1}_{s-1}$, the resulting tensor does not depend on this choice).
Since the composites
$$
\otimes^{r-1}_{s-1} \circ P^{(r,s)}_{p,q} \colon \times^{r}_{s} V \to \otimes^{r-1}_{s-1} V
$$
are multilinear, the universal property of the maps $\otimes^{r}_{s}$ implies that each of them factors uniquely through $\otimes^{r}_{s} V$: there is a unique linear map $C^{(r,s)}_{p,q} \colon \otimes^{r}_{s} V \to \otimes^{r-1}_{s-1} V$ with
$$
C^{(r,s)}_{p,q} \circ \otimes^{r}_{s} = \otimes^{r-1}_{s-1} \circ P^{(r,s)}_{p,q},
$$
which is precisely the commutativity of the diagram above.
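As a concrete illustration (my own sketch, not part of the construction above): in coordinates, contraction is a partial trace, which is easy to check in numpy on decomposable tensors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
v, u = rng.random(n), rng.random(n)   # vectors in V ≅ R^n
w = rng.random(n)                     # components of a covector ω in V*

# (1,1)-tensor: C^{(1,1)}_{1,1} is the ordinary trace, and on the
# decomposable tensor v ⊗ ω it returns the pairing ω(v) = w·v.
T = np.einsum('a,b->ab', v, w)
assert np.isclose(np.trace(T), w @ v)

# (2,1)-tensor S = v ⊗ u ⊗ ω: contracting the second vector slot
# against the covector slot (p = 2, q = 1) gives ω(u)·v, matching
# the formula for P^{(2,1)}_{2,1} followed by the universal map.
S = np.einsum('a,b,c->abc', v, u, w)
assert np.allclose(np.einsum('abb->a', S), (w @ u) * v)
```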
I was able to answer the question for myself well enough. The answer is: yes, for a given tensor $R_{ijkl}$ it is generically possible to find another tensor $L_{ijmn}$ with the property that
$$
L_{ijmn}R_{ijkl} = \delta_{mk}\delta_{nl}
$$
(with summation over the repeated indices $i$ and $j$). The point is that $L_{ijmn}$ has $N^4$ components and the equation above stands for $N^4$ scalar equations, where $N$ is the range of each index. We can therefore solve these $N^4$ linear equations for the $N^4$ components of $L_{ijmn}$, provided the system is non-degenerate.
I implemented some code in Mathematica to convince myself, which I'll post here for those who are curious. I don't know the best way to input code here, so apologies if it doesn't look great.
R = Table[RandomReal[], {i, 1, 6}, {j, 1, 6}, {k, 1, 6}, {l, 1, 6}];
L = Table[a[i, j, k, l], {i, 1, 6}, {j, 1, 6}, {k, 1, 6}, {l, 1, 6}];
K = Table[KroneckerDelta[m, k] KroneckerDelta[n, l],
  {k, 1, 6}, {l, 1, 6}, {m, 1, 6}, {n, 1, 6}];
eqs = Flatten[Table[
    Sum[L[[i, j, m, n]] R[[i, j, k, l]], {i, 1, 6}, {j, 1, 6}] ==
     K[[k, l, m, n]],
    {k, 1, 6}, {l, 1, 6}, {m, 1, 6}, {n, 1, 6}]];
s = Solve[eqs];
Chop[Table[
   Sum[L[[i, j, m, n]] R[[i, j, k, l]], {i, 1, 6}, {j, 1, 6}],
   {k, 1, 6}, {l, 1, 6}, {m, 1, 6}, {n, 1, 6}] /. s[[1]]]
The final line shows the result of contracting $L$ with $R$ after the components of $L$ are replaced by the solution found above. The result is the double Kronecker delta tensor $K$, as desired.
Fortunately for my application, the code runs in a few seconds on my laptop for tensors with $6^4 = 1296$ components.
I'm not sure what to call $L_{ijmn}$. I could call it $R^{-1}_{ijmn}$, but I can also pose another problem:
Find a tensor $G_{ikmj}$ with the property that
$$G_{ikmj}R_{ijkl} = \delta_{ml}$$
This new tensor could similarly be called $R^{-1}_{ikmj}$ and so could a number of other tensors with different combinations and permutations of indices. Therefore, for the time being I'll just continue to use unique names for these inverse tensors and explicitly state the relevant property in terms of the Kronecker deltas.
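This second problem is underdetermined — only $N^2$ equations for $N^4$ unknowns — so generically it has many solutions, and one can be picked out by least squares. A numpy sketch (again my own illustration, with hypothetical small $N$):

```python
import numpy as np

N = 4
rng = np.random.default_rng(1)
R = rng.standard_normal((N, N, N, N))

# For each fixed m, G_{ikmj} R_{ijkl} = δ_{ml} is N equations in the
# N^3 unknowns G[:, :, m, :]. Build A[l,(i,k,j)] = R[i,j,k,l] and solve
# the underdetermined system with lstsq (minimum-norm solution).
A = np.transpose(R, (3, 0, 2, 1)).reshape(N, N ** 3)
G = np.empty((N, N, N, N))
for m in range(N):
    b = np.eye(N)[m]                       # δ_{ml} as a vector in l
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    G[:, :, m, :] = x.reshape(N, N, N)

# Verify the defining property G_{ikmj} R_{ijkl} = δ_{ml}.
assert np.allclose(np.einsum('ikmj,ijkl->ml', G, R), np.eye(N))
```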
Best Answer
First, note that there are two common conventions for the matrix derivative. For example, matrixcalculus.org uses (2); I'll be using (1). So let $F$ be $m \times n$; then:
$$\begin{aligned} \frac{\partial F'F}{\partial F} &= \frac{\partial F'F}{\partial (F', F)} \cdot \frac{\partial(F', F)}{\partial F} \\&= \begin{bmatrix} \mathbb{I}_n \otimes F' & F' \otimes \mathbb{I}_n \end{bmatrix} \cdot \begin{bmatrix} \mathbb{K}_{m,n} \\ \mathbb{I}_{mn} \end{bmatrix} \\&= (\mathbb{I}_n \otimes F') \cdot \mathbb{K}_{m,n} + (F' \otimes \mathbb{I}_n) \cdot \mathbb{I}_{mn} \\&= \mathbb{K}_{n,n} \cdot (F' \otimes \mathbb{I}_n) + \mathbb{I}_{nn} \cdot (F' \otimes \mathbb{I}_n) \\&= (\mathbb{K}_{n,n} + \mathbb{I}_{nn}) \cdot (F' \otimes \mathbb{I}_n) \end{aligned}$$
Here $\mathbb{I}_{mn}$ denotes the $mn \times mn$ identity and $\mathbb{K}_{m,n}$ the commutation matrix, characterized by $\mathbb{K}_{m,n}\operatorname{vec}(A) = \operatorname{vec}(A')$ for $m \times n$ matrices $A$; the second line uses $\frac{\partial F'}{\partial F} = \mathbb{K}_{m,n}$, and the fourth uses the identity $(\mathbb{I}_n \otimes F')\,\mathbb{K}_{m,n} = \mathbb{K}_{n,n}\,(F' \otimes \mathbb{I}_n)$.
In particular, the directional (Gâteaux) derivative is given by:
$$ \partial f(F)\cdot H = (\mathbb{K}_{n,n} + \mathbb{I}_{nn}) \cdot (F' \otimes \mathbb{I}_n) \cdot H = (\mathbb{K}_{n,n} + \mathbb{I}_{nn}) \cdot F'H = H'F + F'H $$
which agrees with the direct way of computing it via $\frac{d f(F+\varepsilon H)}{d\varepsilon}\big|_{\varepsilon=0}$.
But let me explain the details step by step:
Source Code Demo
In Python, using https://github.com/google/jax for automatic Jacobian computation.
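Since the original demo code is not included here, the following is a minimal sketch of what it might look like (the shapes $m=3$, $n=2$ are my choice): forward-mode `jax.jvp` computes the Gâteaux derivative $\partial f(F)\cdot H$ directly, and it matches the closed form $H'F + F'H$ derived above.

```python
import jax
import jax.numpy as jnp

def f(F):
    return F.T @ F  # f(F) = F'F

key1, key2 = jax.random.split(jax.random.PRNGKey(0))
F = jax.random.normal(key1, (3, 2))   # an m x n matrix, m = 3, n = 2
H = jax.random.normal(key2, (3, 2))   # the direction of differentiation

# Forward-mode JVP returns (f(F), Df(F)·H) without forming the Jacobian.
_, dF = jax.jvp(f, (F,), (H,))
assert jnp.allclose(dF, H.T @ F + F.T @ H, atol=1e-5)

# The full Jacobian is also available; its shape is (n, n, m, n),
# i.e. one (m, n) sensitivity block per entry of the (n, n) output.
J = jax.jacfwd(f)(F)
assert J.shape == (2, 2, 3, 2)
```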