According to the comments, the only ugly part was the devectorization.
Consider the 3x3 matrix $${\bf X}=\left[\begin{array}{ccc}
1&2&3\\4&5&6\\7&8&9\end{array}\right]$$
Its lexical (column-by-column) vectorization is:
$$\text{vec}({\bf X}) = \left[\begin{array}{ccccccccc}1&4&7&2&5&8&3&6&9\end{array}\right]^T$$
We consider that kind of vectorization here.
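As a quick illustration (using Python/NumPy here rather than the Octave of the snippet below), this vectorization is just a Fortran-order flatten:

```python
import numpy as np

# The 3x3 matrix X from the text.
X = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Column-major (Fortran-order) vectorization: stack the columns of X.
vec_X = X.flatten(order="F")
print(vec_X)  # [1 4 7 2 5 8 3 6 9]
```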
The Matlab / Octave expression:
kron([1,0,0]',eye(3))'*kron(R(:),[1,0,0])+ ...
kron([0,1,0]',eye(3))'*kron(R(:),[0,1,0])+ ...
kron([0,0,1]',eye(3))'*kron(R(:),[0,0,1])-R
evaluates to the zero matrix for several random matrices R, each term "selecting" one column of R.
Then a guess would be that
$${\bf R}=\sum_k({\bf v_k} \otimes {\bf I})^T(\text{vec}({\bf R})\otimes {\bf v_k}^T)$$
With $\bf v_k$ being the Dirac or selector vector:
$$({\bf v_k})_i = \begin{cases}0, & k \neq i\\1, & k=i\end{cases}$$
Feel free to try and simplify it if you want.
First some notation. Denote the trace/Frobenius product with a colon, i.e.
$$A:B = {\rm Tr}(A^TB)$$
a matrix with an uppercase letter, a vector with a lowercase letter, and a scalar with a Greek letter.
For typing convenience, define
the column vectors
$$a = A^T, \qquad c = C^T, \qquad f = F, \qquad x = X$$
and
the matrices
$$H = B^T\big(E\odot af^T\big), \qquad K = H\odot D$$
Rewrite the function in terms of these new variables.
$$\eqalign{
\gamma
&= a^T\big(B(xc^T\odot D)\odot E\big)f \\
&= a:\big(B(xc^T\odot D)\odot E\big)f \\
&= af^T:\big(B(xc^T\odot D)\odot E\big) \\
&= (E\odot af^T):B(xc^T\odot D) \\
&= H:(xc^T\odot D) \\
&= K:xc^T \\
&= Kc:x \\
}$$
Now it's a simple matter to find the differential and gradient.
$$\eqalign{
d\gamma &= Kc:dx \\
\frac{\partial \gamma}{\partial x} &= Kc \\
}$$
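The closed-form gradient $Kc$ can be checked against central finite differences. The sketch below uses NumPy with arbitrary compatible dimensions; since $\gamma$ is linear in $x$, the finite-difference quotient is essentially exact:

```python
import numpy as np

rng = np.random.default_rng(1)
m, p, q = 4, 5, 3
a = rng.standard_normal(m)
B = rng.standard_normal((m, p))
c = rng.standard_normal(q)
D = rng.standard_normal((p, q))
E = rng.standard_normal((m, q))
f = rng.standard_normal(q)
x = rng.standard_normal(p)

def gamma(x):
    # gamma = a^T (B(x c^T ⊙ D) ⊙ E) f
    return a @ ((B @ (np.outer(x, c) * D)) * E) @ f

# Closed-form gradient: K c with H = B^T (E ⊙ a f^T), K = H ⊙ D.
H = B.T @ (E * np.outer(a, f))
K = H * D
grad = K @ c

# Central finite-difference check.
eps = 1e-6
fd = np.array([(gamma(x + eps * np.eye(p)[i]) - gamma(x - eps * np.eye(p)[i])) / (2 * eps)
               for i in range(p)])
print(np.allclose(grad, fd, atol=1e-6))  # True
```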
NB: The properties of the trace allow Frobenius products to be rearranged in a variety of ways.
$$\eqalign{
A:B &= A^T:B^T \\
A:BC &= AC^T:B \;=\; B^TA:C \\
}$$
Also, Hadamard and Frobenius products commute with themselves and each other.
$$\eqalign{
A:B &= B:A \\
A\odot B &= B\odot A \\
C:A\odot B &= C\odot A:B \\
}$$
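These rearrangement rules are easy to spot-check numerically; here is a NumPy sketch (shapes chosen so every product is defined):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B, C = (rng.standard_normal((3, 4)) for _ in range(3))
M = rng.standard_normal((4, 5))   # so that A @ M is defined
N = rng.standard_normal((3, 5))

def frob(P, Q):
    """Frobenius product P:Q = Tr(P^T Q)."""
    return np.trace(P.T @ Q)

assert np.isclose(frob(A, B), frob(A.T, B.T))        # A:B = A^T:B^T
assert np.isclose(frob(A, B), frob(B, A))            # A:B = B:A
assert np.isclose(frob(N, A @ M), frob(N @ M.T, A))  # A:BC = AC^T:B
assert np.isclose(frob(N, A @ M), frob(A.T @ N, M))  # A:BC = B^TA:C
assert np.isclose(frob(C, A * B), frob(C * A, B))    # C:A⊙B = C⊙A:B
print("all identities hold")
```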
Update
There was a question in the comments about the related vector-valued problem
$$y = A\big(B(xc^T\odot D)\odot E\big)f$$
Even for this modified problem, the chain rule remains impractical. The real difficulty with both problems is the presence of the Hadamard products, which make things awkward.
Nonetheless, here is how to calculate the gradient of the modified problem.
First, define some new variables.
$$\eqalign{
C &= {\rm Diag}(c), \quad X = {\rm Diag}(x)\;
\implies\;B(xc^T\odot D) = B(XDC) \\
E &= \sum_k \sigma_ku_kv_k^T \quad {\rm \{SVD\}} \\
W_k &= {\rm Diag}(\sigma_ku_k), \; V_k = {\rm Diag}(v_k) \implies
E\odot Z = \sum_k W_k Z V_k \\
}$$
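Both auxiliary identities, the ${\rm Diag}$ rewrite and the SVD expansion of the Hadamard product, can be verified numerically; a NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 4, 3
x = rng.standard_normal(m)
c = rng.standard_normal(n)
D = rng.standard_normal((m, n))
E = rng.standard_normal((m, n))
Z = rng.standard_normal((m, n))

# Diag rewrite: x c^T ⊙ D = Diag(x) D Diag(c)
assert np.allclose(np.outer(x, c) * D, np.diag(x) @ D @ np.diag(c))

# SVD expansion: E ⊙ Z = sum_k W_k Z V_k
U, s, Vt = np.linalg.svd(E, full_matrices=False)
S = np.zeros((m, n))
for k in range(len(s)):
    Wk = np.diag(s[k] * U[:, k])   # W_k = Diag(sigma_k u_k)
    Vk = np.diag(Vt[k, :])         # V_k = Diag(v_k)
    S += Wk @ Z @ Vk
print(np.allclose(S, E * Z))  # True
```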
Then rewrite the function.
$$\eqalign{
y &= A(E\odot BXDC)\,f \\
&= \sum_k A(W_kBXDCV_k)f \\
&= \sum_k {\rm vec}\Big(AW_kB\quad{\rm Diag}(x)\quad DCV_kf\Big) \\
&= \sum_k {\rm vec}\Big(\alpha_k\,{\rm Diag}(x)\,\beta_k\Big) \\
&= Jx\\
}$$
where the $J$-matrix has the closed-form expression
$$\eqalign{
J &= \sum_k (\beta_k^T\otimes {\tt 1})\odot({\tt 1}\otimes \alpha_k) \\
}$$
Having rewritten the problem in this form, the gradient (i.e. Jacobian) is trivial.
$$\eqalign{
\frac{\partial y}{\partial x} &= J \\
}$$
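Putting the pieces together, a NumPy sketch can confirm that the $J$-matrix built from the $\alpha_k,\beta_k$ reproduces the function (since $y=Jx$ is linear in $x$, it suffices to compare $Jx$ with the original expression):

```python
import numpy as np

rng = np.random.default_rng(4)
r, m, p, n = 2, 4, 5, 3            # A: r×m, B: m×p, x: p, c,f: n, D: p×n, E: m×n
A = rng.standard_normal((r, m))
B = rng.standard_normal((m, p))
c = rng.standard_normal(n)
D = rng.standard_normal((p, n))
E = rng.standard_normal((m, n))
f = rng.standard_normal(n)
x = rng.standard_normal(p)

def y(x):
    # y = A (B(x c^T ⊙ D) ⊙ E) f
    return A @ ((B @ (np.outer(x, c) * D)) * E) @ f

U, s, Vt = np.linalg.svd(E, full_matrices=False)
C = np.diag(c)
ones = np.ones((r, 1))

# J = sum_k (beta_k^T ⊗ 1) ⊙ (1 ⊗ alpha_k)
J = np.zeros((r, p))
for k in range(len(s)):
    Wk = np.diag(s[k] * U[:, k])
    Vk = np.diag(Vt[k, :])
    alpha = A @ Wk @ B                 # alpha_k = A W_k B,   shape r×p
    beta = D @ C @ Vk @ f              # beta_k  = D C V_k f, shape p
    J += np.kron(beta[None, :], ones) * np.kron(np.ones((1, 1)), alpha)

print(np.allclose(J @ x, y(x)))  # True
```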
Best Answer
Let's denote the elementwise/Hadamard and inner/Frobenius products respectively as $$\eqalign{ A &= B\circ C \cr \alpha &= B:C = {\rm tr}(B^TC) \cr }$$ Recall that these products are commutative, and mutually commutative $$\eqalign{ A\circ B &= B\circ A \cr A:B &= B:A \cr A:B\circ C &= A\circ B:C \cr }$$ and that the matrix of all ones is the identity element for the Hadamard product. Note that the matrices $(A,B,C)$ must have the same shape for these products to make sense.
Your scalar function can be written as $$\eqalign{ y &= 1:f \cr &= 1:(Ax)\circ(Bx) \cr &= Ax:Bx \cr }$$ whose differential and gradient are $$\eqalign{ dy &= A\,dx:Bx + Ax:B\,dx \cr &= Bx:A\,dx + Ax:B\,dx \cr &= (A^TB + B^TA)\,x:dx \cr \cr \frac{\partial y}{\partial x} &= (A^TB + B^TA)\,x \cr\cr }$$ (Note that I've used juxtaposition rather than $*$ for the ordinary matrix product.)
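This gradient can likewise be checked with finite differences (a NumPy sketch; the function is quadratic in $x$, so central differences are essentially exact):

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 4, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, n))
x = rng.standard_normal(n)

def y(x):
    # y = 1:(Ax)∘(Bx) = Ax:Bx, i.e. the sum of the elementwise product
    return np.sum((A @ x) * (B @ x))

# Closed-form gradient from the answer above.
grad = (A.T @ B + B.T @ A) @ x

# Central finite-difference check.
eps = 1e-6
fd = np.array([(y(x + eps * np.eye(n)[i]) - y(x - eps * np.eye(n)[i])) / (2 * eps)
               for i in range(n)])
print(np.allclose(grad, fd, atol=1e-5))  # True
```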