Here's another approach. Consider $A \otimes {\bf 1}_m$; we will show that this matrix can always be brought to the block form
$$\left(
\begin{array}{cccc} A & 0 & \cdots & 0 \\
0 & A & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & A \\
\end{array}\right)
$$
To this end, consider the matrix $A$ with components $(a_{ij})$ in some basis $\{u_i\}$ of a vector space $V$ of dimension $n$ over a ring $R$, and the identity ${\bf 1}_m$ on a vector space $W$ of dimension $m$, also over $R$. We will use the basis $\{ u_i \otimes e_a \}$ for the space $V \otimes W$, where $i,j=1,\cdots,n$ and $a,b=1, \cdots, m$. Let us further choose an ordering for this basis, namely
$$
\{u_1 \otimes e_1, u_2\otimes e_1, \cdots ,u_n \otimes e_1, u_1 \otimes e_2, \cdots ,u_n \otimes e_m \}
$$
Let us look at the form of the operator $A \otimes {\bf 1}_m$ in this basis; we shall see that it is precisely the block form given above. Consider the action:
$$
(A \otimes {\bf 1}_m)u_{i}\otimes e_a = Au_i \otimes {\bf 1}_m e_a = \sum_{j,b} a_{ij}\delta_{ab} u_j \otimes e_b
$$
This means that the matrix elements in this basis are $(A \otimes {\bf 1}_m)_{ia,jb} = a_{ij}\delta_{ab}$. This is exactly the block form we are aiming for: notice that this matrix element is nonzero only when $a=b$, that is, along the diagonal of an $m \times m$ array of blocks, and each diagonal block is the matrix $(a_{ij})$, which is the operator $A$ in the basis $\{u_i\}$. Since the determinant is independent of the basis chosen, we may compute it in this one.
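As a quick numerical sanity check (a sketch with arbitrary illustrative matrices), the block-diagonal form above can be realized with NumPy's `kron`: with the $e_a$ index varying slowest, the matrix of $A \otimes {\bf 1}_m$ is `np.kron(np.eye(m), A)`, which is block diagonal with $m$ copies of $A$:

```python
import numpy as np

# Hypothetical small example: a 2x2 matrix A and m = 3.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
m = 3
n = A.shape[0]

# With the e_a index varying slowest, A (x) 1_m is block diagonal.
block_diag = np.kron(np.eye(m), A)

# Each of the m diagonal blocks equals A, and off-diagonal blocks vanish.
for a in range(m):
    assert np.allclose(block_diag[a*n:(a+1)*n, a*n:(a+1)*n], A)

# Its determinant is det(A)^m, as claimed below.
assert np.isclose(np.linalg.det(block_diag), np.linalg.det(A)**m)
```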
Now take the determinant of this block matrix; it is easy to show that it equals $\det(A)^m$. Finally, as you pointed out yourself, write
$$
A \otimes B = (A \otimes {\bf 1})({\bf 1}\otimes B)
$$
and use $\det(MN) = \det M \cdot \det N$. This works over any commutative ring.
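Combining the two factors gives $\det(A\otimes B) = \det(A)^m\,\det(B)^n$ for $A$ of size $n$ and $B$ of size $m$. A small numerical sketch of this identity, with arbitrary random matrices and `np.kron` standing in for $\otimes$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4                        # A is n x n, B is m x m
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))

# det(A (x) B) = det(A)^m * det(B)^n
lhs = np.linalg.det(np.kron(A, B))
rhs = np.linalg.det(A)**m * np.linalg.det(B)**n
assert np.isclose(lhs, rhs)
```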
Yep, we can say things. The two ingredients you need are the following:
Equipped with these things, you can find out more quantitative relationships between the old and new matrices.
Edit: The question has been updated to ask about the relationship between the ratios of the determinants, which by the matrix determinant lemma are the quantities
$$1+v^TA^{-1}v\quad \text{and}\quad 1+v^TB^{-1}v$$
The short general answer is no. Even though we have some relationships between the spectra of $A$ and $B$, we can't in general say much about $v^TA^{-1}v$ compared to $v^TB^{-1}v$, since it may happen that $v$ is well-aligned with an eigenvector of $A^{-1}$ of large eigenvalue, and an eigenvector of $B^{-1}$ of small eigenvalue.
An extreme example meeting all conditions in the question would be to take a positive definite matrix $A$ which has large variation in its spectrum, and make $B$ be a rotated copy of $A$, i.e. obtained by conjugating with an orthogonal transformation. Then it may well happen that for some $v$ the ratio of determinants is huge, while for others it's tiny.
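To make this concrete, here is a hypothetical numerical instance of that extreme example: $A$ positive definite with a widely spread spectrum, $B$ a rotated copy of $A$, and $v$ aligned with the small eigenvalue of $A$ (equivalently, the large eigenvalue of $A^{-1}$):

```python
import numpy as np

# A positive definite with a widely spread spectrum.
A = np.diag([1000.0, 0.001])

# B is a rotated copy of A: conjugation by a 90-degree rotation
# swaps the two eigenvalue directions.
Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])
B = Q @ A @ Q.T                      # = diag(0.001, 1000.0)

v = np.array([0.0, 1.0])             # aligned with A's small eigenvalue

rA = 1 + v @ np.linalg.inv(A) @ v    # ~ 1001
rB = 1 + v @ np.linalg.inv(B) @ v    # ~ 1.001

# The ratio of the two determinant ratios is huge for this v,
# even though A and B have identical spectra.
assert rA / rB > 500
```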
Best Answer
First approach: If $\{\lambda_1,\dots,\lambda_m\}$ are the eigenvalues of $A$ and $\{\mu_1,\dots,\mu_n\}$ those of $B$, then the eigenvalues of $A\otimes B$ are $\lambda_j\cdot\mu_k$, $1\leq j\leq m$, $1\leq k\leq n$.
We assume that the respective dimensions of $A$ and $B$ are $m$ and $n$. If $v$ is an eigenvector of $A$ for $\lambda_j$ and $w$ one of $B$ for $\mu_k$, consider the vector $V$ of size $mn$ defined by $$V=(v_1w_1,\dots,v_1w_n,v_2w_1,\dots,v_2w_n,\dots,v_mw_1,\dots,v_mw_n).$$ It is an eigenvector of $A\otimes B$ for the eigenvalue $\lambda_j\mu_k$. As the matrices $A$ and $B$ are diagonalizable, counting multiplicities we are sure there are no other eigenvalues.
As $A$ and $B$ are positive definite, $\lambda_j\mu_k>0$ for all $j,k$.
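A quick numerical check of this first approach (the positive definite matrices here are built as $MM^t + c\,I$, an arbitrary construction chosen just for illustration; `np.kron` of two vectors produces exactly the componentwise-product vector $V$ described above):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 2

# Random symmetric positive definite A (m x m) and B (n x n).
MA = rng.standard_normal((m, m))
MB = rng.standard_normal((n, n))
A = MA @ MA.T + m * np.eye(m)
B = MB @ MB.T + n * np.eye(n)

lamA, VA = np.linalg.eigh(A)
lamB, VB = np.linalg.eigh(B)

# For each eigenvector pair (v, w), kron(v, w) is an eigenvector
# of A (x) B for the eigenvalue lambda_j * mu_k, which is positive.
for j in range(m):
    for k in range(n):
        V = np.kron(VA[:, j], VB[:, k])
        assert np.allclose(np.kron(A, B) @ V, lamA[j] * lamB[k] * V)
        assert lamA[j] * lamB[k] > 0
```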
Second approach: We use the mixed-product property, that is, $$(A_1A_2)\otimes (B_1B_2)=(A_1\otimes B_1)(A_2\otimes B_2).$$ Applied twice, this gives $$A\otimes B=(P_1^tD_1P_1)\otimes (P_2^tD_2P_2),$$ where the $P_i$ are orthogonal and the $D_i$ diagonal. This gives $$A\otimes B=(P_1\otimes P_2)^t(D_1\otimes D_2)(P_1\otimes P_2),$$ so the problem reduces to the case where $A$ and $B$ are diagonal, which is easy, as the eigenvalues are positive.
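The mixed-product property itself is easy to verify numerically (arbitrary random matrices, `np.kron` standing in for $\otimes$):

```python
import numpy as np

rng = np.random.default_rng(2)
A1, A2 = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
B1, B2 = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))

# Mixed-product property: (A1 A2) (x) (B1 B2) = (A1 (x) B1)(A2 (x) B2)
lhs = np.kron(A1 @ A2, B1 @ B2)
rhs = np.kron(A1, B1) @ np.kron(A2, B2)
assert np.allclose(lhs, rhs)
```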