The identity you proved $$\det(\lambda \cdot I - B) = \det(S^{-1}(\lambda I - A)S) = \det(\lambda \cdot I - A)$$ is valid as an identity of formal polynomials, not just an equality for all values of $\lambda$.
This is because both sides are polynomials in all of the variables involved, with integer coefficients, and you have already proved the identity over a field of characteristic zero. That forces the two sides to be identical as formal polynomials. But then both sides expand to the same formal polynomial over any commutative ring, because the operations involved in the expansion (taking the determinant, then expanding via the commutative, associative, and distributive properties) are valid in every commutative ring. The general principle is sometimes called the "principle of permanence of identities" (one reference is Artin's book Algebra, Section 12.3, pp. 456-457):
To prove that an identity of formal polynomials with integer coefficients (in any number of variables) holds in every commutative ring, it is sufficient to prove it in a single field of characteristic zero.
("Characteristic zero" is important, because I can prove $2 = 0$ in a field of characteristic $2$, but that's not valid in every commutative ring.)
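To make the principle concrete, here is a quick machine check of the characteristic-polynomial identity from the top, done with SymPy (a sketch, not part of the proof; the particular $S$ below is an arbitrary matrix of determinant $1$ that I chose for illustration):

```python
import sympy as sp

lam, a, b, c, d = sp.symbols('lambda a b c d')

A = sp.Matrix([[a, b], [c, d]])  # generic 2x2 matrix with indeterminate entries
S = sp.Matrix([[1, 2], [3, 7]])  # det(S) = 1, so S is invertible over any commutative ring
B = S.inv() * A * S              # a matrix similar to A

# Both characteristic polynomials expand to the same formal polynomial
pA = sp.expand((lam * sp.eye(2) - A).det())
pB = sp.expand((lam * sp.eye(2) - B).det())
assert sp.expand(pA - pB) == 0
```

Since $\det(S) = 1$, the entries of $B$ are again integer-coefficient polynomials in $a, b, c, d$, and both determinants expand to $\lambda^2 - (a+d)\lambda + (ad - bc)$.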
Here's an example of this type of argument for a simpler problem:
Prove that for $2 \times 2$ matrices $A$ and $B$ over a commutative ring $R$, $\det(A) \det(B) = \det(AB)$.
Written out in general form, the identity we want to prove is (the vertical bars denote determinant): $$\begin{vmatrix} a & b \\ c & d \end{vmatrix} \cdot \begin{vmatrix} e & f \\ g & h \end{vmatrix} = \begin{vmatrix} ae+bg & af+bh \\ ce+dg & cf+dh \end{vmatrix}$$
Expanding the determinants gives the following form of the identity that we want to prove:
$$ (ad-bc)(eh-fg) = (ae+bg)(cf+dh) - (ce+dg)(af+bh) \tag{*}$$
Here we are treating $a,b,c,d,e,f,g,h$ as indeterminates, not as elements of $R$. It is clear that both sides of the proposed identity are polynomials with integer coefficients in $a,b,c,d,e,f,g,h$. So far, nothing has depended on $R$, because the matrix multiplication and determinant operations are the same no matter what (commutative) ring we're working in.
Now imagine we don't know anything else about matrices or determinants, and we're asked to prove identity (*). How would we do it? The obvious thing is to expand both sides of (*) and collect like terms. If we get the same expanded form on both sides, then we would have proven the identity. Furthermore, this proof would be valid over any commutative ring $R$, because the process of expanding and collecting terms is valid over any commutative ring. It only involves the commutative, associative, and distributive properties, and integer arithmetic, all of which are valid over every commutative ring.
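That expand-and-compare computation can also be carried out mechanically; here is a sketch with SymPy (the library use is illustrative, not part of the argument):

```python
import sympy as sp

a, b, c, d, e, f, g, h = sp.symbols('a b c d e f g h')

# The two sides of identity (*)
lhs = (a*d - b*c) * (e*h - f*g)
rhs = (a*e + b*g) * (c*f + d*h) - (c*e + d*g) * (a*f + b*h)

# Expanding and collecting like terms yields the same formal polynomial,
# so the identity holds over every commutative ring.
assert sp.expand(lhs) == sp.expand(rhs)
```

Both sides expand to $adeh - adfg - bceh + bcfg$, using only the ring axioms and integer arithmetic.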
Next, imagine that we don't want to do all this expansion, so we find some other method of proving the identity, but our proof is valid only over one particular field of characteristic zero (say, the complex numbers, where we could use something like topology, which wouldn't be valid over a general $R$). Having proven the identity over $\mathbb{C}$, we go back and ask what would happen if we expanded both sides of (*). Must we get the same polynomial on both sides? We must: if the two sides expanded to different polynomials, their difference would be a nonzero polynomial, and a nonzero polynomial over $\mathbb{C}$ (or any infinite field) cannot vanish at every point, so there would have to be some complex numbers we could substitute for $a,b,\ldots,h$ that produce different values on the two sides. That would contradict the fact that we've proven the identity over $\mathbb{C}$. So when we expand both sides, the polynomials really are the same on both sides, and as we concluded previously, the identity is valid over any commutative ring $R$.
So it suffices to check the identity over a single field of characteristic 0.
Best Answer
Nice question! In many cases the distinction is irrelevant, but in some cases it matters. And when it matters, you are not right: it is a polynomial, not a polynomial function. For instance, polynomials have degrees, whereas polynomial functions don't: over $\mathbb F_2$ the polynomial function $x\mapsto x^2+x$ is the null function, but the polynomial $x^2+x$ still has degree $2$ (while the null polynomial has no degree; by convention it is sometimes assigned degree $-\infty$). And the degree of the characteristic polynomial of an $n\times n$ matrix is $n$.
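To see the polynomial-versus-polynomial-function distinction concretely, here is a small sketch in plain Python (the coefficient-list representation and helper names are my own, chosen for illustration):

```python
# x^2 + x over F_2, as a coefficient list (constant term first): 0 + 1*x + 1*x^2
coeffs = [0, 1, 1]

def degree(cs):
    """Degree of the formal polynomial: index of the last coefficient
    that is nonzero mod 2. Returns None for the null polynomial."""
    nonzero = [i for i, c in enumerate(cs) if c % 2 != 0]
    return nonzero[-1] if nonzero else None

def evaluate(cs, x):
    """Evaluate the polynomial at x, working in F_2."""
    return sum(c * x**i for i, c in enumerate(cs)) % 2

# As a function on F_2 = {0, 1}, x^2 + x is the null function...
assert evaluate(coeffs, 0) == 0 and evaluate(coeffs, 1) == 0
# ...but as a formal polynomial it still has degree 2,
# and only the null polynomial itself lacks a degree.
assert degree(coeffs) == 2 and degree([0]) is None
```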