Let $K$ be a field. For $A \in \mathrm{M}(m \times n, K)$ let
$$
R_i(A) = \sum_{j=1}^n A_{ij} \quad \text{for every $1 \leq i \leq m$}
$$
and
$$
C_j(A) = \sum_{i=1}^m A_{ij} \quad \text{for every $1 \leq j \leq n$},
$$
and set
$$
V_{m,n}(K) = \{A \in \mathrm{M}(m \times n, K) \mid R_1(A) = \dotsb = R_m(A) = C_1(A) = \dotsb = C_n(A) = 0\}.
$$
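As a quick numerical sanity check (numpy, over $K = \mathbb{R}$; the helper name `in_V` is ours), membership in $V_{m,n}(K)$ just means all row and column sums vanish:

```python
import numpy as np

def in_V(A, tol=1e-12):
    """Check that every row sum and every column sum of A vanishes."""
    return bool(np.all(np.abs(A.sum(axis=1)) < tol)
                and np.all(np.abs(A.sum(axis=0)) < tol))

# A 2x3 example where each row and each column sums to zero.
A = np.array([[ 1.0,  2.0, -3.0],
              [-1.0, -2.0,  3.0]])
print(in_V(A))  # True
```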
We show that the map
\begin{align*}
\Phi \colon V_{m,n}(K) &\to \mathrm{M}((m-1) \times (n-1), K), \\
(a_{ij})_{1 \leq i \leq m, 1 \leq j \leq n} &\mapsto (a_{ij})_{1 \leq i \leq m-1, 1 \leq j \leq n-1}
\end{align*}
is an isomorphism; it is clearly linear.
First surjectivity: Let $A = (a_{ij})_{1 \leq i \leq m-1, 1 \leq j \leq n-1} \in \mathrm{M}((m-1) \times (n-1), K)$. For every $1 \leq i \leq m-1$ let $a_{in} = -R_i(A)$ and for every $1 \leq j \leq n-1$ let $a_{mj} = -C_j(A)$. Also let
$$
a_{mn}
= \sum_{\substack{1 \leq i \leq m-1 \\ 1 \leq j \leq n-1}} a_{ij}.
$$
For $\hat{A} = (a_{ij})_{1 \leq i \leq m, 1 \leq j \leq n} \in \mathrm{M}(m \times n, K)$ we have that
$$
R_i(\hat{A}) = \sum_{j=1}^n a_{ij} = R_i(A) + a_{in} = 0
\quad \text{for every $1 \leq i \leq m-1$}
$$
as well as
\begin{align*}
R_m(\hat{A})
&= \sum_{j=1}^n a_{mj}
= \sum_{j=1}^{n-1} a_{mj} + a_{mn} \\
&= -\sum_{j=1}^{n-1} C_j(A) + a_{mn}
= -\sum_{j=1}^{n-1} \sum_{i=1}^{m-1} a_{ij} + \sum_{\substack{1 \leq i \leq m-1 \\ 1 \leq j \leq n-1}} a_{ij}
= 0.
\end{align*}
So all row sums of $\hat{A}$ are zero. Similarly we find that all column sums of $\hat{A}$ are zero. So $\hat{A} \in V_{m,n}(K)$. Because $\Phi(\hat{A}) = A$, this shows the surjectivity of $\Phi$.
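The completion used in the surjectivity argument is explicit enough to implement. A minimal numpy sketch (the helper name `complete` is ours, working over $K = \mathbb{R}$):

```python
import numpy as np

def complete(A):
    """Given an (m-1)x(n-1) matrix A, build the m x n matrix A_hat whose
    upper-left block is A and whose row and column sums all vanish,
    following the construction in the surjectivity proof."""
    A = np.asarray(A, dtype=float)
    last_col = -A.sum(axis=1)          # a_{in} = -R_i(A)
    last_row = -A.sum(axis=0)          # a_{mj} = -C_j(A)
    corner = A.sum()                   # a_{mn} = sum of all entries of A
    top = np.hstack([A, last_col[:, None]])
    bottom = np.append(last_row, corner)
    return np.vstack([top, bottom])

A = np.array([[1.0, 2.0], [3.0, 4.0]])
A_hat = complete(A)
print(A_hat.sum(axis=1), A_hat.sum(axis=0))  # all zeros
```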
For the injectivity we argue the other way around: for every $A \in V_{m,n}(K)$ we have $A_{in} = -R_i(\Phi(A))$ for every $1 \leq i \leq m-1$ and $A_{mj} = -C_j(\Phi(A))$ for every $1 \leq j \leq n-1$, as well as
$$
A_{mn}
= -\sum_{j=1}^{n-1} A_{mj}
= \sum_{j=1}^{n-1} C_j(\Phi(A)).
$$
So $A$ is uniquely determined by $\Phi(A)$, showing that $\Phi$ is injective.
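In particular, the isomorphism gives $\dim V_{m,n}(K) = (m-1)(n-1)$. A numerical cross-check (numpy, over $\mathbb{R}$): stack the $m$ row-sum and $n$ column-sum constraints as a linear system on the row-major flattening of $A$ and compare its nullity with the predicted dimension.

```python
import numpy as np

m, n = 4, 5
# Row i of A occupies positions i*n .. i*n+n-1 in the row-major vec.
row_constraints = np.kron(np.eye(m), np.ones((1, n)))   # the m row sums
col_constraints = np.kron(np.ones((1, m)), np.eye(n))   # the n column sums
M = np.vstack([row_constraints, col_constraints])
nullity = m * n - np.linalg.matrix_rank(M)
print(nullity, (m - 1) * (n - 1))  # both 12
```

The rank of $M$ is $m + n - 1$ (the sum of the row constraints equals the sum of the column constraints), which is exactly why the nullity comes out to $(m-1)(n-1)$.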
Best Answer
Assume $B$ is in Jordan normal form (real; the complex case is left as an exercise). Assume the block $J_i = \lambda_i I + S$ is $n_i \times n_i$ for $1\le i \le p$, where $I$ is the identity and $S$ the shift matrix of suitable size. Then the space of matrices that commute with $B$ has dimension $$ \sum_{\substack{i,j=1 \\ \lambda_i = \lambda_j}}^p \min(n_i, n_j). $$
(I hope I found all cases.)
Notes and proof (outline):
For $k\in\mathbb N$ let $I_k$ denote the $k\times k$ identity matrix and let $S_k$ denote the $k\times k$ shift matrix.
Notice that the operator $C_B(A) = AB - BA$ can be written as a matrix-vector product using the Kronecker product and the vectorization operator $\operatorname{vec}$: $$ \operatorname{vec} C_B(A) = ((B^T \otimes I) - (I \otimes B)) \operatorname{vec}(A). $$ So, for an explicitly given $B$, the rank can be computed more easily.
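This vectorization identity is easy to test numerically. Note that the usual $\operatorname{vec}$ stacks columns, which corresponds to Fortran-order flattening in numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Column-stacking vectorization (Fortran order matches the usual vec).
vec = lambda X: X.flatten(order="F")

lhs = vec(A @ B - B @ A)
rhs = (np.kron(B.T, np.eye(n)) - np.kron(np.eye(n), B)) @ vec(A)
print(np.allclose(lhs, rhs))  # True
```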
Also notice that for $B = QJQ^{-1}$, where $Q$ is non-singular, we have $$ \ker C_B = Q\ker C_J Q^{-1}. $$ So we may assume Jordan normal form w.l.o.g.
Let $A\in\mathbb R^{m\times n}$. In the case $m \le n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} 0 & \bar A \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_m, S_m^2, \dotsc, S_m^{m-1} \}. $$ Similarly, in the case $m \ge n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} \bar A \\ 0 \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$
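Either way, the solution space of $S_m A = A S_n$ has dimension $\min(m, n)$. A small numerical check via the Kronecker form of the operator $A \mapsto A S_n - S_m A$ (the helper name `shift` is ours):

```python
import numpy as np

def shift(k):
    """k x k shift matrix: ones on the first superdiagonal."""
    return np.eye(k, k=1)

# Nullity of vec(A) -> vec(A S_n - S_m A) should equal min(m, n).
m, n = 3, 5
T = np.kron(shift(n).T, np.eye(m)) - np.kron(np.eye(n), shift(m))
nullity = m * n - np.linalg.matrix_rank(T)
print(nullity, min(m, n))  # both 3
```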
For $B = \lambda I_n + S_n\in\mathbb R^{n\times n}$, where $\lambda\in\mathbb R$, we have $$ \ker C_B = \ker C_S = \operatorname{span}\{ I_n, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$
Let $A = (A_{i,j})_{1\le i,j \le p}$ be a block matrix and $B = \operatorname{diag}(B_1,\dotsc, B_p)$ a block diagonal matrix w.r.t. the same partition, with $B_i = \lambda_i I_{n_i} + S_{n_i}$. Then we have $C_B(A) = 0$ if and only if $$ 0 = A_{i,j} B_j - B_i A_{i,j} = (\lambda_j - \lambda_i) A_{i,j} + (A_{i,j} S_{n_j} - S_{n_i} A_{i,j}) $$ for $1 \le i, j \le p$. That is, if $A_{i,j}\ne 0$, then $-(\lambda_j - \lambda_i)$ is an eigenvalue of the operator $X\mapsto C_{S_{n_j}, S_{n_i}}(X) := XS_{n_j} - S_{n_i} X$. However, that operator has only the eigenvalue $0$. Hence $A_{i,j} = 0$ whenever $\lambda_i \ne \lambda_j$, and for $\lambda_i = \lambda_j$ the previous step shows that the solution space of $S_{n_i} A_{i,j} = A_{i,j} S_{n_j}$ has dimension $\min(n_i, n_j)$, which gives the dimension formula above.
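The claimed dimension formula can be checked end to end on a concrete Jordan matrix (numpy; the block data below is an arbitrary example of ours, with two blocks sharing the eigenvalue $2$):

```python
import numpy as np
from itertools import product

def jordan_block(lam, k):
    """k x k Jordan block with eigenvalue lam."""
    return lam * np.eye(k) + np.eye(k, k=1)

# Blocks (lambda_i, n_i); two blocks share the eigenvalue 2.
blocks = [(2.0, 2), (2.0, 3), (5.0, 1)]
N = sum(k for _, k in blocks)
B = np.zeros((N, N))
pos = 0
for lam, k in blocks:
    B[pos:pos + k, pos:pos + k] = jordan_block(lam, k)
    pos += k

# Nullity of vec(A) -> vec(AB - BA) vs. the formula sum min(n_i, n_j)
# over pairs of blocks with equal eigenvalues.
T = np.kron(B.T, np.eye(N)) - np.kron(np.eye(N), B)
nullity = N * N - np.linalg.matrix_rank(T)
formula = sum(min(ni, nj)
              for (li, ni), (lj, nj) in product(blocks, repeat=2)
              if li == lj)
print(nullity, formula)  # both 10
```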