Consider the standard Ising model on $[0,N]^2$ for $N$ large. By that I mean the square-lattice Ising model without external field, inside an $N$-by-$N$ square. What is its entropy for $N$ large? It must behave asymptotically as $c(\beta)N^2$ for some constant $c(\beta)$ depending on the inverse temperature $\beta$. What is $c(\beta)$? Has it been computed?
[Math] Entropy of the Ising model
Related Solutions
There is a method for studying these physical systems called belief propagation (BP). It is exact on interaction graphs that are trees, and it empirically works well on loopy interaction graphs that locally look like trees. Randomly generated sparse interaction graphs are of this form (this is mentioned explicitly in the paper), so you could apply belief propagation to your model to obtain good estimates of the following quantities:
- Marginal probabilities $p(\sigma_i)$.
- Joint probabilities $p(\sigma_i, \sigma_j)$, in particular, the correlators you mention.
- Conditional probabilities $p(\sigma_i | \sigma_j)$.
- Partition functions, energies, free energies, entropies.
In addition, BP can be adapted to sample from the joint distribution $p(\sigma)$. In physics, systems like these are sometimes called 'diluted spin glasses'.
I summarise the main features of belief propagation below. Mézard and Montanari's book contains a very pedagogical exposition of the method, also rather up-to-date and complete; you can read the draft of the book online without buying it. I personally used this book to prepare a seminar and can recommend it. The connections between belief propagation and the ferromagnetic Ising chain are explained in Part D, section 14.1 of the online version.
Overview
Belief propagation is an iterative "message-passing" algorithm. For each site/spin/particle (a vertex of the graph) of your system, the method assigns incoming and outgoing "messages" to its neighbouring edges (which represent interactions) in the interaction graph. These messages can be understood as probability weights contributing to the marginal probability that the particle on site $i$ is in a given local configuration $\sigma_i$, which I denote by $p(\sigma_i)$. The messages are updated through local (and hence inexpensive) computations for each particle, following a set of equations known as the belief-propagation update rules. I will not write the update rules here, as they are lengthy.
What is important is the following: if, on site $i$ with neighbours indexed by $a$, we denote the outgoing/incoming messages at time $t$ by $p_{i\rightarrow a}^{t}(\sigma_i)$ and $p_{a\rightarrow i}^{t}(\sigma_i)$, then the BP method estimates the marginal probability $p(\sigma_i)$ as follows (up to normalisation): $$p(\sigma_i)\simeq p^{t}(\sigma_i)\propto\prod_{a\in\partial i}p_{a\rightarrow i}^{t-1}(\sigma_i)$$
For trees, this iteration converges to a fixed point in finite time $t_{*}$, bounded by the diameter of the tree, and the method provably computes the marginal probabilities exactly. In fact, for trees this method turns out to be equivalent to the transfer-matrix methods used, e.g., for the classical Ising spin chain.
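To make the exactness on trees concrete, here is a minimal sum-product sketch on a small tree-shaped Ising model. The graph, couplings, fields, and temperature below are made-up example values, and I use pairwise variable-to-variable messages rather than the factor-graph notation above; on a tree, the BP marginals agree with brute-force enumeration.

```python
import itertools
import math

# Minimal sum-product BP on a small tree-shaped Ising model.
# The graph, couplings J, and fields h are made-up example values.
beta = 0.7
edges = [(0, 1), (1, 2), (1, 3)]              # a tree on 4 spins
J = {e: 1.0 for e in edges}                   # ferromagnetic couplings
h = [0.3, -0.2, 0.1, 0.5]                     # local fields (break symmetry)
n = 4
nbrs = {i: [] for i in range(n)}
for (i, j) in edges:
    nbrs[i].append(j)
    nbrs[j].append(i)

def pair(i, j, si, sj):                       # edge factor exp(beta*J_ij*si*sj)
    Jij = J.get((i, j), J.get((j, i)))
    return math.exp(beta * Jij * si * sj)

def field(i, s):                              # site factor exp(beta*h_i*s)
    return math.exp(beta * h[i] * s)

# m[(i, j)][s_j] is the message from spin i to its neighbour j
m = {(i, j): {+1: 1.0, -1: 1.0} for i in range(n) for j in nbrs[i]}
for _ in range(n):                            # enough parallel sweeps on a tree
    new = {}
    for (i, j) in m:
        new[(i, j)] = {}
        for sj in (+1, -1):
            new[(i, j)][sj] = sum(
                field(i, si) * pair(i, j, si, sj)
                * math.prod(m[(k, i)][si] for k in nbrs[i] if k != j)
                for si in (+1, -1))
        z = sum(new[(i, j)].values())          # normalise for stability
        for s in new[(i, j)]:
            new[(i, j)][s] /= z
    m = new

def bp_marginal(i):                            # p(sigma_i) from incoming messages
    w = {s: field(i, s) * math.prod(m[(k, i)][s] for k in nbrs[i])
         for s in (+1, -1)}
    z = sum(w.values())
    return {s: w[s] / z for s in w}

def exact_marginal(i):                         # brute force over all 2^n states
    w = {+1: 0.0, -1: 0.0}
    for conf in itertools.product((+1, -1), repeat=n):
        e = sum(J[(a, b)] * conf[a] * conf[b] for (a, b) in edges)
        e += sum(h[k] * conf[k] for k in range(n))
        w[conf[i]] += math.exp(beta * e)
    z = sum(w.values())
    return {s: w[s] / z for s in w}

print(bp_marginal(1)[+1], exact_marginal(1)[+1])  # agree on a tree
```

On a loopy graph the same update loop can be run until (hopefully) convergence, but the exactness guarantee is lost.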
If your system is not a tree, you do not get exact results and the message-update iteration is not guaranteed to converge. However, the method has been applied with wide empirical success to graphs that locally look like trees (which effectively limits the correlation distance the method can capture), where it provides good estimates of these marginal probabilities.
Finally, the method can be adapted to estimate all the other quantities I mentioned.
Drawbacks
The main disadvantage of this method is that it is rather complicated to find analytical error bounds or analytical conditions that guarantee convergence; both are areas of current research. For a recent study of convergence you can check [1]; regarding errors, I found this one [2].
Another disadvantage, of course, is that if you add many random interactions to your graph, at some point the interaction graph stops being tree-like and the method can behave poorly. In fact, there is a threshold at which a giant connected component appears in the interaction graph [3].
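To see that threshold concretely, here is a small self-contained simulation (graph sizes and the seed are arbitrary choices for illustration): in an Erdős–Rényi random graph $G(n, p)$ with $p = c/n$, a giant component emerges once the mean degree $c$ exceeds $1$.

```python
import random

def largest_component_fraction(n, c, seed=0):
    """Fraction of vertices in the largest component of G(n, p=c/n)."""
    rng = random.Random(seed)
    parent = list(range(n))                    # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]      # path halving
            x = parent[x]
        return x

    p = c / n
    for i in range(n):                         # sample each possible edge
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)      # union the two components

    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n

print(largest_component_fraction(2000, 0.5))   # subcritical: tiny fraction
print(largest_component_fraction(2000, 2.0))   # supercritical: macroscopic
```

Below the threshold the largest component has $O(\log n)$ vertices; above it, a constant fraction of all vertices, which is exactly when the graph stops looking locally tree-like at long range.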
The relationship between the Ising model (spins on a lattice) and conformal field theory holds only in the immediate vicinity of the critical point, when correlation lengths go to infinity and all details on the scale of the lattice constant become irrelevant.
The relationship is explained, for example, in these lecture notes. Let me walk you through them.
The correspondence starts from the Ising model of classical spins on a two-dimensional lattice, which is equivalent to a one-dimensional model of quantum mechanical spin-$1/2$ degrees of freedom (Ising chain). In the lecture notes this mapping is derived from quantum to classical in chapter 2, with the inverse mapping shown in Appendix B. The space and time dimensions of the quantum spin chain become the two spatial dimensions of the classical Ising model.
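The flavour of this mapping can be sketched schematically (conventions as in the lecture notes): the row-to-row transfer matrix $T$ of the classical model acts as a small imaginary-time step of a transverse-field Ising Hamiltonian,
$$ T \simeq e^{-\Delta\tau\, H}, \qquad H = -J\sum_i \sigma_i^z\sigma_{i+1}^z - h\sum_i \sigma_i^x, $$
so the classical partition function on $N$ rows becomes a quantum one,
$$ Z = \mathrm{Tr}\, T^{N} \simeq \mathrm{Tr}\, e^{-N\Delta\tau\, H}, $$
with the row direction of the lattice playing the role of imaginary time for the chain.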
Chapter 3 of the lecture notes then explains how the quantum spin chain can be transformed into a quantum field theory of free spinless fermions in the continuum limit, which is the appropriate limit near the critical point. The path-integral formulation of the field theory then produces the correlation functions by conformal invariance and operator product expansion (chapter 4).
Best Answer
To expand on Steve Huntsman's comment, the entropy follows from Onsager's result for the free energy per site, $$ F = -\beta^{-1}\left[\ln 2 + \frac{1}{2}\frac{1}{(2\pi)^2}\int_0^{2\pi}d\theta_1\int_0^{2\pi}d\theta_2\, \ln(\cosh 2\beta E_1\cosh 2\beta E_2 - \sinh 2\beta E_1\cos\theta_1 - \sinh 2\beta E_2\cos\theta_2)\right], $$ and the thermodynamic relation $$ S = -\frac{\partial F}{\partial T} $$ for the entropy per site. Here $\beta = 1/(k_B T)$, and $E_1$ and $E_2$ are the horizontal and vertical interaction strengths. If you set both interaction strengths equal to $1$ and use units where Boltzmann's constant equals $1$, the critical temperature is $2/\ln(\sqrt2 + 1)\approx 2.269$. If you plot $S$, you should find that it interpolates between $0$ at low temperature and $\ln 2$ at high temperature, as expected. At the critical temperature, the graph has infinite slope.
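As a numerical sanity check, one can evaluate Onsager's formula directly (a sketch with $E_1 = E_2 = 1$ and $k_B = 1$; the grid size and difference step are arbitrary choices, and the entropy is obtained by a central difference rather than an analytic derivative):

```python
import math

def free_energy(T, m=200):
    """Onsager free energy per site for E1 = E2 = 1, k_B = 1.

    The double integral over [0, 2*pi]^2 is evaluated with the midpoint
    rule, which converges rapidly for this smooth periodic integrand.
    """
    beta = 1.0 / T
    c = math.cosh(2 * beta) ** 2
    s = math.sinh(2 * beta)
    h = 2 * math.pi / m
    tot = 0.0
    for a in range(m):
        t1 = (a + 0.5) * h
        for b in range(m):
            t2 = (b + 0.5) * h
            tot += math.log(c - s * (math.cos(t1) + math.cos(t2)))
    integral = tot * h * h / (2 * math.pi) ** 2
    return -T * (math.log(2) + 0.5 * integral)

def entropy(T, dT=1e-3):
    """Entropy per site, S = -dF/dT, via a central difference."""
    return -(free_energy(T + dT) - free_energy(T - dT)) / (2 * dT)

print(entropy(0.5))    # close to 0 at low temperature
print(entropy(50.0))   # close to ln 2 at high temperature
```

Away from the critical temperature this converges quickly; very close to $T_c \approx 2.269$ the integrand develops a logarithmic singularity at $\theta_1 = \theta_2 = 0$ and a finer grid is needed.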