Well, a lot of questions, some of which Theo already answered in a very nice way. Let me just add some remarks and hints on how I think about DQ and Poisson geometry in relation to quantum physics.
Concerning the first question:
the good replacement (in view of Gel'fand duality) of a point in phase space is a (pure) state on the quantum algebra. While for $C^\ast$-algebras this is standard lore, in formal DQ things are slightly more tricky: one can of course argue that a formal star product does not yet yield a quantum observable algebra, since $\hbar$ has no "value" (say $1$ in your favorite unit system), so the question of states should be postponed until one reaches a "convergent/strict" DQ. This is often possible but in general completely open. Surprisingly, there is a good notion of states already for formal star products: essentially the same definition applies, namely positive functionals on the algebra $C^\infty(M)[[\hbar]]$ which are $\mathbb{C}[[\hbar]]$-linear and take values in $\mathbb{C}[[\hbar]]$. To define positivity one uses the fact that $\mathbb{R}[[\hbar]]$ is an ordered ring. Many techniques of $C^*$-algebra theory then carry over to this entirely algebraic framework; in fact, we have worked out many things like the GNS construction of representations, etc.
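To make the ordered-ring point concrete, here is a minimal sketch (my own notation, not taken from any particular package) of how positivity in $\mathbb{R}[[\hbar]]$ is decided: a nonzero formal series is positive exactly when its lowest non-vanishing coefficient is positive.

```python
# Positivity in the ordered ring R[[hbar]]: a nonzero series
# a_0 + a_1*hbar + a_2*hbar^2 + ... is positive iff its lowest
# non-vanishing coefficient is positive.  A truncated series is
# modelled here simply as a list of real coefficients [a_0, a_1, ...].

def is_positive(coeffs):
    """Return True iff the truncated series is > 0 in R[[hbar]]."""
    for a in coeffs:
        if a > 0:
            return True
        if a < 0:
            return False
    return False  # the zero series is not strictly positive

# hbar^2 > 0, but -hbar + 100*hbar^2 < 0: no higher-order coefficient,
# however large, can outweigh a negative lower-order one.
print(is_positive([0.0, 0.0, 1.0]))     # hbar^2
print(is_positive([0.0, -1.0, 100.0]))  # -hbar + 100*hbar^2
```

Note that $\hbar$ is "infinitesimally small" in this order: it is positive but smaller than any positive real number, which is exactly what makes the asymptotic point of view consistent.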
Now the point is that a classically positive functional $\omega_0\colon C^\infty(M) \longrightarrow \mathbb{C}$ (which, by a smooth version of the Riesz representation theorem, is integration against a compactly supported positive Borel measure) may no longer be positive with respect to a given star product $\star$. One therefore needs to add higher-order corrections $\omega = \omega_0 + \hbar\omega_1 + \cdots$ to restore positivity. It is a (quite non-trivial) theorem that this is always possible, even in a "differential" sense: all the higher orders are of the form $\omega_0 \circ D_r$ for some differential operators $D_r$.
You can now apply this to your favorite classical state, the delta functional at a given point. The corresponding (non-unique) deformation is then the quantum analogue of a point, in some sense the best replacement you can get.
The uncertainty principle can be understood as the reason why positivity fails for $\omega_0$ itself and why the higher-order corrections are necessary...
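As a concrete sketch of this failure (a standard toy computation, set up here with my own hypothetical helper functions): for the Weyl-Moyal star product on $\mathbb{R}^2$, $f \star g = fg + \frac{i\hbar}{2}\,(\partial_q f\,\partial_p g - \partial_p f\,\partial_q g) + O(\hbar^2)$, the delta functional at the origin applied to $\overline{f} \star f$ with $f = q + ip$ gives $-\hbar + O(\hbar^2)$, which is negative in the ordered ring $\mathbb{R}[[\hbar]]$:

```python
# Evaluate delta_0(conj(f) * f) to first order in hbar for the
# Weyl-Moyal star product on R^2:
#   f * g = f g + (i hbar / 2) {f, g} + O(hbar^2),
#   {f, g} = df/dq dg/dp - df/dp dg/dq.
# For f = q + i p the zeroth order vanishes while the first-order
# coefficient is -1, so delta_0(conj(f) * f) = -hbar + O(hbar^2).

def f(q, p):
    return q + 1j * p

def conj_f(q, p):
    return q - 1j * p

def partial(g, which, q, p, h=1e-6):
    """Central finite-difference partial derivative (exact for linear f)."""
    if which == 'q':
        return (g(q + h, p) - g(q - h, p)) / (2 * h)
    return (g(q, p + h) - g(q, p - h)) / (2 * h)

# delta functional at the origin, order by order in hbar
order0 = (conj_f(0, 0) * f(0, 0)).real
poisson = (partial(conj_f, 'q', 0, 0) * partial(f, 'p', 0, 0)
           - partial(conj_f, 'p', 0, 0) * partial(f, 'q', 0, 0))
order1 = (0.5j * poisson).real

print(order0, order1)  # coefficients of hbar^0 and hbar^1
```

The zeroth order is $|f(0)|^2 = 0$, so the sign of the state is decided at order $\hbar$, where the Poisson bracket $\{\overline{f}, f\} = 2i$ contributes the negative term; this is precisely the uncertainty-principle obstruction mentioned above.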
The second question: of course, for hard physical applications you only need $\mathbb{R}^{2n}$, maybe a cotangent bundle but that's it. Even a generic symplectic manifold is hard to motivate from this point of view.
But there are also reasons from physics why one should take care of DQ of more general Poisson manifolds:
a) Symmetries: whenever you have a classical symmetry encoded by a momentum map, $\mathfrak{g}^\ast$ is a Poisson manifold. Quantizing the symmetry then amounts to quantizing the momentum map in an appropriate way. There are several competing definitions, but essentially all of them involve a DQ of the linear Poisson structure on $\mathfrak{g}^*$.
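To make the linear Poisson structure concrete, here is a small numerical sketch (my own illustration, not from the literature) of the Lie-Poisson bracket on $\mathfrak{so}(3)^* \cong \mathbb{R}^3$, given by $\{x_i, x_j\} = \sum_k \epsilon_{ijk}\, x_k$, checking the bracket relations and the Jacobi identity at a sample point:

```python
# Lie-Poisson bracket on so(3)* = R^3:
#   {F, G}(x) = sum_{i,j,k} eps_ijk x_k (dF/dx_i) (dG/dx_j).
# For the coordinate functions this gives {x1, x2} = x3 and cyclic.

def eps(i, j, k):
    """Levi-Civita symbol for indices in {0, 1, 2}."""
    return ((i - j) * (j - k) * (k - i)) // 2

def grad(F, x, h=1e-5):
    """Central-difference gradient of F at the point x."""
    g = []
    for i in range(3):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((F(xp) - F(xm)) / (2 * h))
    return g

def bracket(F, G):
    """The Poisson bracket {F, G} as a new function on R^3."""
    def FG(x):
        dF, dG = grad(F, x), grad(G, x)
        return sum(eps(i, j, k) * x[k] * dF[i] * dG[j]
                   for i in range(3) for j in range(3) for k in range(3))
    return FG

x1 = lambda x: x[0]
x2 = lambda x: x[1]
x3 = lambda x: x[2]
pt = [0.3, -1.2, 0.7]

print(bracket(x1, x2)(pt))  # should equal x3 at pt, i.e. 0.7
jacobi = (bracket(x1, bracket(x2, x3))(pt)
          + bracket(x2, bracket(x3, x1))(pt)
          + bracket(x3, bracket(x1, x2))(pt))
print(abs(jacobi) < 1e-6)   # Jacobi identity holds
```

The bracket is linear in $x$ and degenerates at the origin, so this is a genuinely non-symplectic Poisson manifold: the simplest example of why a) forces you beyond the symplectic world.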
b) Aesthetics: having a general framework in which to discuss your relevant examples can be useful and broaden your view, even though the examples may be very special inside this bigger framework.
c) Applications in NCG: many models of noncommutative space-times require the quantization of Poisson structures more general than symplectic ones. It is not even clear that space-time admits a symplectic structure at all, but it certainly carries interesting Poisson structures. In serious models of NC space-times, the Poisson structure itself should be treated as a dynamical quantity, i.e. a field; then there is no reason why it should be non-degenerate everywhere. These models are, of course, all still very speculative...
d) Toy models: one can view complicated Poisson/symplectic manifolds as toy models for the infinite-dimensional phase spaces of classical field theories with gauge symmetries. Here the true phase spaces are essentially Marsden-Weinstein quotients (in ugly infinite dimensions), which can have quite generic geometry. So one tries to learn something about their quantization by looking at finite-dimensional models that at least share a comparably complicated geometry.
Third question: Where is the Hilbert space...
After what I said about the first question, this is now pretty clear and follows the same line of argument as in AQFT: having the algebra of observables, one looks at all its $^*$-representations on, say, pre-Hilbert spaces, by means of a GNS construction. The notion of a pre-Hilbert space works very much the same over ordered rings like $\mathbb{R}[[\hbar]]$. This has been worked out in detail in many places and indeed gives physically interesting results. The main advantage is that one can now look at different representations, which can encode different physical situations...
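To illustrate the mechanics of GNS (in the simplest $C^*$-setting rather than over $\mathbb{R}[[\hbar]]$; a toy sketch of my own, not the construction from the literature mentioned above): take $M_2(\mathbb{C})$ with the vector state $\omega(A) = A_{00}$, form the sesquilinear form $\langle A, B\rangle = \omega(A^*B)$, and quotient by its null space. The Gram matrix of the four matrix units has rank $2$, so the GNS space is $\mathbb{C}^2$, on which the algebra acts by left multiplication:

```python
# Toy GNS construction: algebra M_2(C), state omega(A) = A[0][0],
# i.e. the vector state <e_0, A e_0>.  The GNS inner product is
# <A, B> = omega(A* B); its null space consists of the matrices with
# vanishing first column, so the quotient space is C^2.

def adjoint(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def omega(A):          # the chosen (pure) state
    return A[0][0]

def gns_inner(A, B):   # <A, B> = omega(A* B)
    return omega(matmul(adjoint(A), B))

def unit(i, j):
    """Matrix unit E_ij: 1 in entry (i, j), 0 elsewhere."""
    return [[1 if (r, c) == (i, j) else 0 for c in range(2)]
            for r in range(2)]

units = [unit(i, j) for i in range(2) for j in range(2)]
gram = [[gns_inner(A, B) for B in units] for A in units]

# The Gram matrix is diag(1, 0, 1, 0): only E_00 and E_10 survive the
# quotient, so the GNS pre-Hilbert space is 2-dimensional.
print(gram)
```

The formal-series version follows the same pattern, with $\mathbb{C}$ replaced by $\mathbb{C}[[\hbar]]$ and positivity taken in the ordered-ring sense.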
OK, sorry for such a long blurb. I hope it gives some inspiration.
Best Answer
Physicist chiming in: "quantum corrections of a geometry" represents the vague idea that gravity should be quantum.
Classically, gravity is a geometric theory: "reality" is modelled as a (Lorentzian) manifold, whose metric can be found -- in principle -- by solving a system of PDEs. To be more specific, the metric $g$ is such that the classical action $S[g]$ is stationary, $$ S'[g]\equiv0 $$ where $S$ is, say, the Einstein-Hilbert action $S=\int_X R$. (This is the canonical choice, which agrees with experiments to great accuracy, but there are other choices that may work better in extreme regimes; a popular alternative is $S=\int_X f(R)$ for some appropriate function $f$, cf. f(R) gravity.)
Quantumly, this no longer holds. Gravity should still be a geometric theory, but "quantum corrected", meaning that the paradigm changes but should reduce to the standard (classical) framework in the classical limit (schematically written as $\hbar\to0$). This is the problem of quantum gravity. The intuitive idea is that the metric is no longer fixed by a system of differential equations but "fluctuates": there should be a "measure" $\mu[g]$ on the "space of metrics" such that, when we perform a given experiment, a single metric $g$ is measured, drawn at random from this space with probability proportional to $\mu[g]$. The "expectation value" of the geometry of spacetime is $$ \langle g\rangle\sim\int g\,\mu[g] $$ where the integral is over "the space of all metrics", whatever that means. A popular choice for the measure is $\mu[g]=e^{-S[g]/\hbar}$, which has the nice property that, in the $\hbar\to0$ limit, it is peaked around the stationary points $S'[g]=0$, so this measure singles out the classical configuration as a special point in this space.
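A one-degree-of-freedom caricature (entirely my own toy model, with a hypothetical action $S(g) = (g-1)^2$ standing in for the infinite-dimensional functional) shows the peaking mechanism: as $\hbar \to 0$, the normalized measure $e^{-S(g)/\hbar}$ keeps its mean at the stationary point $g = 1$ while the variance, here $\hbar/2$, shrinks away:

```python
# Toy version of the "peaking" claim: one scalar degree of freedom g,
# action S(g) = (g - 1)^2 (stationary at the classical value g = 1),
# weight mu(g) = exp(-S(g)/hbar).  Moments are computed by a simple
# Riemann sum over a window wide enough that the tails are negligible.
import math

def moments(hbar, n=60001, lo=-5.0, hi=7.0):
    """Mean and variance of g under the weight exp(-S(g)/hbar)."""
    dg = (hi - lo) / (n - 1)
    Z = m1 = m2 = 0.0
    for k in range(n):
        g = lo + k * dg
        w = math.exp(-(g - 1.0) ** 2 / hbar) * dg
        Z += w
        m1 += g * w
        m2 += g * g * w
    mean = m1 / Z
    return mean, m2 / Z - mean ** 2

# The mean stays at the classical value g = 1 while the variance
# (hbar/2 for this quadratic action) shrinks with hbar.
for hbar in (1.0, 0.1, 0.01):
    mean, var = moments(hbar)
    print(hbar, round(mean, 4), round(var, 4))
```

Of course nothing here addresses the real difficulty, which is making sense of such an integral over a genuine space of metrics rather than over $\mathbb{R}$.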
Obviously, this is not a well-defined procedure, but it represents what we physicists would like to have. We leave it to the mathematicians to develop a theory of "integration over function spaces" strong enough to accommodate our intuition.
But anyway, the take-home message is that, classically, nature is supposed to be described by a manifold with a fixed (Lorentzian) metric. But if you go to very high energies (or very short distances), the classical picture breaks down and you need to take "quantum corrections" into account: the metric is no longer fixed but becomes a random variable, peaked around the classical value but "fluctuating" according to some measure. Making this precise is an open problem in physics.