The answer to this question is a surprising no, and this is not because we don't have enough quantum systems. We have plenty. The problem is that if a natural system with a large number of particles is implementing a computation that requires exponential resources, we can't do the computation ourselves to check whether quantum mechanics is accurate. Quantum mechanics might be failing all the time in highly excited, highly entangled nuclear states, but we wouldn't know it, because we can't compute the exact energy levels; we can only rely on experiment.
First, for A: every quantum system with a large number of particles and strong interactions is implementing a nontrivial quantum computation, but we can't check whether it is doing it correctly. For example, if you excite a uranium nucleus to a very highly excited state, so that it can emit gamma rays, neutrons, and protons, and look at the radiated spectrum of stuff, the amplitudes for emission are enormously complicated functions of the state of a roughly 240-particle system with impossible-to-calculate entanglements. These calculations simply can't be done by any classical computer, so we just wouldn't know whether quantum mechanics is failing. But yes, a uranium nucleus in a 700 MeV excited state is performing an impossibly complex quantum computation, one we couldn't reproduce even with a computer the size of the universe.
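To get a sense of the scale involved (a crude back-of-the-envelope estimate of mine, assuming only two relevant states per nucleon, which is a drastic undercount): the state vector of a roughly 240-nucleon system then already carries

$$2^{240} \approx 1.8 \times 10^{72}$$

complex amplitudes, and with a realistic number of single-particle levels per nucleon the count easily exceeds the roughly $10^{80}$ atoms in the observable universe. That is the sense in which no classical computer could ever track the exact state.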
For B: your question is nonsensical, but the speed of light does limit the speed of information transfer inside a computer. This is not much of a limitation in principle, because it just says that a computation step which moves data from point A to point B will take a time proportional to the distance between A and B. This has no bearing on computational complexity, because the data motion takes only polynomial time in the size of your memory, even if the memory is inefficiently laid out in a straight line. This is a red herring. The words "this is the maximum speed a massless particle can compute a resolved path for when traveling through a vacuous quantum field" are meaningless.
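To spell out why this only costs a polynomial factor (a sketch, assuming the worst case of $N$ memory cells laid out along a single line with spacing $d$): the farthest any single step ever has to move data is about $N d$, so each step takes a time

$$t_{\text{step}} \lesssim \frac{N d}{c},$$

and a $T$-step computation finishes in a time of order $T N d / c$. The slowdown is linear in the memory size, so a polynomial-time algorithm remains polynomial-time; the light-speed limit never turns an efficient computation into an exponential one.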
For C: the answer here is no--- you can just have classical mechanics, which does not require infinite sums to resolve the answer. The idea that quantum mechanics is required for reproducing classical definite answers is strange, because it is actually mysterious how this happens. In order to produce definite results from the quantum superposition muddle, you need to assume that we are splitting into branches in a many-worlds picture, or else put in definite-making laws by hand which do the same thing effectively. If nature is fundamentally classical, this problem does not arise.
Comments on the Linked Discussion
The argument Gil Kalai makes is interesting, but it is phrased poorly. Cristopher Moore made the point cogently in the first of the comments here: http://rjlipton.wordpress.com/2012/01/30/perpetual-motion-of-the-21st-century/ , and I do not want to repeat too much. When you propose that quantum computation will fail, you are proposing that quantum mechanics is incorrect, and that the failure occurs for highly entangled, large physical systems.
The argument against quantum mechanics from the implausibility of a physical system doing an exponential computation is completely different from other arguments against quantum mechanics. The philosophical principle is that nature can't be that much more computationally rich than we are, because this introduces a mystical element of in-principle uncomputability in large quantum systems in nature. This principle is new as far as the literature is concerned--- but it is not due to Gil Kalai. I first heard it from the CS student Abram Connely a decade ago; it was his personal beef with quantum mechanics. I found it a persuasive and interesting point, and I tried to give it an exposition in my answer here: Consequences of the new theorem in QM? The precise formulation Kalai gives is interesting, but it is formulated in a sub-optimal way.
In order to believe that quantum computation is impossible, you absolutely require a new law of physics which replaces quantum mechanics, or at least a principle that determines how quantum mechanics fails. The statement that the failure is fundamental, because the universe can't be that complicated, requires you to at least try to specify how the universe can be simplified.
It is incorrect to argue that simple implementation noise makes quantum computation infeasible without proposing a law of nature that forbids the entanglements quantum computing requires. The reason is that you can just remove the noise by cooling the system down and making the parts precise. There is no in-principle limit on the size of a quantum computer, even without error correction. Quantum error correction is central to implementation in practice, but in principle you can just imagine a perfect computer, and come closer and closer to it in a colder and colder system, with no limit except how much you are willing to spend.
A failure of quantum mechanics that only affects mutual entanglements of a large number of quantum particles could easily have escaped detection, but when proposing modifications to quantum mechanics, one must check that they do not lead to things that would not have escaped detection: failure of energy conservation, failure of few-particle coherence, irreversible information loss in few-body systems, friction in atomic motion, and all sorts of other things.
In order to check these things, it is insufficient to formulate the computational-failure principle in terms of an abstract computing device. One must show how the principle modifies real atomic-scale wavefunction dynamics. The idea that this is a nonlinearity in the Schrödinger equation is just bad, so if you are proposing such a modification, it should be because the Schrödinger equation is an emergent description of a fundamentally classical system.
These ideas are due to 't Hooft, who is also skeptical of quantum computation, mostly for the same reason Einstein was skeptical of quantum mechanics. 't Hooft has made several attempts at a model for replacing quantum mechanics with a system that will not be capable of exponential computation, and if one is proposing fundamental decoherence (which is what Gil Kalai is doing), one should do so in the context of at least a speculation about the underlying substrate.
It's impossible to know whether the universe is finite or infinite because we'll never be able to see it all. Note that genneth says "and for simplicity the universe is infinite", and this is really the key point. It makes the physics simpler if the universe is infinite, so we tend to assume it is.
But you need to consider what you mean by "infinite". It doesn't make sense to say the universe has an edge, because you then have to ask what happens if you go up to the edge then take one more step. That means the only alternative to the universe being infinite is that it loops back on itself like a sphere, so you can walk forever without reaching an edge, but eventually you'll be back where you started.
We don't think the universe is like a sphere, because for that space would have to have positive curvature, and experiments to date show space is flat (to within experimental error). However, space could be positively curved but with such small curvature that we can't detect it. Alternatively, space could be flat but have a non-trivial global topology, like a torus. The scale of anything like this would have to be larger than the observable universe, otherwise we'd have seen signs of it.
Incidentally, if the universe is infinite now, it has always been infinite, even at the Big Bang. This is why you'll often hear it said that the Big Bang wasn't a point; it was something that happened everywhere.
Later:
I've just realised that you also asked the question about time beginning at the Big Bang. In the answer to that question I explained how you use the metric to calculate a geodesic, with the result that you can't calculate back in time to earlier than the Big Bang. You can also use the metric to calculate the length of a line in space at a fixed value of time (a space-like geodesic). Our universe appears to be well described by the FLRW metric with $\Omega = 1$ that I mentioned in the other question, and if you use this metric to calculate your line you find it goes on forever, i.e. the universe is infinite.
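For concreteness, here is a sketch of that calculation (assuming the spatially flat case, $k = 0$, which is what $\Omega = 1$ corresponds to). The metric is

$$ds^2 = -c^2\,dt^2 + a(t)^2\left(dx^2 + dy^2 + dz^2\right),$$

so the proper length of a line at fixed time $t$, say along the $x$ axis out to comoving coordinate $X$, is

$$D = a(t)\int_0^{X} dx = a(t)\,X,$$

which grows without bound as $X \to \infty$. The constant-time slice never closes up on itself, so the universe at any instant is spatially infinite.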
But then no-one knows for sure if the FLRW metric with $\Omega = 1$ is the right one to describe our universe. It's certainly the simplest.
Best Answer
Your question is good, but dangerously edgy even to try to answer. Alas, at times I seem prone to trying rather than doing nothing at all... :)
Let me suggest an intentionally different way of approaching your question: Only conservation is absolute. Both continuous and discrete behaviors are approximate and mutable expressions of the absolute conservation of certain quantities.
I'll point to the curious mix in quantum theory of continuous wave functions and discrete outcomes as a possible example. The most accurate way to represent a wave function mathematically is as precisely continuous, yet that same continuous perfection can only be accessed experimentally through discrete results that sample many such nominally perfect wave functions. But the fully discrete particle view never fully wins either, since, for example, detecting an absolutely positioned particle is a physical impossibility in our universe. There is instead a sort of "bounce point" between the two views, one whose scale is set by Planck's constant.
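The standard way to quantify that bounce point (quoted here only to make the scale concrete, nothing new on my part) is the position-momentum uncertainty relation,

$$\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},$$

which is why an absolutely positioned particle ($\Delta x = 0$) is unphysical: it would demand an unbounded spread in momentum.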
But what does always apply without exception in analyzing quantum problems, even across light years of separation in cases of entanglement,[1] is the absolute and unyielding conservation of a certain small set of properties that includes mass-energy, charge, momentum, spin, and a few more obscure quantities such as $T_3$. So why not just declare these conservation rules to be the real absolutes, with the variable interplay we observe between continuous and discrete views as more of an emergent perspective on how the conservation rules play out over time?
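As one concrete illustration of conservation doing the work (the standard spin-singlet example, not anything specific to my argument): two spin-1/2 particles produced by the decay of a spin-0 system must share zero total spin,

$$|\psi\rangle = \frac{1}{\sqrt{2}}\left(\,|\uparrow\downarrow\rangle - |\downarrow\uparrow\rangle\,\right),$$

so spin measurements along any common axis on the two particles are perfectly anticorrelated, no matter how far apart the particles have traveled. The correlation is exactly what conservation of angular momentum demands; on this view the entanglement is just that conservation law not yet cashed out.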
So: Since you asked a good but highly speculative question, I hope readers of this answer will have some mercy on me for giving an answer. While I don't think my answer is exactly radical -- few would debate the importance of the absolute conservation laws in physics, I think! -- I fully admit that it is highly speculative in terms of the priorities I am suggesting.
[1] Focusing on conservation first puts entanglement in a rather different light. It suggests that far from being an odd or minor side effect of QM, entanglement at the classical level reflects the unresolved remnants of deeper conservation laws that mostly work themselves out into something we call "locality of effect" when they are expanded out in a self-consistent fashion over that curious dimension we call time. By "time" in this context I mean the classical, entropic, macroscopic time we know on a daily basis. The quantum version of time, the wonderfully symmetric one, occurs when one or more of those absolute conservation laws insists on keeping its options open. That openness, expressed as the uncertainty principle, makes the irreversible time we know a lot less relevant at the quantum level.