The question: "I'm wondering if there is some good reason why the universe as we know it has to have twelve particles rather than just four."
The short answer: Our current standard description of the spin-1/2 property of the elementary particles is incomplete. A more complete theory would require that these particles come in three generations.
The medium answer: The spin-1/2 of the elementary fermions is an emergent property. The more fundamental spin property acts like position in that the Heisenberg uncertainty principle applies to consecutive measurements of the fundamental spin the same way the HUP applies to position measurements. This fundamental spin is invisible to us because it is renormalized away. What's left is three generations of the particle, each with the usual spin-1/2.
When a particle moves through positions it does so by way of an interaction between position and momentum. These are complementary variables. The equivalent concept for spin-1/2 is "mutually unbiased bases" or MUBs. There are at most three MUBs for spin-1/2. Letting a particle's spin move among them means that the number of degrees of freedom of the particle has tripled. So when you compute the long-time propagators over that Hopf algebra, you end up with three times the usual number of particles. Hence there are three generations.
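The "at most three MUBs" claim is easy to check numerically. A minimal sketch: the eigenbases of the three Pauli matrices form a complete set of mutually unbiased bases for spin-1/2, meaning every eigenstate of one basis has overlap probability exactly 1/2 with every eigenstate of the other two.

```python
import numpy as np

# Eigenbases of the three Pauli matrices (sigma_x, sigma_y, sigma_z).
# For a spin-1/2 system these form a complete set of mutually unbiased
# bases: any eigenstate of one basis has |<a|b>|^2 = 1/2 overlap with
# every eigenstate of either other basis.
s = 1 / np.sqrt(2)
bases = {
    "x": [np.array([s, s]), np.array([s, -s])],
    "y": [np.array([s, 1j * s]), np.array([s, -1j * s])],
    "z": [np.array([1, 0]), np.array([0, 1])],
}

for name_a, basis_a in bases.items():
    for name_b, basis_b in bases.items():
        if name_a >= name_b:
            continue  # only compare distinct pairs of bases once
        for a in basis_a:
            for b in basis_b:
                overlap = abs(np.vdot(a, b)) ** 2
                assert abs(overlap - 0.5) < 1e-12
print("all cross-basis overlaps equal 1/2")
```

So a spin prepared in any one of these bases gives completely unpredictable results when measured in either of the other two, which is the spin analogue of the position/momentum complementarity described above.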
The long answer: The two (more or less classical) things we can theoretically measure for a spin-1/2 particle are its position and its spin. If we measure its spin, the spin is then forced into an eigenstate of spin so that measuring it again gives the same result. That is, a measurement of spin causes the spin to be determined. On the other hand, if we measure its position, then by the Heisenberg uncertainty principle, we will cause an unknown change to its momentum. The change in momentum makes it impossible for us to predict the result of a subsequent position measurement.
As quantum physicists, we long ago grew accustomed to this bizarre behavior. But imagine that nature is parsimonious with her underlying machinery. If so, we'd expect the fundamental (i.e. before renormalization) measurements of a spin-1/2 particle's position and spin to be similar. For such a theory to work, one must show that after renormalization, one obtains the usual spin-1/2.
A possible solution to this conundrum is given in the paper:
Carl Brannen, "Spin Path Integrals and Generations", Found. Phys. 40:1681-1699 (2010)
http://arxiv.org/abs/1006.3114
The paper is a straightforward QFT resummation calculation. It assumes a strange (to us) spin-1/2 where measurements act like the not so strange position measurements. It resums the propagators for the theory and finds that the strange behavior disappears over long times. The long time propagators are equivalent to the usual spin-1/2. Furthermore, they appear in three generations. And it shows that the long time propagators have a form that matches the mysterious lepton mass formulas of Yoshio Koide.
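The Koide relation mentioned here can be verified with a few lines of arithmetic. A quick check, using approximate PDG values for the charged-lepton masses:

```python
import math

# Charged-lepton masses in MeV (approximate PDG values).
m_e, m_mu, m_tau = 0.510999, 105.6584, 1776.86

# Koide's relation: the ratio
#   K = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2
# is empirically very close to 2/3.
K = (m_e + m_mu + m_tau) / (math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau)) ** 2
print(f"K = {K:.6f}  (2/3 = {2/3:.6f})")
```

The agreement with 2/3 holds to roughly one part in 10^5, which is what makes the formula "mysterious": no accepted mechanism in the Standard Model relates the three generation masses this way.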
Peer review: The paper was peer-reviewed through an arduous process involving three reviewers. As with any journal article, it had a managing editor and a chief editor. Complaints about the physics have already been made by competent physicists who took the trouble of carefully reading the paper. It's unlikely that someone making a quick read of the paper will find something that hasn't already been argued through. The paper was selected by the chief editor of Found. Phys. as suitable for publication in that journal and so was published last year.
The chief editor of Found. Phys. is now Gerard 't Hooft. His attitude toward publishing junk is quite clear; he writes:
How to become a bad theoretical physicist

"On your way towards becoming a bad theoretician, take your own immature theory, stop checking it for mistakes, don't listen to colleagues who do spot weaknesses, and start admiring your own infallible intelligence. Try to overshout all your critics, and have your work published anyway. If the well-established science media refuse to publish your work, start your own publishing company and edit your own books. If you are really clever you can find yourself a formerly professional physics journal where the chief editor is asleep."
http://www.phys.uu.nl/~thooft/theoristbad.html
One hopes that 't Hooft wasn't asleep when he allowed this paper to be published.
Extensions: My next paper on the subject extends the above calculation to obtain the weak hypercharge and weak isospin quantum numbers. It uses methods similar to the above, that is, the calculation of long-time propagators, but with a more sophisticated method of manipulating the Feynman diagrams called "Hopf algebra" or "quantum algebra". I'm figuring on sending it to the same journal. It's close to finished; I basically need to reread it over and over and add references:
http://brannenworks.com/E8/HopfWeakQNs.pdf
The Higgs field is a scalar field and it happens that the vacuum expectation value of that field is non-zero in our universe. It is this non-zero Higgs vacuum expectation value that gives the elementary fermions of the standard model of particle physics their rest mass. Now this Higgs field is a scalar so it is as if there is a single numerical value that specifies this field strength everywhere in space.
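To make the mass-generation claim concrete: in the Standard Model each fermion's rest mass is its Yukawa coupling times the Higgs vacuum expectation value, m_f = y_f v / sqrt(2) with v ≈ 246 GeV. A small sketch, inverting this to see the couplings implied by the observed masses (the mass values are approximate):

```python
import math

v = 246.22  # Higgs vacuum expectation value in GeV (approximate)

# In the Standard Model, m_f = y_f * v / sqrt(2), so the Yukawa
# coupling needed to reproduce each observed fermion mass is
# y_f = sqrt(2) * m_f / v.
masses_gev = {"electron": 0.000511, "muon": 0.10566, "tau": 1.77686, "top": 172.7}
for name, m in masses_gev.items():
    y = math.sqrt(2) * m / v
    print(f"{name:8s}  m = {m:10.6f} GeV   y = {y:.3e}")
```

Note the enormous spread: the top quark's coupling comes out near 1, the electron's near 3e-6. The Standard Model takes these couplings as free inputs, which is part of why the generation structure discussed above is considered unexplained.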
If you think of this scalar value as being like the depth of the water in a swimming pool, then the Higgs boson is like waves on that water. So a Higgs boson causes small up and down variations in the Higgs value as it travels through space. As a particle, it takes 125 GeV of energy to create a Higgs boson, so one can only be created where that much energy is available. Cosmic rays hitting the earth have energies that high and much higher, so it is certainly possible for cosmic rays to be creating Higgs bosons independently of the LHC or any other human particle accelerator. However, these Higgs bosons are certainly not bombarding every particle all the time.
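A back-of-envelope check on the cosmic-ray claim: for a cosmic-ray proton striking a proton at rest (say, in the atmosphere), producing a Higgs on top of the two protons requires sqrt(s) ≥ 2m_p + m_H. The numbers below are a rough sketch, not a cross-section calculation:

```python
import math

m_p = 0.938272  # proton mass, GeV
m_H = 125.0     # Higgs boson mass, GeV (approximate)

# For a cosmic-ray proton hitting a proton at rest, the squared
# invariant mass is s = 2*m_p*E_lab + 2*m_p^2.  Producing p + p + H
# requires sqrt(s) >= 2*m_p + m_H, giving the threshold lab energy:
E_threshold = ((2 * m_p + m_H) ** 2 - 2 * m_p ** 2) / (2 * m_p)
print(f"threshold cosmic-ray energy ~ {E_threshold / 1000:.1f} TeV")
```

The threshold comes out around 8-9 TeV, while the cosmic-ray spectrum extends many orders of magnitude beyond that, so the energy is certainly available; the production rate is a separate (and much harder) question.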
However, I think what Dr. Cox is talking about is the Higgs scalar vacuum expectation value that fills all of space. That is what gives elementary fermions their rest mass, but it is not at all the same as the Higgs bosons that were created and detected at the LHC. So I think Dr. Cox was taking some liberties to try to explain this complicated physics to a non-physicist audience.
I will try to address your question, though, as David says in the comments, it is evident that you have very little background in elementary particle physics. I will bring up an event display much simpler than one that could show a Higgs particle decay.
Here is a simple antiproton annihilation event whose end particles are recorded by their passage through a bubble chamber which also has a magnetic field perpendicular to the picture. The antiproton enters from below and hits a proton which is at rest, so not visible, in the bubble chamber liquid. It annihilates and eight pions come out, their momentum measured by the curvature, their mass by the ionisation track.
Where is the Higgs field in this picture? It permeates everything and at the point of interaction when the pions materialize it has supplied the masses to the quarks and antiquarks that they are made up of.
The simulated Higgs event display you have attached shows the decay products of the Higgs boson. This particle is predicted by the Standard Model, and it is necessary to find and confirm it in order to validate the SM. It appears because a Higgs field exists, but it is one of the set of particles predicted, and mostly found, by the SM. In the real experiment, a number of events with two photons, for example, have been accumulated so that the claim of seeing a Higgs-like particle has been established statistically.
A lot of work remains to make sure that the bump seen really has the decay branching ratios, spin, and statistics expected from the SM before the discovery of the Higgs boson is established unequivocally. Then we could state with some certainty that the Standard Model, which depends on the existence of a Higgs field, is validated.
So it should be clear that each individual event is not like a spider that can be dissected. It is an instant of the materialization of the fields and the experiment has to accumulate enough events to statistically establish an observation that validates a hypothesis.
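A toy illustration of the "accumulate enough events" point: with b expected background events under a bump and s signal events on top, a naive significance estimate is s / sqrt(b), and the conventional discovery threshold is 5 sigma. The numbers below are invented purely for illustration:

```python
import math

# Naive significance of an excess of s signal events over an
# expected background of b events (valid for large b; real analyses
# use likelihood methods, but the scaling is the same).
def naive_significance(s, b):
    return s / math.sqrt(b)

for s, b in [(10, 100), (50, 400), (100, 400)]:
    print(f"signal={s:4d}  background={b:4d}  significance ~ {naive_significance(s, b):.1f} sigma")
```

Since the signal grows linearly with data taken while the background fluctuation grows only as its square root, significance improves as the square root of the accumulated sample, which is why no single event, however pretty, can establish a discovery.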