The precise, mathematical statement of the uncertainty principle is $\sigma^2_x \sigma^2_k \geq 1/4$. The use of deltas is just an informal way of talking about it. Nevertheless, it's pretty common to say, for instance, that the width of a peak is either the standard deviation or some quantity proportional to it--see, for example, full width at half maximum, which ends up being about $2.35\sigma$.
I'm not really sure what a slit would look like in one dimension. It's easier for me to consider a particle in a 1d infinite square well. Note that the infinite well absolutely forbids any leakage of the particle into the forbidden region, just like the classical case. In this case, the variances depend on the energy of the particle. For a particle in the $n$th energy eigenstate of an infinite well of width $L$, the variances are (per Wikipedia)
$$\sigma^2_x = \frac{L^2}{12} \Bigg ( 1 - \frac{6}{n^2 \pi^2} \Bigg), \quad \sigma_k^2 = \frac{n^2 \pi^2}{L^2}$$
The product of the variances is then $\sigma_x^2 \sigma_k^2 = n^2 \pi^2/12 - 1/2$. For $n=1$, this is about $0.322 \geq 0.25$, as required.
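As a quick sanity check (my own sketch; working with $k$ rather than $p$ so no $\hbar$ appears, and the grid resolution is an arbitrary choice), the closed-form product above can be verified by direct numerical integration of the eigenfunctions:

```python
import numpy as np

# Infinite square well of width L: psi_n(x) = sqrt(2/L) sin(n pi x / L) on [0, L].
# Compare the variance product from numerical integration with n^2 pi^2 / 12 - 1/2.
L = 1.0
N = 20000
x = np.linspace(0.0, L, N, endpoint=False) + L / (2 * N)  # midpoint grid
dx = L / N

for n in (1, 2, 3):
    psi = np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)
    prob = psi**2 * dx
    mean_x = np.sum(x * prob)
    var_x = np.sum((x - mean_x) ** 2 * prob)
    # <k> = 0 for these real standing waves; <k^2> = integral of |dpsi/dx|^2 dx
    var_k = np.sum(np.gradient(psi, dx) ** 2) * dx
    print(n, var_x * var_k, n**2 * np.pi**2 / 12 - 0.5)
```

For $n=1$ the two numbers agree at about $0.3225$, comfortably above the bound of $0.25$; the agreement only improves for higher $n$.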
You can't really read off the uncertainties by inspection from the geometry of the problem. These are the uncertainties for energy eigenstates, and there's no reason to expect that a particle will be in an eigenstate (which would make the computation more complicated still).
Really, one just calculates the variances of the wavefunction with respect to $x$ and $k$. You might be able to get a rough idea from the quantities in the problem (for instance, the standard deviation with respect to $x$ is indeed proportional to $L$, but only proportional, not exactly $L$), but that's all.
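To make "calculating the variances with respect to $x$ and $k$" concrete, here is a sketch (my own, with a hypothetical Gaussian wavepacket and an arbitrary $\sigma$) that puts the wavefunction on a grid, Fourier transforms it to $k$-space, and computes both variances. A Gaussian saturates the bound, so the product comes out at the minimum value $1/4$:

```python
import numpy as np

# Normalised Gaussian wavepacket on a grid; sigma is an illustrative choice.
N = 4096
x = np.linspace(-40.0, 40.0, N, endpoint=False)
dx = x[1] - x[0]
sigma = 2.0
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

var_x = np.sum(x**2 * np.abs(psi) ** 2) * dx  # <x> = 0 by symmetry

# Momentum-space wavefunction via FFT, with the unitary continuum normalisation.
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dk = k[1] - k[0]
var_k = np.sum(k**2 * np.abs(phi) ** 2) * dk  # <k> = 0 as well

print(var_x, var_k, var_x * var_k)  # sigma^2, 1/(4 sigma^2), and 0.25
```

The same two sums work for any wavefunction you put on the grid; only a Gaussian lands exactly on $1/4$, and every other state gives something larger.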
You ask if a particle has nonzero probability of existing everywhere. To be pedantic, a particle has zero probability of existing at any specific point, but it typically has a nonzero probability of existing in a region of any finite size. The infinite square well is an exception: the infinite potential around the box absolutely forbids the particle from being found outside it.
"Uncertainty" really is just a loose word to use here. It almost always means the standard deviation of the wavefunction.
I would boldly claim that this thought experiment (also known as the Heisenberg microscope) is simply the wrong picture for understanding the origin of the uncertainty principle. The reason is that it conflates uncertainty due to measurement with uncertainty inherent in the quantum state; nonetheless it has made its way into numerous textbooks and confused numerous undergraduates (including me) by involving quantum mechanical objects such as electrons and photons and producing results that have the factor $\hbar$ in them.
I will try to explain this confusing business, and your questions, to the best of my abilities in three parts: first, what the Heisenberg uncertainty principle is; second, why it is unique to quantum mechanics; and finally, why the Heisenberg microscope is the wrong way of understanding it. I am sorry that I may have to include a bit of maths from time to time, but I hope you will follow (and I hope I am right about this - do comment if I have made mistakes).
Firstly, what is the uncertainty principle? The best way I know of to understand it physically is the following scenario: imagine that you have prepared a huge number of identical quantum states, and you measure the position of half of them and the momentum of the rest, with perfect precision (see below). At the end of the day you will have a long list of positions and momenta, and you will notice that these results have spreads, due to the probabilistic nature of quantum states.
Here is where the uncertainty principle kicks in: regardless of what quantum state you prepared in the first place, if you calculate the uncertainties of the positions and the momenta from that long list, it will always be the case that they obey $\Delta x \Delta p \geq \hbar/2$. A more interesting way of phrasing it: you can never prepare a quantum state for which the product $\Delta x \Delta p$ calculated from the list is smaller than $\hbar/2$.
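A toy simulation of this scenario can be sketched as follows (my own illustration, with $\hbar = 1$ and a hypothetical minimum-uncertainty Gaussian state): the ideal position outcomes are drawn from $|\psi(x)|^2$ and the ideal momentum outcomes from $|\phi(p)|^2$, with no apparatus noise anywhere, and the sample product still sits at $\hbar/2$:

```python
import numpy as np

# Prepare many identical copies of one Gaussian state (hbar = 1). Measure x on
# half the copies and p on the other half, with each outcome drawn from the
# state's own distribution -- i.e. "perfect precision", zero apparatus error.
rng = np.random.default_rng(42)
hbar = 1.0
sigma_x = 1.5                    # arbitrary position spread of the prepared state
sigma_p = hbar / (2 * sigma_x)   # a Gaussian saturates the bound

xs = rng.normal(0.0, sigma_x, 200_000)  # positions sampled from |psi(x)|^2
ps = rng.normal(0.0, sigma_p, 200_000)  # momenta sampled from |phi(p)|^2

print(xs.std() * ps.std())  # ~ hbar/2 = 0.5, up to sampling noise
```

Any other choice of state makes the product larger; no choice makes it smaller. That is the content of the principle in this picture.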
Before moving on, it is worth discussing a few things about this imaginary scenario. The first is obviously what I mean by the phrase 'with perfect precision'. I certainly do not mean that there is some 'position' and 'momentum' that the quantum state has prior to measurement; I mean that the measurement results are due entirely to the quantum states themselves and are subject to no external disturbance by other physical systems. You may argue that this is physically impossible, since any experimental apparatus would introduce some perturbation of the system, but since we are living in an imaginary thought experiment, we get to decide what we can and cannot do.
And here is the point which is very important in the context of the problem: even in this ideal world, where we obtain positions and momenta directly from the quantum states, the uncertainty principle still holds. Throw away the microscope and any other fancy equipment; you still have uncertainty - and this property is fundamentally due to the nature of quantum states themselves.
Still need convincing that this is justified? Here we enter the second part, on uncertainty due to measurement. Look back at any experiment with a classical system: you will almost certainly find none with zero uncertainty, as there are bound to be errors introduced by the environment; nonetheless, that doesn't stop us from imagining a perfect experiment whose results are due entirely to the physical system we are studying. Say you are measuring the acceleration due to gravity in a lab. You can be certain that almost nobody will ever get exactly $9.80665$ metres per second squared (unless you are a cheater), because of errors due to gravitational attraction from surrounding objects, because the markings on your ruler are not fine enough, and so on. But you have no problem convincing yourself that under perfect, ideal conditions you would still get $9.80665$.
And the crux of the matter is this: uncertainty due to the environment (errors) happens to all systems, classical or quantum, yet the uncertainty principle applies only to quantum systems. In Newtonian mechanics, you can characterise the motion of a particle in one dimension by a pair of quantities $(x,p)$, position and momentum, and, following the experimental procedure we described in part one - preparing a huge number of identical states and measuring their positions and momenta - you can arrange that $\Delta x \Delta p < \hbar/2$. In fact, it is very easy: prepare a bunch of particles all with the same position and momentum, and $\Delta x = \Delta p = 0$. But in quantum mechanics this simply cannot be done, because we are dealing with an entirely different beast: instead of $(x,p)$, a quantum state is described by a wavefunction $|\psi\rangle$, and wavefunctions must obey the uncertainty principle.
So here we are at the third part: why the Heisenberg microscope is the wrong picture for understanding the origin of the uncertainty principle. I suspect that you can now already answer this: the thought experiment attributes the uncertainty principle to error introduced by the experiment, not to the quantum state itself. In a perfect experiment, according to the Heisenberg microscope, there would be no uncertainty; we could even conceive of measuring the position and momentum of the electron by other methods - say, shooting one electron from a gun and bouncing it off a wall (maybe?) - which, in the picture painted by the Heisenberg microscope, could give uncertainties below the bound. But this is simply not the case, and you simply cannot do that: the state is described by a wavefunction $|\psi\rangle$, not by a pair $(x,p)$, so it is simply wrong to use '$x$' or '$p$' to describe the electron.
This also leads to the complication about interactions that you mentioned in your question. The interaction between photon and electron cannot simply be described as 'momentum transfer', for this implicitly assumes that the physical states of the photon and electron are characterised by definite momenta. As stated before, the interaction can only be described in terms of $|\psi\rangle$; and to be absolutely strict, the best way of understanding such an interaction is through QED rather than this semi-classical picture. Nonetheless, let me reiterate: remove the interaction (whether it is the photon-electron interaction or whatever physical process you use to probe the electron), and you still have the uncertainty principle, because it is a fundamental property of a quantum state.
Regardless, I suspect the Heisenberg microscope is so successful because it mixes quantum mechanical interactions between electrons and photons with classical interpretations, producing results involving the infamous $\hbar$ simply by manipulating the errors introduced in an experiment. This gives us the illusion that we can intuitively understand the uncertainty principle, and that is simply not the case. I feel it's fitting to use this (mis)quote: 'certainly quantum mechanics has never allowed herself to be won; and at present every kind of intuition stands with sad and discouraged mien - if, indeed, it stands at all!' But this is, I guess, why we love quantum mechanics so much :)
Nick, don't be surprised that this is confusing. There are a lot of concepts intermixed in discussions of the uncertainty principle that are frequently not clearly understood and that get intertwined unintentionally.
Although one often sees the uncertainties stated in statistical terms, the standard deviation here does not directly require multiple observations of a sample to understand. Traditional statistics relies upon repeated sampling to develop a standard deviation; in quantum mechanics, however, the idea is more closely associated with properties of the Fourier transform.
To understand the Fourier transform one must first understand what a Fourier series is. The hyperlink will take you to a discussion of the Fourier series as it relates to sound. Starting at about minute two, you see a representation of a saw-tooth-like waveform. When the video shows you that the saw-tooth-like wave has many components, those components are determined by performing a Fourier transform. In many cases one transforms a time-series function into a frequency function (and frequency is directly proportional to energy), but the transform is equally applicable when transforming position into momentum.
Essentially, if one wants complete certainty in the value of the momentum (or energy), one must look at the entire position (or time) domain. Conversely, a perfectly definite position, when transformed into the momentum domain, requires the entire momentum domain. If one allows a little uncertainty in the position, one no longer requires the entire momentum domain.
This relationship is precisely defined for Fourier transforms. It is the real source of the uncertainty principle, and it does not require a statistical interpretation to understand.
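The tradeoff above is easy to see numerically (a sketch; the "signal" here is a generic discrete pulse, not any particular quantum state): a pulse localised at a single grid point has a perfectly flat spectrum, while a wide, smooth pulse concentrates its spectrum in a few low-frequency bins:

```python
import numpy as np

N = 1024

# Perfectly "definite position": a spike at one grid point.
f = np.zeros(N)
f[N // 2] = 1.0
F = np.abs(np.fft.fft(f))
print(F.min(), F.max())  # both 1.0: every frequency is present equally

# A wide, smooth pulse: the spectrum falls off within a few frequency bins.
x = np.arange(N) - N // 2
g = np.exp(-x**2 / (2 * 50.0**2))
G = np.abs(np.fft.fft(g))
print(G[10] / G[0])  # already tiny ten bins away from zero frequency
```

Squeezing the pulse narrower in position spreads its spectrum back out, and vice versa; the product of the two widths is what the Fourier transform bounds from below.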