You can use the results given in that website to good approximation as long as the distance between your two points is much less than the radius of the Earth.
If your two points are $A$ and $B$ and their latitudes and longitudes are $\lambda_A$, $\phi_A$, and $\lambda_B$, $\phi_B$ ($\lambda$ is latitude, $\phi$ is longitude), then the distance between them in latitude is 69.172 mi $\times$ $(\lambda_B - \lambda_A)$. (I am assuming $\lambda_A$ and $\lambda_B$ are in degrees.)
Distance in longitude is a bit more complicated. The latitudes $\lambda_A$ and $\lambda_B$ should not differ by very much. (This follows from the requirement above that the distance between the two points be much smaller than the radius of the Earth.) If this is the case, we can settle on a "compromise" latitude $\lambda_C = (\lambda_A+\lambda_B)/2$. Then, the distance in longitude is
$69.172$ mi $\times \cos (\lambda_C) \times (\phi_B - \phi_A)$.
Again, I am assuming that all angles are in degrees. I am also assuming you have a $\cos$ function that takes its argument in degrees.
So now you have your distance in lat and your distance in lon, so there's your vector. If you calculate as I describe, your numbers will be in miles. To get feet, multiply them both by 5280 ft/mi.
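The recipe above can be sketched in Python (the 69.172 mi/degree figure is the one from the answer; note that Python's `math.cos` works in radians rather than degrees, so we convert first):

```python
import math

MILES_PER_DEGREE = 69.172  # length of one degree of latitude, per the answer
FEET_PER_MILE = 5280

def flat_vector_feet(lat_a, lon_a, lat_b, lon_b):
    """Approximate vector from A to B in feet (east, north), valid when
    the points are much closer together than the radius of the Earth.
    All angles are in degrees."""
    lat_c = (lat_a + lat_b) / 2.0                 # "compromise" latitude
    north = MILES_PER_DEGREE * (lat_b - lat_a)    # distance in latitude, miles
    east = MILES_PER_DEGREE * math.cos(math.radians(lat_c)) * (lon_b - lon_a)
    return (east * FEET_PER_MILE, north * FEET_PER_MILE)
```

For example, one degree of latitude at any longitude comes out to $69.172 \times 5280$ ft, and a degree of longitude at latitude $60°$ is half that, since $\cos 60° = 1/2$.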
What to do
Express $A,B,C$ using Cartesian Coordinates in $\mathbb R^3$. Then compute
$$D=\bigl((A\times B)\times C\bigr)\times(A\times B)$$
Divide that vector by its length to project it onto the sphere (with the center of the sphere as center of projection). Check whether you have the correct signs; the computation might instead result in the point on the opposite side of the earth, in which case you'd simply flip all coordinate signs. The correct point is likely the one closer to e.g. the point $A+B$, so you can simply try both alternatives and choose the correct one.
Then turn the resulting Cartesian vector back into latitude and longitude.
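A minimal sketch of the whole procedure in Python (the helper names are my own; the sign check here picks the candidate on the same side of the sphere as $C$, which serves the same purpose as comparing against $A+B$):

```python
import math

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def normalize(v):
    n = math.sqrt(sum(c*c for c in v))
    return tuple(c / n for c in v)

def to_cartesian(lat_deg, lon_deg):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat)*math.cos(lon),
            math.cos(lat)*math.sin(lon),
            math.sin(lat))

def to_latlon(v):
    x, y, z = v
    lat = math.degrees(math.asin(max(-1.0, min(1.0, z))))
    return (lat, math.degrees(math.atan2(y, x)))

def closest_point_on_great_circle(a, b, c):
    """Point D on the great circle through A and B, computed as
    D = ((A x B) x C) x (A x B), then projected onto the unit sphere.
    Points are (lat, lon) pairs in degrees."""
    A, B, C = (to_cartesian(*p) for p in (a, b, c))
    p = cross(A, B)
    d = normalize(cross(cross(p, C), p))
    # Resolve the antipodal ambiguity: flip signs if d points away from C.
    if sum(di*ci for di, ci in zip(d, C)) < 0:
        d = tuple(-di for di in d)
    return to_latlon(d)
```

As a sanity check, with $A$ and $B$ on the equator and $C$ at latitude $10°$, longitude $45°$, the result is the equator point at longitude $45°$.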
How this works
The description above was obtained by viewing the sphere as the real projective plane. In that view, a point of the real projective plane corresponds to two antipodal points on the sphere, which is the source of the sign ambiguity I mentioned.
$P=A\times B$ is a vector orthogonal to both $A$ and $B$. Every great circle which is orthogonal to the great circle $AB$ will pass through the projection of $P$ onto the sphere. $Q=P\times C$ is orthogonal to both $P$ and $C$, so it is orthogonal to the great circle which connects $C$ with $P$ (resp. its projection onto the sphere). That great circle is the one which also connects $C$ and your desired $D$. $D=Q\times P$ is orthogonal to both $P$ and $Q$, so it lies both on the great circle $AB$ and the great circle $CD$. Therefore it must point in the direction of the desired point. Project onto the sphere, choose the correct point from the antipodal pair, and you have the solution.
Best Answer
If a spherical earth is good enough, you can convert the three points to Cartesian coordinates: $x=R \cos \phi \cos \lambda, y=R \cos \phi \sin \lambda, z=R \sin \phi$ (here $\phi$ is latitude and $\lambda$ is longitude). Then subtract to get the two vectors from where you are to the other two points and use the dot product formula. This will give the angle in space between the vectors.
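A sketch of this, assuming a mean Earth radius of 3958.8 mi (the function names are my own):

```python
import math

R_MILES = 3958.8  # assumed mean Earth radius in miles

def to_cartesian(lat_deg, lon_deg, r=R_MILES):
    """x = R cos(lat) cos(lon), y = R cos(lat) sin(lon), z = R sin(lat)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (r*math.cos(lat)*math.cos(lon),
            r*math.cos(lat)*math.sin(lon),
            r*math.sin(lat))

def angle_at(origin, p1, p2):
    """Angle in degrees between the straight-line (chord) vectors from
    `origin` to `p1` and to `p2`; points are (lat, lon) pairs in degrees."""
    o, a, b = (to_cartesian(*p) for p in (origin, p1, p2))
    u = tuple(ai - oi for ai, oi in zip(a, o))
    v = tuple(bi - oi for bi, oi in zip(b, o))
    dot = sum(ui*vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui*ui for ui in u))
    nv = math.sqrt(sum(vi*vi for vi in v))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))
```

Note this is the angle between the chords through the Earth, not an angle measured along the surface, which is what the dot-product approach gives you.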