In some degenerate cases there may be no unique such point (for instance, if all the lines are parallel). However, there is a single solution in the general case.
I assume you're trying to solve the more general problem where the lines are not required to intersect exactly in a single point (otherwise there's a much simpler solution than least squares).
Derivation:
You say that every line is represented by two points. Let's instead work in the convention where a line is represented by one point and a direction vector, which is just the vector difference of those two points. That is, instead of describing a line by points $\mathbf{a}$ and $\mathbf{b}$, we'll describe it by a point $\mathbf{a}$ and a vector $\mathbf{d}$, where $\mathbf{d}=\mathbf{b}-\mathbf{a}$.
Our point (which we're trying to find) is $\mathbf{c}$.
The distance from this point to a line is:
$H=\frac{\|(\mathbf{c}-\mathbf{a})\times\mathbf{d}\|}{\|\mathbf{d}\|}$
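As a quick numerical sanity check of this formula (a minimal NumPy sketch; the function name is made up):

```python
import numpy as np

def point_line_distance(c, a, d):
    """Distance from point c to the line through a with direction d:
    H = ||(c - a) x d|| / ||d||."""
    c, a, d = np.asarray(c), np.asarray(a), np.asarray(d)
    return np.linalg.norm(np.cross(c - a, d)) / np.linalg.norm(d)

# Distance from the origin to the vertical line through (1, 0, 0):
print(point_line_distance([0, 0, 0], [1, 0, 0], [0, 0, 1]))  # 1.0
```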
Using the identity $(\mathbf{a}\times\mathbf{b})\cdot(\mathbf{a}\times\mathbf{b})=\|\mathbf{a}\|^2\|\mathbf{b}\|^2-(\mathbf{a}\cdot\mathbf{b})^2$
we have:
$H^2=\frac{\|\mathbf{c}-\mathbf{a}\|^2\|\mathbf{d}\|^2-((\mathbf{c}-\mathbf{a})\cdot\mathbf{d})^2}{\|\mathbf{d}\|^2}$
$H^2 = \|\mathbf{c}-\mathbf{a}\|^2-\frac{((\mathbf{c}-\mathbf{a})\cdot\mathbf{d})^2}{\|\mathbf{d}\|^2}$
The sum of the squared distances from the point $\mathbf{c}$ to all the lines is just the sum of the above expression over all the lines. The problem is to minimize this sum, which depends on the variable $\mathbf{c}$ (really 3 variables, the components of $\mathbf{c}$). This is a standard least squares problem, which generally has a single solution (unless there's a degeneracy).
Solving the least squares problem for this specific case:
Since we want to find the $\mathbf{c}$ that minimizes this sum, the derivative of the sum with respect to $\mathbf{c}$ must be zero. In other words:
$\frac{d(H^2)}{d\mathbf{c}}=2(\mathbf{c}-\mathbf{a})-2\mathbf{d}\frac{(\mathbf{c}-\mathbf{a})\cdot\mathbf{d}}{\|\mathbf{d}\|^2}$
$0=\sum_{i=0}^m{\mathbf{c}-\mathbf{a}^{(i)}-\mathbf{d}^{(i)}\frac{(\mathbf{c}-\mathbf{a}^{(i)})\cdot\mathbf{d}^{(i)}}{\|\mathbf{d}^{(i)}\|^2}}$
This gives 3 equations (since it's a vector equation) with 3 unknowns (components of $\mathbf{c}$).
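Collecting the $\mathbf{c}$ terms, the system can be written as $\left(\sum_i \left(I-\hat{\mathbf{d}}^{(i)}\hat{\mathbf{d}}^{(i)\mathsf{T}}\right)\right)\mathbf{c}=\sum_i \left(I-\hat{\mathbf{d}}^{(i)}\hat{\mathbf{d}}^{(i)\mathsf{T}}\right)\mathbf{a}^{(i)}$, where $\hat{\mathbf{d}}^{(i)}=\mathbf{d}^{(i)}/\|\mathbf{d}^{(i)}\|$. A minimal NumPy sketch of solving it (the function name is made up):

```python
import numpy as np

def nearest_point_to_lines(A, B):
    """Least-squares point for lines through point pairs (a_i, b_i).
    Solves sum_i (I - d_i d_i^T)(c - a_i) = 0 with unit directions d_i."""
    A = np.asarray(A, dtype=float)
    D = np.asarray(B, dtype=float) - A
    D /= np.linalg.norm(D, axis=1, keepdims=True)    # unit direction vectors
    P = np.eye(3) - D[:, :, None] * D[:, None, :]    # per-line projectors, shape (m, 3, 3)
    # Degenerate configurations (e.g. all lines parallel) make this matrix singular.
    return np.linalg.solve(P.sum(axis=0), np.einsum('ijk,ik->j', P, A))

# The x-axis and the line through (0, 1, 0) parallel to the z-axis:
print(nearest_point_to_lines([[0, 0, 0], [0, 1, 0]],
                             [[1, 0, 0], [0, 1, 1]]))  # [0.  0.5 0. ]
```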
In order to get to the minimum distance problem quickly, we quote a formula for the distance from the point $(x_0,y_0)$ to the line with equation $px+qy+r=0$. This distance is
$$\frac{|px_0+qy_0+r|}{\sqrt{p^2+q^2}}.$$
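In code (a short check in plain Python; the function name is made up):

```python
import math

def dist_point_line(x0, y0, p, q, r):
    """Distance from (x0, y0) to the line p*x + q*y + r = 0."""
    return abs(p * x0 + q * y0 + r) / math.hypot(p, q)

print(dist_point_line(0, 0, 3, 4, 5))  # |3*0 + 4*0 + 5| / 5 = 1.0
```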
The site linked to gives a proof, as does Wikipedia. The distance from a point to a line has also been discussed more than once on StackExchange.
For our minimization problem, we can assume that one of the points is $(0,0)$, and the other is, say, $(a,b)$. So the line has equation $bx-ay=0$. We can assume that $a$ and $b$ are relatively prime, for if $d$ is their greatest common divisor, the same line is determined by $(0,0)$ and $(a/d,b/d)$.
Then by Bezout's Theorem, we can find integers $x_0$, $y_0$ between $0$ and $\max(a,b)$ such that $|bx_0-ay_0|=1$. Thus the smallest non-zero value of $|bx-ay|$ as $(x,y)$ ranges over our grid is $1$.
So the minimum possible non-zero distance from a gridpoint to our line is $\dfrac{1}{\sqrt{a^2+b^2}}$.
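A quick brute-force illustration of these two claims (plain Python; the helper name is made up, and the grid is taken to have coordinates in $\{0,\dots,n\}$):

```python
from math import gcd, hypot

def min_nonzero_distance(a, b, n):
    """Minimum non-zero distance from a grid point (coordinates 0..n)
    to the line b*x - a*y = 0 through (0, 0) and (a, b)."""
    best = min(abs(b * x - a * y)
               for x in range(n + 1) for y in range(n + 1)
               if b * x != a * y)
    return best / hypot(a, b)

a, b, n = 4, 7, 7
assert gcd(a, b) == 1
print(min_nonzero_distance(a, b, n), 1 / hypot(a, b))  # both 1/sqrt(65)
```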
Now we want to maximize $a^2+b^2$ as $(a,b)$ ranges over relatively prime pairs on our grid. Since each coordinate is at most $n$, the unconstrained maximum would be at $a=b=n$; but $\gcd(n,n)=n$, so for $n>1$ that pair is not relatively prime. If $n=1$, the maximum value of $\sqrt{a^2+b^2}$ is achieved when $a=b=1$.
Suppose now that $n>1$. Then the maximum value of $a^2+b^2$, given that $\gcd(a,b)=1$ and $(a,b)$ is on our grid, is attained at $a=n-1$, $b=n$, and at $a=n$, $b=n-1$: consecutive integers are always relatively prime, and every pair other than the excluded $(n,n)$ satisfies $a^2+b^2\le n^2+(n-1)^2$. The minimum non-zero distance is therefore
$$\frac{1}{\sqrt{2n^2-2n+1}}.$$
Comment: For our particular problem, we do not need Bezout's Theorem, since if $n>1$ then the distance from $(1,1)$ to the line through $(0,0)$ and $(n-1,n)$ is $1/\sqrt{2n^2-2n+1}$. It is clear that we can't do better, since $|bx_0-ay_0|$ is at least $1$. However, if the $n\times n$ grid is replaced by an $m\times n$ grid, the natural analysis uses Bezout's Theorem.
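As a brute-force check of the final formula (a Python sketch, again assuming grid coordinates in $\{0,\dots,n\}$):

```python
from math import hypot

def min_positive_distance(n):
    """Smallest non-zero distance between a grid point and a line through
    two distinct grid points, all coordinates in {0, ..., n}."""
    pts = [(x, y) for x in range(n + 1) for y in range(n + 1)]
    best = float('inf')
    for x1, y1 in pts:
        for x2, y2 in pts:
            if (x1, y1) == (x2, y2):
                continue
            a, b = x2 - x1, y2 - y1                  # direction of the line
            for x0, y0 in pts:
                num = abs(b * (x0 - x1) - a * (y0 - y1))
                if num:
                    best = min(best, num / hypot(a, b))
    return best

for n in (2, 3, 4):
    print(n, min_positive_distance(n), 1 / (2 * n * n - 2 * n + 1) ** 0.5)
```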
As pointed out by the others, minimising the sum of distances is different from minimising the sum of squared distances. The former requires the use of an optimisation package. The latter is a least-squares problem that is fairly well studied, with free solvers in abundance. Specifically:
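For instance (a sketch using NumPy and SciPy; the names here are illustrative): the sum of squared distances reduces to the small linear system derived above, while the plain sum of distances can be handed to a generic optimiser.

```python
import numpy as np
from scipy.optimize import minimize

def sum_of_distances(c, A, D):
    """Sum over all lines of the distance from point c to the line (a_i, unit d_i)."""
    r = A - c                                        # vectors from c to the anchor points
    proj = (r * D).sum(axis=1, keepdims=True) * D    # components of r along the lines
    return np.linalg.norm(r - proj, axis=1).sum()    # norms of the perpendicular parts

# Lines given by point pairs; D holds unit direction vectors.
A = np.array([[0., 0., 0.], [0., 1., 0.]])
B = np.array([[1., 0., 0.], [0., 1., 1.]])
D = B - A
D /= np.linalg.norm(D, axis=1, keepdims=True)

# Sum of squared distances: closed form via the normal equations.
P = np.eye(3) - D[:, :, None] * D[:, None, :]
c_sq = np.linalg.solve(P.sum(axis=0), np.einsum('ijk,ik->j', P, A))

# Sum of plain distances: generic optimiser, seeded with the least-squares point.
c_abs = minimize(sum_of_distances, c_sq, args=(A, D)).x
print(c_sq, c_abs)
```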