Suppose that $y_1(t), \ldots, y_n(t)$ are solutions of $\frac{d^n y}{dt^n} + p_{n-1}(t) \frac{d^{n-1} y}{dt^{n-1}} + \cdots + p_1(t) \frac{dy}{dt} + p_0(t) y = 0$, and suppose that their Wronskian vanishes at $t = t_0$, i.e.
\begin{equation*}
\left|
\begin{array}{cccc}
y_1(t_0) & y_2(t_0) & \cdots & y_n(t_0) \\
y_1'(t_0) & y_2'(t_0) & \cdots & y_n'(t_0) \\
\vdots & \vdots & \ddots & \vdots \\
y_1^{(n-1)}(t_0) & y_2^{(n-1)}(t_0) & \cdots & y_n^{(n-1)}(t_0)
\end{array}
\right| = 0.
\end{equation*}
Then the corresponding matrix is not invertible, and the system of equations
\begin{equation*}
\begin{array}{ccccccccc}
c_1 y_1(t_0) &+& c_2 y_2(t_0) &+& \cdots &+& c_n y_n(t_0) &=& 0 \\
c_1 y_1'(t_0) &+& c_2 y_2'(t_0) &+& \cdots &+& c_n y_n'(t_0) &=& 0 \\
\vdots & & \vdots & & & & \vdots & & \vdots \\
c_1 y_1^{(n-1)}(t_0) &+& c_2 y_2^{(n-1)}(t_0) &+& \cdots &+& c_n y_n^{(n-1)}(t_0) &=& 0
\end{array}
\end{equation*}
has a nontrivial solution $c_1, c_2, \ldots, c_n$, i.e. one with the $c_i$ not all zero.
Let $y(t) = c_1 y_1(t) + \cdots + c_n y_n(t)$. Because $y(t)$ is a linear combination of solutions of the differential equation, $y(t)$ is also a solution of the differential equation. Additionally, because the weights satisfy the above system of equations, we have $y(t_0) = y'(t_0) = \cdots = y^{(n-1)}(t_0) = 0$.
These initial conditions and the original differential equation define an initial-value problem, of which $y(t)$ is a solution. If $p_0(t), p_1(t), \ldots, p_{n-1}(t)$ are continuous, then any initial-value problem associated with the differential equation has a unique solution. Clearly the zero function $y^*(t) \equiv 0$ solves this initial-value problem; since $y(t)$ solves the same problem, uniqueness forces $y(t) = 0$ for all $t$, not just at $t = t_0$.
We now have $c_1 y_1(t) + \cdots + c_n y_n(t) = 0$ for all $t$, where $c_1, \ldots, c_n$ are not all zero. Thus the functions $y_1(t), \ldots, y_n(t)$ are linearly dependent.
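The link between linear dependence and a vanishing Wronskian can be illustrated with a quick SymPy check (a sketch; the particular pair of solutions of $y'' + y = 0$ is my own illustrative choice, not from the text):

```python
from sympy import symbols, sin, wronskian, simplify

t = symbols('t')

# y1 and y2 both solve y'' + y = 0 and are linearly dependent (y2 = 3*y1).
y1 = sin(t)
y2 = 3 * sin(t)

W = wronskian([y1, y2], t)   # determinant of [[y1, y2], [y1', y2']]
print(simplify(W))           # 0 for all t, as the argument predicts
```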
Conversely, if the functions $y_1(t), \ldots, y_n(t)$ are linearly dependent, then the system of equations
\begin{equation*}
\begin{array}{ccccccccc}
c_1 y_1(t) &+& c_2 y_2(t) &+& \cdots &+& c_n y_n(t) &=& 0 \\
c_1 y_1'(t) &+& c_2 y_2'(t) &+& \cdots &+& c_n y_n'(t) &=& 0 \\
\vdots & & \vdots & & & & \vdots & & \vdots \\
c_1 y_1^{(n-1)}(t) &+& c_2 y_2^{(n-1)}(t) &+& \cdots &+& c_n y_n^{(n-1)}(t) &=& 0
\end{array}
\end{equation*}
has a nontrivial solution for every $t$; hence the corresponding matrix is not invertible for any $t$, and $W[y_1, \ldots, y_n](t) = 0$ for all $t$.
You can use Abel's identity directly to show that if the Wronskian of any two solutions of the differential equation $y'' + p(x) y' + q(x) y = 0$ (on an interval $I$) is constant, then $p(x) = 0$: if $\int_{x_0}^{x} p(t)\, dt$ is constant for all $x \in I$, then $p \equiv 0$ on $I$.
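As a concrete instance of that remark (my own example, not from the question): for $y'' + y = 0$ we have $p(x) = 0$, and the Wronskian of the standard solution pair is indeed constant:

```python
from sympy import symbols, sin, cos, wronskian, simplify

x = symbols('x')

# cos(x) and sin(x) solve y'' + y = 0, where p(x) = 0.
W = wronskian([cos(x), sin(x)], x)   # cos(x)**2 + sin(x)**2

print(simplify(W))  # 1: constant, consistent with p(x) = 0 in Abel's identity
```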
Best Answer
With
$w[y_1, y_2] = \det \left ( \begin{bmatrix} y_1 & y_2 \\ y_1' & y_2' \end {bmatrix} \right ) = y_1y_2' - y_2y_1', \tag 1$
we find
$w'[y_1, y_2] = (y_1y_2' - y_2y_1')' = y_1'y_2' + y_1y_2'' - y_2'y_1' - y_2y_1'' = y_1y_2'' - y_1''y_2; \tag 2$
if we now write the equation
$y'' + py' + qy = 0 \tag 3$
in the form
$y'' = -py' - qy, \tag 4$
we have
$y_i'' = -py_i' - qy_i, \; i = 1, 2, \tag 5$
which we may substitute into (2), yielding
$w'[y_1, y_2] = y_1y_2'' - y_1''y_2 = y_1(-py_2' - qy_2) - y_2(-py_1' - qy_1) = -py_1y_2' - qy_1y_2 + py_1'y_2 + qy_1y_2 = p(y_1'y_2 - y_1y_2') = -pw[y_1, y_2]; \tag 6$
we thus see that $w[y_1, y_2]$ satisfies the ordinary differential equation
$w'[y_1, y_2] = -pw[y_1, y_2]; \tag 7$
the unique solution to this equation taking the value $w[y_1, y_2](t_0)$ at $t = t_0$ is readily seen to be
$w[y_1, y_2](t) = w[y_1, y_2](t_0) \exp \left ( -\displaystyle \int_{t_0}^t p(s) \; ds \right ). \tag 8$
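Both the computation (2)–(7) and the solution formula (8) can be sanity-checked with SymPy (a sketch; the concrete equation $y'' - 3y' + 2y = 0$ used in the second check is my own illustrative choice):

```python
from sympy import (Function, exp, expand, integrate, simplify, symbols,
                   wronskian)

t, s = symbols('t s')

# --- Check the derivation (2)-(7) symbolically, for generic p, q, y1, y2. ---
p, q = Function('p')(t), Function('q')(t)
y1, y2 = Function('y1')(t), Function('y2')(t)

w = y1 * y2.diff(t) - y2 * y1.diff(t)          # the Wronskian (1)
# substitute y_i'' = -p*y_i' - q*y_i, as in (5)
ode_subs = {y1.diff(t, 2): -p * y1.diff(t) - q * y1,
            y2.diff(t, 2): -p * y2.diff(t) - q * y2}
residual = w.diff(t).subs(ode_subs) + p * w    # should vanish by (6)/(7)
print(simplify(expand(residual)))              # 0

# --- Check the solution formula (8) on a concrete equation. ---
# y'' - 3y' + 2y = 0 has solutions e^t and e^{2t}; here p = -3 and t0 = 0.
u1, u2 = exp(t), exp(2 * t)
W = wronskian([u1, u2], t)                            # direct computation
abel = W.subs(t, 0) * exp(-integrate(-3, (s, 0, t)))  # right-hand side of (8)
print(simplify(W - abel))                             # 0
```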
Since
$\exp \left (- \displaystyle \int_{t_0}^t p(s) \; ds \right ) \ne 0, \; \forall t \in I, \tag 9$
we deduce that
$w[y_1, y_2](t) \ne 0, \forall t \in I \Longleftrightarrow \exists t_0 \in I, \; w[y_1, y_2](t_0) \ne 0. \tag{10}$
(10) is in fact an expression of the equivalence of points (II) and (III); to see that these imply (IV), we assume that $y_1$ and $y_2$ are in fact linearly dependent; then there exist
$a, b \in \Bbb R, \tag{11}$
not both $0$, with
$ay_1 + by_2 = 0; \tag{12}$
then
$ay_1' + by_2' = 0, \tag{13}$
and the columns of the matrix
$\begin{bmatrix} y_1 & y_2 \\ y_1' & y_2' \end {bmatrix} \tag{14}$
are linearly dependent; but this implies
$w[y_1, y_2](t) = 0, \; \forall t \in I, \tag{15}$
contradicting (II) and (III); thus the linear independence of $y_1$ and $y_2$ is established. Likewise, if we assume (IV), then the columns of the matrix (14) must be linearly independent, and hence
$w[y_1, y_2](t) \ne 0, \forall t \in I; \tag{16}$
thus (II) and (III) hold.
Finally, the set $S$ of all solutions to (3) is well-known to be a two-dimensional vector space over $\Bbb R$; clearly
$\langle y_1, y_2 \rangle \subset S; \tag{17}$
since $y_1$ and $y_2$ are linearly independent,
$\dim \langle y_1, y_2 \rangle = 2 = \dim S, \tag{18}$
from which it follows that
$\langle y_1, y_2 \rangle = S, \tag{19}$
which is (I); likewise, (I) implies that $y_1$ and $y_2$ are linearly independent, for otherwise
$\dim \langle y_1, y_2 \rangle < 2 = \dim S, \tag{20}$
and
$\langle y_1, y_2 \rangle = S \tag{21}$
is impossible since the dimensions don't agree.
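The dimension count (17)–(21) can be illustrated on a concrete equation (my own example, not from the answer): SymPy's general solution of $y'' - 3y' + 2y = 0$ is a two-parameter combination of two independent exponential solutions, matching $\dim S = 2$.

```python
from sympy import Eq, Function, checkodesol, dsolve, symbols

t = symbols('t')
y = Function('y')

# y'' - 3y' + 2y = 0: two independent solutions, e^t and e^{2t}.
ode = Eq(y(t).diff(t, 2) - 3 * y(t).diff(t) + 2 * y(t), 0)
sol = dsolve(ode, y(t))

print(sol.rhs)                   # a combination of exp(t) and exp(2*t)
print(checkodesol(ode, sol)[0])  # True: it really solves the equation

# the general solution carries exactly two free constants, so dim S = 2
constants = sol.rhs.free_symbols - {t}
print(len(constants))            # 2
```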
Thus we see that
$[(II) \equiv (III)] \equiv (IV) \equiv (I), \tag{22}$
i.e., that
$(I) \equiv (II) \equiv (III) \equiv (IV), \tag{23}$
$OE\Delta$.