Suppose that $y_1(t), \ldots, y_n(t)$ are solutions of $\frac{d^n y}{dt^n} + p_{n-1}(t) \frac{d^{n-1} y}{dt^{n-1}} + \cdots + p_1(t) \frac{dy}{dt} + p_0(t) y = 0$, and suppose that their Wronskian vanishes at $t = t_0$, i.e.
\begin{equation*}
\left|
\begin{array}{cccc}
y_1(t_0) & y_2(t_0) & \cdots & y_n(t_0) \\
y_1'(t_0) & y_2'(t_0) & \cdots & y_n'(t_0) \\
\vdots & \vdots & \ddots & \vdots \\
y_1^{(n-1)}(t_0) & y_2^{(n-1)}(t_0) & \cdots & y_n^{(n-1)}(t_0)
\end{array}
\right| = 0.
\end{equation*}
Then the corresponding matrix is not invertible, and the system of equations
\begin{equation*}
\begin{array}{ccccccccc}
c_1 y_1(t_0) &+& c_2 y_2(t_0) &+& \cdots &+& c_n y_n(t_0) &=& 0 \\
c_1 y_1'(t_0) &+& c_2 y_2'(t_0) &+& \cdots &+& c_n y_n'(t_0) &=& 0 \\
\vdots & & \vdots & & \ddots & & \vdots & & \vdots \\
c_1 y_1^{(n-1)}(t_0) &+& c_2 y_2^{(n-1)}(t_0) &+& \cdots &+& c_n y_n^{(n-1)}(t_0) &=& 0
\end{array}
\end{equation*}
has a nontrivial solution, i.e., there exist $c_1, c_2, \ldots, c_n$, not all zero, satisfying it.
Let $y(t) = c_1 y_1(t) + \cdots + c_n y_n(t)$. Because $y(t)$ is a linear combination of solutions of the differential equation, $y(t)$ is also a solution of the differential equation. Additionally, because the weights satisfy the above system of equations, we have $y(t_0) = y'(t_0) = \cdots = y^{(n-1)}(t_0) = 0$.
These initial conditions, together with the original differential equation, form an initial-value problem, of which $y(t)$ is a solution. If $p_0(t), p_1(t), \ldots, p_{n-1}(t)$ are continuous, then every initial-value problem associated with the differential equation has a unique solution. Clearly the zero function $y^*(t) = 0$ solves this initial-value problem; since $y(t)$ solves the same initial-value problem, uniqueness forces $y(t) = 0$ for all $t$, not just at $t = t_0$.
We now have $c_1 y_1(t) + \cdots + c_n y_n(t) = 0$ for all $t$, where $c_1, \ldots, c_n$ are not all zero. Thus the functions $y_1(t), \ldots, y_n(t)$ are linearly dependent.
Conversely, if the functions $y_1(t), \ldots, y_n(t)$ are linearly dependent, then the system of equations
\begin{equation*}
\begin{array}{ccccccccc}
c_1 y_1(t) &+& c_2 y_2(t) &+& \cdots &+& c_n y_n(t) &=& 0 \\
c_1 y_1'(t) &+& c_2 y_2'(t) &+& \cdots &+& c_n y_n'(t) &=& 0 \\
\vdots & & \vdots & & \ddots & & \vdots & & \vdots \\
c_1 y_1^{(n-1)}(t) &+& c_2 y_2^{(n-1)}(t) &+& \cdots &+& c_n y_n^{(n-1)}(t) &=& 0
\end{array}
\end{equation*}
has a nontrivial solution for every $t$. Hence the corresponding matrix is not invertible for any $t$, and $W[y_1, \ldots, y_n](t) = 0$ for all $t$.
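The theorem above can be sanity-checked numerically. Here is a minimal sketch in Python (the helper name `wronskian` is my own, not from the text): for $y'' + y = 0$, the linearly independent solutions $\cos t$ and $\sin t$ have Wronskian $\cos^2 t + \sin^2 t = 1$ at every $t$, while the dependent pair $\sin t$, $2\sin t$ has Wronskian identically $0$.

```python
import math

def wronskian(y, yp, z, zp, t):
    """W[y, z](t) = y(t) z'(t) - y'(t) z(t), with derivatives supplied by hand."""
    return y(t) * zp(t) - yp(t) * z(t)

# Independent solutions of y'' + y = 0: W(t) = cos^2 t + sin^2 t = 1 for all t.
w_indep = [wronskian(math.cos, lambda t: -math.sin(t),
                     math.sin, math.cos, t)
           for t in (-2.0, 0.0, 1.5)]

# Dependent solutions (one a constant multiple of the other): W(t) = 0 for all t.
w_dep = [wronskian(math.sin, math.cos,
                   lambda t: 2 * math.sin(t), lambda t: 2 * math.cos(t), t)
         for t in (-2.0, 0.0, 1.5)]
```

The sample points are arbitrary; by the theorem, for solutions of such an equation the Wronskian is either nonzero at every $t$ or zero at every $t$.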
There are two points which are wrong:
First, the Wronskian of $x^2$ and $1$ is not identically $0$:
$$W(y_1, y_2)(x) = \begin{vmatrix} x^2 & 1 \\ 2x & 0 \end{vmatrix} = -2x.$$
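This determinant is easy to check numerically. A short Python sketch (the helper name `det2` is mine): with $y_1 = x^2$, $y_1' = 2x$, $y_2 = 1$, $y_2' = 0$, the $2 \times 2$ determinant evaluates to $-2x$ at each sample point.

```python
# W(x^2, 1)(x) = | x^2  1 ; 2x  0 | = x^2 * 0 - 1 * 2x = -2x.
def det2(y1, dy1, y2, dy2, x):
    """2x2 Wronskian determinant of (y1, y2) at x, derivatives given by hand."""
    return y1(x) * dy2(x) - dy1(x) * y2(x)

w_at = {x: det2(lambda x: x**2, lambda x: 2 * x,
                lambda x: 1.0, lambda x: 0.0, x)
        for x in (-1.0, 0.0, 3.0)}
# each value agrees with -2x, so the Wronskian is nonzero except at x = 0
```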
Second: it is true for all differentiable functions $y_1$ and $y_2$ that if $W(y_1, y_2)(x_0) \neq 0$ for at least one $x_0$, then $y_1$ and $y_2$ are linearly independent. But the converse is not true in general: if $y_1$ and $y_2$ are linearly independent, their Wronskian is not necessarily nonzero. Consider as a counterexample $y_1 = x^2/2$ and $y_2 = x\lvert x \rvert/2$. The converse does hold if you additionally assume that $y_1$ and $y_2$ are solutions of a linear second-order differential equation with continuous coefficients.
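The counterexample can be made concrete. A small Python sketch (function names are mine): $y_1 = x^2/2$ and $y_2 = x\lvert x\rvert/2$ are linearly independent on $\mathbb{R}$, since $y_2 = y_1$ for $x \geq 0$ but $y_2 = -y_1$ for $x < 0$, so no single constant $c$ gives $y_2 = c\,y_1$ everywhere; yet their Wronskian is $0$ at every $x$.

```python
def y1(x):
    return x * x / 2

def dy1(x):
    return x

def y2(x):
    return x * abs(x) / 2

def dy2(x):
    # d/dx (x|x|/2) = |x|, valid for all x including x = 0
    return abs(x)

def W(x):
    # W(y1, y2)(x) = (x^2/2)|x| - x * (x|x|/2) = 0 for every x
    return y1(x) * dy2(x) - dy1(x) * y2(x)

w_vals = [W(x) for x in (-2.0, -0.5, 0.0, 0.5, 2.0)]   # all zero
ratios = {x: y2(x) / y1(x) for x in (-2.0, 2.0)}       # ratio flips sign: -1 vs 1
```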
Best Answer
You don't need the Wronskian to know these two functions are linearly independent on the interval $[-1,1]$, because two functions are linearly dependent on an interval exactly when one is a constant multiple of the other there. That is clearly not the case here. These functions are dependent on $[0,1]$, since they are identical there, and they are also dependent on $[-1,0]$.
At https://math.libretexts.org/Bookshelves/Analysis/Supplemental_Modules_(Analysis)/Ordinary_Differential_Equations/3%3A_Second_Order_Linear_Differential_Equations/3.6%3A_Linear_Independence_and_the_Wronskian you can read the relevant theorem.
Note the implication there: if the Wronskian is nonzero then the functions are independent. You are invoking the converse. But nowhere does it say that an identically zero Wronskian implies dependence.
Your example shows that the converse is in fact false. You can find it in the wikipedia page that @ChristianBlatter notes in his comment.