Is it possible for two vector functions of, for the moment's simplicity, one variable to be both linearly independent and linearly dependent?
The reason I'm asking is that a book of mine (this is not homework) poses the following exercise:
Let $x^{(1)}(t)=\begin{pmatrix} e^t \\ te^t\end{pmatrix}$ and $x^{(2)}(t) = \begin{pmatrix} 1 \\ t \end{pmatrix}$. Show that $x^{(1)}(t)$ and $x^{(2)}(t)$ are linearly dependent at each point in the interval $0 \le t \le 1$.
Nevertheless, show that $x^{(1)}(t)$ and $x^{(2)}(t)$ are linearly independent on $0 \le t \le 1$.
I would think that they're linearly dependent because $x^{(1)}(t)$ can simply be multiplied by the scalar $e^{-t}$ (this is allowed because the exponential is never zero) to obtain $x^{(2)}(t)$ for all $t$, i.e. $x^{(1)}(t) = e^t\,x^{(2)}(t)$, but because of how the question is phrased I'm not so sure.
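That pointwise relation is easy to check numerically. A minimal sketch (using NumPy, purely illustrative; the function names `x1` and `x2` are my own):

```python
import numpy as np

# The two vector functions from the exercise.
def x1(t):
    return np.array([np.exp(t), t * np.exp(t)])

def x2(t):
    return np.array([1.0, t])

# At each fixed t, x1(t) = e^t * x2(t), so the two *vectors*
# x1(t) and x2(t) are linearly dependent at that point.
for t in np.linspace(0.0, 1.0, 11):
    assert np.allclose(x1(t), np.exp(t) * x2(t))
```

Note that the scalar relating them, $e^t$, varies with $t$; this is exactly the point the exercise turns on.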
Could you give me some insight and/or guidance?
Best Answer
Bad notation is bad....
In the first part, you're being asked to prove that for each fixed $t\in [0,1]$, the vectors $x^{(1)}(t)$ and $x^{(2)}(t)$ (elements of $\mathbb R^{2\times 1}$) are linearly dependent. There is nothing wrong with this.
In the second part, you're being asked to prove that the vectors $x^{(1)}, x^{(2)}\colon [0,1]\to \mathbb R^{2\times 1}$ are linearly independent. Here "vectors" means elements of a certain vector space (for example, the space of functions from $[0,1]$ to $\mathbb R^{2\times 1}$); these vectors happen to be functions.
That is, you're being asked to prove that $$\forall \alpha, \beta \in \mathbb R\left[\forall t\in [0,1]\left(\alpha x^{(1)}(t)+\beta x^{(2)}(t)=\begin{pmatrix}0\\0\end{pmatrix}\right)\implies \alpha =0=\beta\right].$$
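One way to see why this condition forces $\alpha=\beta=0$: if the combination vanishes for *every* $t\in[0,1]$, it vanishes in particular at $t=0$ and $t=1$, and those equations alone pin down $\alpha$ and $\beta$. A symbolic sketch (using SymPy; the variable names are my own):

```python
import sympy as sp

a, b, t = sp.symbols('alpha beta t', real=True)

# alpha * x1(t) + beta * x2(t) as a symbolic column vector.
comb = a * sp.Matrix([sp.exp(t), t * sp.exp(t)]) + b * sp.Matrix([1, t])

# If comb is the zero vector for all t in [0,1], then in particular
# at t = 0 and t = 1; collect those four scalar equations.
eqs = list(comb.subs(t, 0)) + list(comb.subs(t, 1))

sol = sp.solve(eqs, [a, b], dict=True)
print(sol)  # the only solution is alpha = beta = 0
```

At $t=0$ the equations give $\alpha+\beta=0$, and at $t=1$ they give $\alpha e+\beta=0$; subtracting, $\alpha(e-1)=0$, so $\alpha=\beta=0$. No single pair of scalars works for all $t$ at once, even though a ($t$-dependent) scalar relation holds at each point.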
Here the author writes $x^{(1)}(t)$ and $x^{(2)}(t)$ as if these denoted the functions, which they do not: $x^{(1)}(t)$ is the value of the function $x^{(1)}$ at the point $t$. The notation is wrong. Correct would be: show that $x^{(1)}$ and $x^{(2)}$ are linearly independent.