If you recall from linear algebra, abstract function spaces can be regarded as vector spaces: the zero function serves as the zero vector, and pointwise addition and scalar multiplication are the vector space operations.
For a collection of ordinary vectors, showing linear independence means showing that none of the chosen vectors can be written as a linear combination of the others; each vector describes a 'different' part of the space. For a vector space of functions, e.g. the space of differentiable functions, to show two functions $f$ and $g$ are linearly independent we must show there are no scalars $a,b$, not both zero, such that
\begin{align*}
af(t)+bg(t)=0
\end{align*}
for *all* values of $t$ in the domain. It isn't enough to find one or two values of $t$ where the sum vanishes; the combination must equal zero at every point of the domain.
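To make this point concrete, here is a quick check in Python with SymPy (the pair $f(t)=t$, $g(t)=t^2$ is an illustrative choice, not taken from the problem) that a combination can vanish at a single point without the scalars being zero:

```python
import sympy as sp

t, a, b = sp.symbols('t a b')
f, g = t, t**2  # two linearly independent functions (illustrative choice)

# At the single point t = 1, a*f + b*g vanishes for a = 1, b = -1 ...
expr = a*f + b*g
assert expr.subs({t: 1, a: 1, b: -1}) == 0

# ... but the same combination is not the zero function:
combo = (f - g).expand()          # a = 1, b = -1
assert sp.simplify(combo) != 0    # t - t**2 is not identically zero
```

So vanishing at isolated points says nothing; linear dependence requires the combination to be identically zero.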
Now let us show the two functions in your problem are linearly independent. Suppose we could find numbers $a$ and $b$, not both zero, such that
\begin{align*}
ae^{-t}+be^{-4t}=0
\end{align*}
for all $t$. Differentiating this expression gives
\begin{align*}
-ae^{-t}-4be^{-4t}=0
\end{align*}
Adding the two equations together gives
\begin{align*}
-3be^{-4t}&=0
\end{align*}
The exponential function is strictly positive, so we must have $b=0$. This in turn forces $ae^{-t}=0$, hence $a=0$. So the only way the combination can vanish identically is with $a=b=0$; that is, the functions are linearly independent.
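The same conclusion can be checked via the Wronskian $W(f,g)=fg'-f'g$, whose nonvanishing on an interval implies linear independence (a standard criterion, used here only as a sanity check). A short SymPy sketch:

```python
import sympy as sp

t = sp.symbols('t')
f = sp.exp(-t)
g = sp.exp(-4*t)

# Wronskian W(f, g) = f*g' - f'*g; a nowhere-zero W implies linear independence
W = sp.simplify(f*sp.diff(g, t) - sp.diff(f, t)*g)

# W = -3*exp(-5*t), which is never zero
assert sp.simplify(W + 3*sp.exp(-5*t)) == 0
```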
Now, why must a linear combination of solutions also be a solution? The differential equation described is **linear**, so any linear combination of solutions is again a solution: the differential operator maps them all to zero.
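This superposition property can be verified symbolically. The equation $y''+5y'+4y=0$ below is an assumption on my part, chosen because its characteristic roots are $-1$ and $-4$, matching the two exponentials above:

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')
y = c1*sp.exp(-t) + c2*sp.exp(-4*t)

# Plug an arbitrary linear combination into y'' + 5*y' + 4*y
# (assumed ODE with characteristic roots -1 and -4)
residual = sp.diff(y, t, 2) + 5*sp.diff(y, t) + 4*y
assert sp.simplify(residual) == 0  # every combination is again a solution
```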
Consider a general $n$-th order linear homogeneous ODE:
$$\sum_{i=0}^na_i(t)y^{(i)}(t)=0\tag{1}$$
where $a_n(t)\neq 0$ on a certain interval $J$ around a point $t_0$.
First, we should emphasize that solutions of the ODE $(1)$ are only defined on some small intervals. So what we want to prove is:
If $t_0$ is a point at which all $a_i$ are defined and $C^1$ on some interval around $t_0$, and $a_n(t_0)\neq 0$, then there exists an interval $I$ around $t_0$ on which the solution set of $(1)$ has dimension $n$.
Conversely, if $K$ is any interval on which $a_n\neq 0$ and $y_1,\ldots,y_n$ are $n$ linearly independent solutions of $(1)$ on $K$, then these solutions generate the whole solution set of $(1)$ on $K$.
The easiest way is to use the Picard–Lindelöf theorem, which works in $\mathbb{R}^n$. Here is a restricted version of the general statement:
Theorem [Picard–Lindelöf]: Consider a Cauchy problem
$$y'(t)=F(t,y(t)),\qquad y(t_0)=(f_1,\ldots,f_n)\tag{2}$$
where $F=F(t,x_1,\ldots,x_n)$ is a function, $C^1$ from some neighbourhood of $(t_0,f_1,\ldots,f_n)$ to $\mathbb{R}^n$. Then
- There exist an interval $I$ around $t_0$ and a function $y=y(t)$ which is a solution of $(2)$. (This is the existence of solutions.)
- Any two solutions $y_1,y_2$ of $(2)$, defined on intervals $I$ and $K$, coincide on $I\cap K$. (This is the "uniqueness" of solutions.)
Solving the ODE $(1)$ is equivalent to solving the following ODE in $n$ dimensions. Again, recall that we assume $a_n\neq 0$ on some interval $J$ around $t_0$. Let $z(t)=(z_0(t),\ldots,z_{n-1}(t))$ and consider the problem
$$z'(t)=F(t,z(t))\tag{3}$$
where $F(t,x_0,\ldots,x_{n-1})=\left(x_1,\ldots,x_{n-1},-\sum_{i=0}^{n-1}\frac{a_i(t)}{a_n(t)}x_i\right)$.
The equation $(3)$ means that
$$z_0'=z_1,\qquad z_1'=z_2,\ldots\qquad\text{or more generally }z_i=z_0^{(i)},$$
and $z_{n-1}'=-\sum_{i=0}^{n-1}\frac{a_i(t)}{a_n(t)}z_i$.
Therefore, $z$ is a solution of $(3)$ iff $y:=z_0$ is a solution of $(1)$. Moreover, the map $z\mapsto z_0=y$ is a linear bijection from the solution space of $(3)$ to the solution space of $(1)$, so they have the same dimensions.
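For a concrete instance of this reduction, take $y''+5y'+4y=0$ (an assumed $n=2$ example with $a_2=1$, $a_1=5$, $a_0=4$), so $z_0=y$, $z_1=y'$, and $z_1'=-4z_0-5z_1$. A minimal numerical sketch using SciPy's `solve_ivp`:

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-order system for y'' + 5*y' + 4*y = 0:
# z0 = y, z1 = y', z1' = -(a0/a2)*z0 - (a1/a2)*z1
def F(t, z):
    z0, z1 = z
    return [z1, -4.0*z0 - 5.0*z1]

# Initial data y(0) = 1, y'(0) = -1 picks out the exact solution y(t) = exp(-t)
sol = solve_ivp(F, (0, 2), [1.0, -1.0], rtol=1e-10, atol=1e-12,
                dense_output=True)
ts = np.linspace(0, 2, 50)
assert np.allclose(sol.sol(ts)[0], np.exp(-ts), atol=1e-6)
```

The first component $z_0$ of the system's solution reproduces the solution $y$ of the scalar equation, as the bijection $z\mapsto z_0$ asserts.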
The function $F$ is defined and $C^1$ on $J\times\mathbb{R}^n$. Take a basis $e_1,\ldots,e_n$ of $\mathbb{R}^n$ and solutions $z^1,\ldots,z^n$ of $(3)$, defined on some interval $I$ around $t_0$, and satisfying $z^i(t_0)=e_i$. These $z^i$ are clearly linearly independent.
If $z$ is any other solution of $(3)$ on $I$, write $z(t_0)=\sum\lambda_i e_i$. By the uniqueness part of Picard-Lindelof, $z=\sum\lambda_i z^i$ on $I$.
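This uniqueness argument can be illustrated numerically (again with the assumed example $y''+5y'+4y=0$ and $n=2$): solve with initial data $e_1$, $e_2$, then check that a solution with arbitrary initial data $(\lambda_1,\lambda_2)$ coincides with $\lambda_1 z^1 + \lambda_2 z^2$:

```python
import numpy as np
from scipy.integrate import solve_ivp

def F(t, z):  # companion system for y'' + 5*y' + 4*y = 0 (assumed example)
    return [z[1], -4.0*z[0] - 5.0*z[1]]

tspan, ts = (0, 1), np.linspace(0, 1, 20)
z1 = solve_ivp(F, tspan, [1.0, 0.0], dense_output=True, rtol=1e-10).sol(ts)
z2 = solve_ivp(F, tspan, [0.0, 1.0], dense_output=True, rtol=1e-10).sol(ts)

# A solution with arbitrary data (lam1, lam2) agrees with lam1*z1 + lam2*z2,
# as the uniqueness part of Picard-Lindelof predicts
lam1, lam2 = 2.5, -0.7
z = solve_ivp(F, tspan, [lam1, lam2], dense_output=True, rtol=1e-10).sol(ts)
assert np.allclose(z, lam1*z1 + lam2*z2, atol=1e-6)
```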
Best Answer
Any function of the form $(x,y)\mapsto ax+by+c$ is a solution, so the answer is no.
By the way, you also have $(x,y)\mapsto e^x\sin(y)$, and more generally any harmonic function on the real plane is a solution.
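Assuming the equation in question is Laplace's equation $u_{xx}+u_{yy}=0$ (as the examples suggest), the claim about $e^x\sin(y)$ is easy to verify with SymPy:

```python
import sympy as sp

x, y = sp.symbols('x y')
u = sp.exp(x)*sp.sin(y)

# Laplacian: u_xx + u_yy = exp(x)*sin(y) - exp(x)*sin(y) = 0
assert sp.simplify(sp.diff(u, x, 2) + sp.diff(u, y, 2)) == 0
```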