The answer is $k = 0, \pm 1, \pm \frac{1}{2}$. This is a consequence of the following result.
Claim: The functions $\{ 1, \sin rt, \cos rt \}$ for $r$ a positive real are linearly independent over $\mathbb{R}$.
Proof 1. Suppose that $c_0 + \sum s_r \sin rt + \sum c_r \cos rt = 0$ is a nontrivial linear dependence. Consider the largest positive real $r_0$ such that $c_{r_0} \neq 0$. Take a large even number of derivatives, enough that the coefficient of $\cos r_0 t$ dominates the coefficients of the other cosine terms, and then substitute $t = 0$ (differentiating kills the constant term, and the sine terms vanish at $t = 0$); we obtain a nonzero number equal to zero, which is a contradiction. So no cosines appear.
Similarly, consider the largest positive real $r_1$ such that $s_{r_1} \neq 0$. Take a large odd number of derivatives, enough that the coefficient of $\cos r_1 t$ dominates the coefficients of the other cosine terms (which now come from differentiating the sine terms), and then substitute $t = 0$; again we obtain a nonzero number equal to zero, a contradiction. So no sines appear.
So $1$ is the only function which could appear in a nontrivial linear dependence; but the dependence then reads $c_0 \cdot 1 = 0$, forcing $c_0 = 0$. So there are no nontrivial linear dependences.
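For a concrete instance of the derivative trick (an illustration, not part of the claim): if $c_1 \cos t + c_2 \cos 2t = 0$, then differentiating $2n$ times gives
$$(-1)^n \left( c_1 \cos t + 4^n c_2 \cos 2t \right) = 0,$$
and substituting $t = 0$ yields $c_1 + 4^n c_2 = 0$ for every $n$; comparing $n = 0$ and $n = 1$ forces $c_2 = 0$, and then $c_1 = 0$.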
Proof 2. It suffices to prove that the functions are all linearly independent over $\mathbb{C}$. Using the fact that
$$\cos rt = \frac{e^{irt} + e^{-irt}}{2}, \qquad \sin rt = \frac{e^{irt} - e^{-irt}}{2i},$$
it suffices to prove that the functions $\{ e^{irt} \}$ for $r$ real are linearly independent. This can be done straightforwardly by computing the Wronskian, and the computation in fact shows that the functions $\{ e^{zt} \}$ for $z$ a complex number are linearly independent.
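For example, for two exponentials with $z_1 \neq z_2$ the Wronskian is
$$W(t) = \det \begin{pmatrix} e^{z_1 t} & e^{z_2 t} \\ z_1 e^{z_1 t} & z_2 e^{z_2 t} \end{pmatrix} = (z_2 - z_1) e^{(z_1 + z_2)t} \neq 0,$$
and in general the Wronskian of $e^{z_1 t}, \dots, e^{z_n t}$ equals $e^{(z_1 + \cdots + z_n)t}$ times the Vandermonde determinant $\prod_{i < j} (z_j - z_i)$, which is nonzero whenever the $z_i$ are distinct.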
Proof 3. This begins the same way as Proof 2, but we do not compute the Wronskian. Instead, let $\sum c_z e^{zt} = 0$ be a nontrivial linear dependence with a minimal number of terms and differentiate to obtain
$$\sum z c_z e^{zt} = 0.$$
If $z_0$ is any complex number such that $z_0 \neq 0$ and $c_{z_0} \neq 0$ (such a number must exist, since a nontrivial linear dependence has at least two terms and at most one of them has $z = 0$), then subtracting $z_0$ times the original dependence from the differentiated one gives
$$\sum (z - z_0) c_z e^{zt} = 0,$$
a nontrivial linear dependence with fewer terms (the $z_0$ term drops out while the others survive); contradiction. So there are no nontrivial linear dependences.
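To see the reduction in a two-term case: if $c_1 e^t + c_2 e^{2t} = 0$ with $c_1, c_2 \neq 0$, differentiating gives $c_1 e^t + 2 c_2 e^{2t} = 0$, and subtracting the original dependence (here $z_0 = 1$) leaves
$$c_2 e^{2t} = 0,$$
a one-term dependence forcing $c_2 = 0$, contradicting nontriviality.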
Linear dependence doesn't make sense without specifying what the scalars are. If you're allowed to use coefficients that are, say, continuous functions, then $\{ e^x,e^{2x} \}$ is, in fact, linearly dependent, by your very argument: you have a nonzero linear combination
$$ e^x \cdot e^x - 1 \cdot e^{2x} = 0 $$
giving zero.
However, if scalars are restricted to being just real numbers, then the linear combination above doesn't work to show dependence, because $e^x$ is not a scalar.
There is a simplification and an abuse of notation going on here that may be confusing you. Strictly speaking, $e^x$ is a real number (that varies depending on $x$), but the question intends to ask about a function.
Let me write $f$, $g$, $h$, and $k$ for the four functions defined by
$$ f(x) = e^{2x} \qquad g(x) = e^x \qquad h(x) = 1 \qquad k(x) = 0$$
The question is asking you to show that $\{ f, g \}$ is a linearly independent set. I assume we are in the case where the scalars are real numbers. So the question is whether there exist scalars $a, b$, not both zero, such that
$$af + bg = k $$
(note that $k$ is the zero vector). Now, an equation of functions holds if and only if it holds for all values, so the problem is equivalent to asking whether there are scalars $a$ and $b$, not both zero, such that, for every $x$, we have
$$ a f(x) + b g(x) = k(x) $$
or equivalently,
$$ a e^{2x} + b e^x = 0 $$
Since no fixed pair of scalars $a$ and $b$, other than $a = b = 0$, makes this equation true for all $x$, the functions are independent.
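To see this concretely, evaluate at two distinct points, say $x = 0$ and $x = 1$; this gives the linear system
$$a + b = 0, \qquad a e^2 + b e = 0,$$
whose coefficient determinant is $e - e^2 \neq 0$, so the only solution is $a = b = 0$.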
However, if we take the question literally, without recognizing the intended abuse of notation, it is correct to say that for each fixed value of $x$, the set of real numbers $\{ e^{2x}, e^x \}$ is linearly dependent, by the argument you gave. But the quantification goes the other way around: your argument finds, for each $x$, an $a$ and a $b$ that depend on $x$, whereas linear dependence of the functions requires a single $a$ and $b$ that work for every $x$.
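Written with explicit quantifiers, the two statements are
$$\exists\, (a, b) \neq (0, 0)\ \forall x : \ a e^{2x} + b e^x = 0 \qquad \text{(dependence of the functions)}$$
$$\forall x\ \exists\, (a, b) \neq (0, 0) : \ a e^{2x} + b e^x = 0 \qquad \text{(the pointwise statement)}$$
The second holds, with the $x$-dependent choice $a = 1$, $b = -e^x$, but the first fails, as shown above.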
Best Answer
Setting $x = 0$ in the equation $a_1 + a_2x + a_3x^2 = 0$ gives $a_1 = 0$. Then $a_2x + a_3x^2 = 0$ for all $x \in \Bbb R$. Setting $x = 1$ gives $a_2 + a_3 = 0$, and setting $x = -1$ gives $-a_2 + a_3 = 0$. Adding these two equations gives $2a_3 = 0$, so $a_3 = 0$ and hence $a_2 = 0$.