In a very natural sense, you can! If $\lim_{x \to \infty} f(x) = \lim_{x \to -\infty} f(x) = L$ is some real number, then it makes sense to define $f(\infty) = L$, where we identify $\infty$ and $-\infty$ in something called the one-point compactification of the real numbers (making it look like a circle).
In that case, $f'(\infty)$ can be defined as
$$f'(\infty) = \lim_{x \to \infty} x \big(f(x) - f(\infty)\big).$$
When you learn something about analytic functions and Taylor series, it will be helpful to notice that this is the same as differentiating $f(1/x)$ at zero.
Notice that this is actually not the same as $\lim_{x \to \infty} f'(x)$.
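To make the distinction concrete, here is a quick numerical sketch (the function $f(x) = 1/x$ is just an illustrative choice, not anything canonical): its ordinary derivative tends to $0$ at infinity, while the definition above gives $f'(\infty) = 1$.

```python
# Illustrative example: f(x) = 1/x tends to 0 at both ends of the line,
# so under the one-point compactification f(inf) = 0.
f = lambda x: 1 / x
fprime = lambda x: -1 / x**2  # the ordinary derivative of f

x = 1e8  # a large value standing in for "x near infinity"

# f'(inf) = lim x * (f(x) - f(inf)) = lim x * (1/x - 0) = 1 ...
deriv_at_inf = x * (f(x) - 0)
print(deriv_at_inf)  # ~ 1.0

# ... while the ordinary derivative vanishes in the limit.
print(fprime(x))  # ~ 0.0
```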
These ideas actually show up quite a bit in the study of analytic capacity, so this is a useful notion to have.
I wanted to expand this answer a bit to explain why this is the "correct" generalization of differentiation at infinity, and hopefully to address some points raised in the comments.
Although $\lim_{x \to \infty} f'(x)$ might feel like the natural object to study, it is quite badly behaved. There are functions which decay very quickly to zero and have horizontal asymptotes, but where $f'$ is unbounded as we tend to infinity; consider something like $\sin(x^a) / x^b$ for various $a, b$. Furthermore, $\lim_{x \to \infty} f'(x) = 0$ is not sufficient to guarantee a horizontal asymptote, as $\sqrt{x}$ shows.
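To see this failure numerically (taking $a = 3$, $b = 1$ as one illustrative choice), the following sketch samples $\sin(x^3)/x$ near a point where its derivative is large, and $\sqrt{x}$ where its derivative is tiny:

```python
import math

# f(x) = sin(x^3)/x decays to 0 (horizontal asymptote y = 0),
# yet f'(x) = 3x cos(x^3) - sin(x^3)/x^2 is unbounded as x -> infinity.
f = lambda x: math.sin(x**3) / x
fprime = lambda x: 3 * x * math.cos(x**3) - math.sin(x**3) / x**2

# Sample at a point where x^3 is a multiple of 2*pi, so cos(x^3) ~ 1.
x = (2000 * math.pi) ** (1 / 3)
print(f(x), fprime(x))  # f(x) ~ 0, but f'(x) ~ 3x ~ 55

# Conversely, sqrt(x) has derivative 1/(2 sqrt(x)) -> 0,
# but sqrt(x) itself has no horizontal asymptote.
g = math.sqrt
gprime = lambda x: 0.5 / math.sqrt(x)
print(g(1e8), gprime(1e8))  # g is huge while g' is tiny
```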
So why should we consider the definition I proposed above? Consider the natural change of variables interchanging zero and infinity*, swapping $x$ and $1/x$. Then if $g(x) := f(1/x)$ we have the relationship
$$\lim_{x \to 0} \frac{g(x) - g(0)}{x} = \lim_{x \to \infty} x \big(f(x) - f(\infty)\big).$$
That is to say, $g'(0) = f'(\infty)$. Now via this change of variables, neighborhoods of zero for $g$ correspond to neighborhoods of $\infty$ for $f$. So if we think of the derivative as a measure of local variation, we now have something that actually plays the correct role.
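As a numerical sanity check of $g'(0) = f'(\infty)$, here is a sketch using the illustrative function $f(x) = x/(1+x^2)$, which tends to $0$ at both ends of the line:

```python
# Hypothetical test function: f(x) = x/(1+x^2), with f(inf) = 0.
f = lambda x: x / (1 + x**2)

# Change of variables: g(x) = f(1/x), extended by g(0) = f(inf) = 0.
g = lambda x: f(1 / x) if x != 0 else 0.0

h = 1e-6
g_prime_at_0 = (g(h) - g(0)) / h    # difference quotient for g'(0)
X = 1 / h
f_prime_at_inf = X * (f(X) - 0)     # X * (f(X) - f(inf))

print(g_prime_at_0, f_prime_at_inf)  # both ~ 1
```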
Finally, we can see from this change of variables that this definition of $f'(\infty)$ picks out the coefficient $a_1$ in the Laurent series $\sum_{i \ge 0} a_i x^{-i}$ of $f$ at infinity. Again, this matches our idea of what the derivative really is.
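For instance (a hypothetical example with a known expansion at infinity), $f(x) = 2 + 3/x + 5/x^2$ has $a_0 = 2$ and $a_1 = 3$, and the limit $x\big(f(x) - f(\infty)\big)$ recovers $a_1$:

```python
# f(x) = 2 + 3/x + 5/x^2, so a_0 = 2 and a_1 = 3 in sum a_i x^{-i}.
f = lambda x: 2 + 3 / x + 5 / x**2

a0 = 2  # a_0 = f(inf), the value at infinity
x = 1e7
a1_estimate = x * (f(x) - a0)  # should approach a_1 = 3
print(a1_estimate)
```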
* This is one of the reasons why I used the one-point compactification above. Otherwise, everything that follows must be a one-sided limit or a one-sided derivative.
A function may or may not be continuous.
If it is continuous, it may or may not be differentiable. $f(x) = |x|$ is a standard example of a function which is continuous, but not (everywhere) differentiable. However, any differentiable function is necessarily continuous.
If a function is differentiable, its derivative may or may not be continuous. This is a bit more subtle, and the standard example of a differentiable function with discontinuous derivative is a bit more complicated:
$$
f(x) = \begin{cases} x^2\sin(1/x) & \text{if } x\neq 0\\
0 & \text{if } x = 0\end{cases}
$$
It is differentiable everywhere with $f'(0) = 0$, but $f'(x)$ oscillates wildly between (a little less than) $-1$ and (a little more than) $1$ as $x$ approaches $0$, so $f'$ is not continuous at $0$.
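A short numerical sketch of both claims (the difference quotient at $0$ collapsing to $0$, while $f'$ keeps oscillating near $\pm 1$) might look like this:

```python
import math

# f(x) = x^2 sin(1/x) for x != 0, and f(0) = 0.
def f(x):
    return x**2 * math.sin(1 / x) if x != 0 else 0.0

# The difference quotient at 0 is h * sin(1/h), which -> 0, so f'(0) = 0 ...
h = 1e-8
quotient = (f(h) - f(0)) / h
print(quotient)  # ~ 0

# ... yet f'(x) = 2x sin(1/x) - cos(1/x) keeps taking values near +-1
# arbitrarily close to 0, so f' is not continuous at 0.
fprime = lambda x: 2 * x * math.sin(1 / x) - math.cos(1 / x)
xs = [1 / (k * math.pi) for k in range(1000, 1010)]  # points approaching 0
print([round(fprime(x), 3) for x in xs])  # alternating values near +-1
```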
This "lateral derivative" approach is already taken in many analysis texts. Royden's analysis text does this, for example. (I thought that I had seen them referred to as left and right "derivates", but I can't seem to come up with a source...)
By the way, differentiability of a function on its domain always implies continuity there. The same holds one-sidedly: left differentiability implies left continuity, and likewise on the right. That does not contradict your example, however!
If you "ignore the right half of the plane", you are effectively changing the domain to be only the half you are focused on, and indeed, you defined it to be continuous there.
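As a small illustration of one-sided derivatives (reusing the $|x|$ example from further up), the one-sided difference quotients at $0$ settle at different values, even though each side on its own is perfectly well behaved:

```python
# One-sided derivatives of f(x) = |x| at 0, via one-sided difference quotients.
f = abs

h = 1e-8
right_derivative = (f(0 + h) - f(0)) / h    # -> +1 from the right
left_derivative = (f(0 - h) - f(0)) / (-h)  # -> -1 from the left
print(right_derivative, left_derivative)
```

Since the two one-sided limits disagree, $|x|$ is not differentiable at $0$, yet each one-sided derivative exists, consistent with one-sided continuity on each half-line.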