Yes, this has plenty to do with the derivative. In particular, what you describe is the backwards difference operator, which is just defined as
$$\nabla f(n)=f(n)-f(n-1).$$
This is an operator of interest on its own, but the connection to calculus is that we can consider this as telling us the "average" slope between $n-1$ and $n$.
What you are doing is iterating the operator. In particular, one often writes
$$\nabla^{k+1} f(n)=\nabla^k f(n)-\nabla^k f(n-1)$$
to mean that $\nabla^k f(n)$ is the result of applying this operator $k$ times. For instance, one has that $\nabla^3 n^3 = 6$, as you note. More generally, $\nabla^k n^k = k!$, and this lets us recover a polynomial function from its table, which is what you were up to in sixth grade.
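As a quick sanity check, the iterated operator is easy to play with in a few lines of Python (a minimal sketch of my own, not part of the argument): applying the backward difference three times to a table of $n^3$ leaves the constant $3! = 6$.

```python
def backward_diffs(values):
    """One application of the backward difference: f(n) - f(n-1)."""
    return [b - a for a, b in zip(values, values[1:])]

# Table of f(n) = n^3 for n = 0..6, then apply the operator three times.
table = [n**3 for n in range(7)]
for _ in range(3):
    table = backward_diffs(table)

print(table)  # every third difference of n^3 equals 3! = 6
```

The same loop with `range(k)` and `n**k` illustrates $\nabla^k n^k = k!$ for other exponents.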
However, we can take things further by trying to interpret these numbers - and there is a natural interpretation. For instance, $\nabla^2 f(n)$ represents how quickly $f$ is "accelerating" over the interval $[n-2,n]$, since it tells us how the average slope changes between the interval $[n-2,n-1]$ and the interval $[n-1,n]$. If we keep going, we get that $\nabla^3 f(n)$ tells us how the acceleration changes between the intervals $[n-3,n-1]$ and $[n-2,n]$. We can keep going like this to get further physical interpretations.
However, this operator has a problem: We'd like to interpret the values as accelerations or as slopes, but $\nabla^k f(n)$ depends on the values of $f$ across the interval $[n-k,n]$. That is, it keeps taking up information from further and further away from the point of interest. The way one fixes this is to try to measure the slope over a smaller distance $h$ rather than measure it over a length of $1$:
$$\nabla_h f(n)=\frac{f(n)-f(n-h)}h$$
which is now the average slope of $f$ between $n-h$ and $n$. So, if we make $h$ smaller, we start to need to know $f$ across a smaller range. This gives better meanings to higher order differences like $\nabla_h^k f(n)$, since now they only depend on a small portion of $f$.
The derivative is just what happens to $\nabla_h$ when you send $h$ to $0$. It captures only local information about the function - so, it captures instantaneous slope or instantaneous acceleration and so on. In particular, one can work out that $\nabla f(n)$ is just the average of the derivative over the interval $[n-1,n]$. One can also work out that $\nabla^2 f(n)$ is a weighted average* of the second derivative over the interval $[n-2,n]$ and $\nabla^3 f(n)$ is another weighted average of the third derivative over $[n-3,n]$.
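For readers who like to experiment, here is a small numeric sketch (the function name is my own) showing $\nabla_h f$ approaching the derivative as $h$ shrinks, using $f(x)=x^3$ so that $f'(2)=12$:

```python
def nabla_h(f, x, h):
    """Average slope of f over [x - h, x]: (f(x) - f(x - h)) / h."""
    return (f(x) - f(x - h)) / h

f = lambda x: x**3  # f'(x) = 3x^2, so f'(2) = 12

for h in (1.0, 0.1, 0.01, 0.001):
    print(h, nabla_h(f, 2.0, h))  # the values approach 12 as h shrinks
```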
In particular, if the $k^{th}$ derivative is constant, then it coincides with $\nabla^k f(n)$. One can also show that if the $k^{th}$ derivative is linear, then $\nabla^k f(n)$ differs from it by at worst a constant. In short, $\nabla$ is good at capturing "global" effects (like the highest order term in a polynomial and its coefficient) but bad at capturing "local" effects (like instantaneous changes in the slope). So, in some sense, $\nabla$ is just a rough approximation of the derivative, with similar interpretations; it just doesn't work nearly as cleanly.
(*Unfortunately, "weighted average" here is hard to explain rigorously without calculus. For the benefit of readers with more background, I really mean "convolution" assuming that $f$ is actually differentiable enough times for any of this to make sense)
An asymptote (horizontal or oblique) occurs when a line fits the curve at infinity:
$$\lim_{x\to\infty}(f(x)-(ax+b))=0.$$
The slope $a$ can be found by writing
$$\lim_{x\to\infty}\frac{f(x)-(ax+b)}x=0,$$
or
$$\lim_{x\to\infty}\frac{f(x)}x=a,$$ if that limit exists, and the intercept from
$$\lim_{x\to\infty}(f(x)-ax)=b,$$ if that limit exists.
The first limit can also be evaluated by L'Hôpital's rule (provided its conditions of application are fulfilled):
$$a=\lim_{x\to\infty}\frac{f(x)}x=\lim_{x\to\infty}\frac{f'(x)}1.$$
This is how the first derivative comes into play. For this reason, one also says that an asymptote is a tangent at infinity. In other words, there is an asymptote if the tangent to the curve tends to a particular straight line as $x$ goes to infinity.
$a$ exists if the function doesn't grow faster than $x$, i.e. if the slope of the function is bounded and convergent (to zero in the horizontal case). This is a necessary but not sufficient condition. Then $b$ exists if the difference $f(x)-ax$ between the function and its linear part is itself bounded and convergent.
A few examples:
$f(x)=x^2$ has no asymptote because $\dfrac{x^2}x$ is unbounded (so is $f'(x)=2x$).
$f(x)=x\sin(x)$ has no asymptote because $\dfrac{x\sin x}x$ does not converge (nor does $\sin x+x\cos x$).
$f(x)=\sqrt x$ has no asymptote: $a=0$ holds, but $b$ does not exist. In fact, $\lim_{x\to\infty}(\sqrt x-0\cdot x)=\infty$, and one could say that the asymptote is a line "at infinity".
$f(x)=\sqrt{x^2+1}$ has an asymptote because $a=1$ holds (both from $\dfrac{f(x)}x$ and $f'(x)$), and $\lim_{x\to\infty}\left(\sqrt{x^2+1}-x\right)=0$, so the asymptote is the line $y=x$.
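These limits are easy to check numerically. A short Python sketch (my own illustration) for the last example, $f(x)=\sqrt{x^2+1}$:

```python
import math

f = lambda x: math.sqrt(x**2 + 1)

for x in (10.0, 100.0, 1000.0):
    # f(x)/x estimates the slope a; f(x) - a*x estimates the intercept b
    print(x, f(x) / x, f(x) - x)  # the ratio tends to 1, the difference to 0
```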
Notice that in the above discussion, the higher order derivatives are never used. In particular, their signs play no role. For example, $\dfrac{\sin x}x$ has a horizontal asymptote, while all of its derivatives keep changing sign.
So the answer to your title question is "they don't".
It turns out there are two separate issues to consider.
In functional notation, derivatives are things that are applied to functions, not variables. The derivative of a univariate function (i.e. a function with one argument) is always the derivative of the value of the function with respect to the argument of the function.
i.e. if $f$ is the function defined by $f(x) = x^2$, then $f'$ is the function defined by $f'(z) = 2z$.
In the equations above, $x$ and $z$ are dummy variables; they have no meaning on their own, and their only purpose is to let us write down an equation for the value of $f$ at a point.
In dependent variable notation, the variables you use all have some intrinsic meaning. (e.g. you might use $t$ to refer to "time"). You can't differentiate variables, but you can take their differentials. The differential of $x$ is $dx$. The differential of $x^2$ is $d(x^2) = 2x~dx$.
Sometimes, two differentials can be proportional. For example, if $x$ and $t$ depend on one another via the equation $x = t^2 + 1$, then this equation also holds when we compute the differential on both sides: $dx = 2t~dt$.
In Leibniz notation, when we have such a proportion, we use $dx/dt$ to express the ratio. So if $dx = 2t~dt$, then we say $dx/dt = 2t$. And $dt/dx = 1/(2t)$.
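A quick numeric illustration (a sketch of my own) of this proportion: for $x = t^2 + 1$, the ratio of a small change in $x$ to a small change in $t$ is close to $2t$.

```python
x = lambda t: t**2 + 1  # the relation x = t^2 + 1

t, dt = 3.0, 1e-6
dx = x(t + dt) - x(t)   # the corresponding small change in x
print(dx / dt)          # close to 2*t = 6, matching dx = 2t dt
```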
If the relationship between $x$ and $t$ is $x = f(t)$, then fortunately we have $dx = f'(t) dt$, and so in Leibniz notation, $dx/dt = f'(t)$.
If we have two equations, such as
$$ \frac{8.5}{10-x} = \frac{1.5}{y} $$
and
$$ x = 2.2t $$
then we can get two equations between the differentials. Let me first simplify the first equation to
$$ \frac{10-x}{8.5} = \frac{y}{1.5} $$
Now, when we take the differential, we get two equations
$$ dx = 2.2~dt $$ $$ -\frac{1}{8.5} dx = \frac{1}{1.5} dy $$
and if we want, we can solve the first for $dx$ and plug it into the second:
$$ -\frac{2.2}{8.5} dt = \frac{1}{1.5} dy $$
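To see that this bookkeeping is consistent, one can solve the original pair of equations for $y$ as a function of $t$ and check the slope numerically (a sketch with a made-up helper name):

```python
def y_of_t(t):
    x = 2.2 * t                  # from x = 2.2 t
    return 1.5 * (10 - x) / 8.5  # solving 8.5/(10 - x) = 1.5/y for y

# The differentials predict dy/dt = -(1.5 * 2.2) / 8.5.
t, h = 1.0, 1e-6
slope = (y_of_t(t + h) - y_of_t(t)) / h
predicted = -(1.5 * 2.2) / 8.5
print(slope, predicted)  # both roughly -0.388
```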
We can't always write differentials as proportions. e.g. if $A = xy$, then $dA = x dy + y dx$. If $x$ and $y$ aren't functionally related to each other, then $dA/dx$ and $dA/dy$ simply don't make sense.