Yes, this has plenty to do with the derivative. In particular, what you describe is the backwards difference operator, which is just defined as
$$\nabla f(n)=f(n)-f(n-1).$$
This is an operator of interest on its own, but the connection to calculus is that we can consider this as telling us the "average" slope between $n-1$ and $n$.
What you are doing is iterating the operator. In particular, one often writes
$$\nabla^{k+1} f(n)=\nabla^k f(n)-\nabla^k f(n-1)$$
to mean that $\nabla^k f(n)$ is the result of applying this operator $k$ times. For instance, one has that $\nabla^3 n^3 = 6$, as you note. More generally, $\nabla^k n^k = k!$, and this lets us recover a polynomial function from its table of values, which is what you were up to in sixth grade.
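If you want to experiment with this, here is a small Python sketch (the helper name `backward_diff` is mine, purely for illustration) that iterates the operator and checks $\nabla^3 n^3 = 6$ and $\nabla^k n^k = k!$:

```python
from math import factorial

def backward_diff(f, k, n):
    """Apply the backward difference operator to f, k times, evaluated at n."""
    if k == 0:
        return f(n)
    return backward_diff(f, k - 1, n) - backward_diff(f, k - 1, n - 1)

# The third difference of n^3 is the constant 6 = 3!
print(backward_diff(lambda n: n**3, 3, 10))          # 6

# More generally, the k-th difference of n^k is k!
print(all(backward_diff(lambda n, k=k: n**k, k, 0) == factorial(k)
          for k in range(1, 7)))                     # True
```

Note that the constant $6$ comes out at every $n$, which is exactly why the bottom row of your sixth-grade table was constant.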
However, we can take things further by trying to interpret these numbers, and there is a natural interpretation. For instance, $\nabla^2 f(n)$ represents how quickly $f$ is "accelerating" over the interval $[n-2,n]$, since it tells us how the average slope changes between the interval $[n-2,n-1]$ and the interval $[n-1,n]$. If we keep going, we get that $\nabla^3 f(n)$ tells us how the acceleration changes between the intervals $[n-3,n-1]$ and $[n-2,n]$. We can keep going like this, getting physical interpretations at every order.
However, this operator has a problem: we'd like to interpret the values as slopes or accelerations, but $\nabla^k f(n)$ depends on the values of $f$ across the whole interval $[n-k,n]$. That is, it keeps pulling in information from further and further away from the point of interest. The way one fixes this is to measure the slope over a smaller distance $h$ rather than over a length of $1$:
$$\nabla_h f(n)=\frac{f(n)-f(n-h)}h$$
which is now the average slope of $f$ between $n-h$ and $n$. So, if we make $h$ smaller, we start to need to know $f$ across a smaller range. This gives better meanings to higher order differences like $\nabla_h^k f(n)$, since now they only depend on a small portion of $f$.
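To see $\nabla_h$ closing in on the instantaneous slope as $h$ shrinks, here is a quick numerical sketch (the name `nabla_h` and the choice $f(x)=x^3$ are mine, just for illustration):

```python
def nabla_h(f, h, x):
    """Average slope of f over [x - h, x]."""
    return (f(x) - f(x - h)) / h

f = lambda x: x**3   # f'(x) = 3x^2, so the instantaneous slope at x = 2 is 12

for h in (1.0, 0.1, 0.01, 0.001):
    print(h, nabla_h(f, h, 2.0))   # approaches 12 as h shrinks
```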
The derivative is just what happens to $\nabla_h$ when you send $h$ to $0$. It captures only local information about the function - so, it captures instantaneous slope or instantaneous acceleration and so on. In particular, one can work out that $\nabla f(n)$ is just the average of the derivative over the interval $[n-1,n]$. One can also work out that $\nabla^2 f(n)$ is a weighted average* of the second derivative over the interval $[n-2,n]$ and $\nabla^3 f(n)$ is another weighted average of the third derivative over $[n-3,n]$.
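One can check the first of these claims numerically; the sketch below (with $f=\sin$, an arbitrary smooth choice) compares $\nabla f(n)$ to the average of $f'$ over $[n-1,n]$:

```python
import math

f, fprime = math.sin, math.cos
n = 2.0

nabla = f(n) - f(n - 1)   # the backward difference of f at n

# Average of f' over [n-1, n] via a midpoint Riemann sum
# (the interval has length 1, so the average equals the integral)
N = 100_000
avg = sum(fprime(n - 1 + (i + 0.5) / N) for i in range(N)) / N

print(nabla, avg)   # the two values agree to many decimal places
```

This is really just the fundamental theorem of calculus in disguise: $f(n)-f(n-1)=\int_{n-1}^n f'(x)\,dx$.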
In particular, if the $k^{th}$ derivative is constant, then it coincides with $\nabla^k f(n)$. One can also show that if the $k^{th}$ derivative is linear, then $\nabla^k f(n)$ differs from it by at worst a constant. So $\nabla$ is good at capturing "global" effects (like the highest order term in a polynomial and its coefficient) but bad at capturing "local" effects (like instantaneous changes in the slope). In some sense, $\nabla$ is just a rough approximation of the derivative: it has similar interpretations, it just doesn't work nearly as cleanly.
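For instance, taking $f(n) = n^4$ (my choice, so that the third derivative $f'''(n) = 24n$ is linear), one can check that $\nabla^3 f(n)$ misses $f'''(n)$ by a fixed constant at every $n$:

```python
def nabla(f):
    """Return the function n -> f(n) - f(n-1)."""
    return lambda n: f(n) - f(n - 1)

f = lambda n: n**4            # third derivative f'''(n) = 24n, which is linear
d3 = nabla(nabla(nabla(f)))   # the third backward difference of f

# The gap between the third difference and the third derivative
# is the same constant for every n
print([d3(n) - 24 * n for n in range(5)])   # [-36, -36, -36, -36, -36]
```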
(*Unfortunately, "weighted average" here is hard to explain rigorously without calculus. For the benefit of readers with more background, I really mean "convolution", assuming that $f$ is actually differentiable enough times for any of this to make sense.)
Best Answer
The $x$-intercept occurs when the curve touches the $x$-axis, i.e. when $y=0$. Therefore, finding the $x$-intercept is the same as finding the root of the function $h(x)=3\ln(x)-9$: \begin{align} 3\ln(x) - 9 &= 0 \\[4pt] \ln(x) &= 3 \end{align} Can you do the rest?