To find $u_n$, write the relation $u_{k}=1.5u_{k-1}+1$ for each $k$ from $n$ down to $1$, multiplying the $k$th instance by $1.5^{n-k}$:
$u_n-1.5u_{n-1}=1$
$1.5u_{n-1}-1.5^2u_{n-2}=1.5$
$1.5^2u_{n-2}-1.5^3u_{n-3}=1.5^2$
$\qquad\vdots\qquad$ (the same identity, for $k$ from $n-3$ down to $3$)
$1.5^{n-2}u_2-1.5^{n-1}u_1=1.5^{n-2}$
$1.5^{n-1}u_1-1.5^nu_0=1.5^{n-1}$.
Adding these up (telescoping),
$u_n-1.5^nu_0=1+1.5+1.5^2+\dots+1.5^{n-2}+1.5^{n-1} .$
With $u_0=1$, this gives
$$u_n=1+1.5+1.5^2+\cdots+1.5^{n-2}+1.5^{n-1}+1.5^n=\dfrac{1.5^{n+1}-1}{1.5-1}.$$
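The closed form is easy to check numerically. The sketch below (assuming the initial value $u_0=1$ implied by the derivation) iterates the recurrence in exact rational arithmetic and compares against the formula:

```python
from fractions import Fraction

def u_closed(n):
    # Closed form u_n = (1.5^(n+1) - 1) / (1.5 - 1), assuming u_0 = 1.
    r = Fraction(3, 2)
    return (r**(n + 1) - 1) / (r - 1)

u = Fraction(1)                      # u_0 = 1
for n in range(1, 20):
    u = Fraction(3, 2) * u + 1       # u_n = 1.5 u_{n-1} + 1
    assert u == u_closed(n)          # matches the closed form exactly
```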
What follows uses some special-function theory to prove
$$ u_n:=\sum_{k=0}^n \binom{n}{k}^2\binom{n+k}{k}^2 \sim A\, (n+1/2)^{-3/2}\Big(17+12\sqrt{2}\Big)^n $$
where $A=(2\,\sqrt{2}\,\pi)^{-3/2}(3+2\,\sqrt{2})$ and $17+12\sqrt{2}$ is
indeed the larger root of the polynomial $x^2 - 34\,x + 1=0$, as stated in the problem.
Start with the Legendre polynomial expression
$$ P_n(1+2x) = \sum_{k=0}^n \binom{n}{k} \binom{n+k}{k} x^k $$
Use the Hadamard product (I prove it in MSE 3526636) to get
$$u_n=\frac{1}{2\pi}\int_0^{2\pi} P_n(1+2e^{i\,\theta}) P_n(1+2e^{-i\,\theta})\, d\theta$$
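This Hadamard-product identity can be spot-checked numerically. The sketch below (an illustration, not part of the proof) evaluates $P_n$ at complex arguments via Bonnet's recurrence and approximates the integral by the mean over an equispaced grid, which is exact here because the integrand is a trigonometric polynomial:

```python
import math
import numpy as np

def legendre(n, x):
    # P_n via Bonnet's recurrence; works for complex arguments.
    p0, p1 = np.ones_like(x), x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2*k + 1)*x*p1 - k*p0) / (k + 1)
    return p1

n = 5
u5 = sum(math.comb(n, k)**2 * math.comb(n + k, k)**2 for k in range(n + 1))  # 819005

theta = np.linspace(0, 2*np.pi, 256, endpoint=False)
z = 1 + 2*np.exp(1j*theta)
# Mean over the grid = (1/2 pi) * integral for a trig polynomial of low degree
integral = np.mean(legendre(n, z) * legendre(n, np.conj(z))).real
assert abs(integral - u5) < 1e-6 * u5
```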
Now use the integral relationship for a product of Legendre polynomials
$$P_n(z)\,P_n(w) = \frac{1}{\pi} \int_0^\pi P_n(z\,w + \sqrt{(1-z^2)(1-w^2)}\,\cos(\phi) )\, d\phi $$
Insert this integral into the penultimate equation, do some algebra, and use a symmetry argument for the outer integral to conclude
$$ u_n = \frac{1}{\pi^2} \int_0^\pi d\theta \int_0^\pi d\phi \, P_n(5+4 \, \cos{\theta}+8\cos{(\theta/2)}\,\cos{\phi} ) $$
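The double-integral representation can likewise be verified numerically for small $n$; here is a sketch for $n=3$ (where $u_3=1445$), using a two-dimensional composite trapezoidal rule:

```python
import math
import numpy as np

def legendre(n, x):
    # P_n via Bonnet's recurrence (vectorized over x).
    p0, p1 = np.ones_like(x), x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2*k + 1)*x*p1 - k*p0) / (k + 1)
    return p1

n = 3
un = sum(math.comb(n, k)**2 * math.comb(n + k, k)**2 for k in range(n + 1))  # 1445

m = 1201
grid = np.linspace(0.0, np.pi, m)
h = grid[1] - grid[0]
T, F = np.meshgrid(grid, grid, indexing="ij")
vals = legendre(n, 5 + 4*np.cos(T) + 8*np.cos(T/2)*np.cos(F))

w = np.full(m, 1.0); w[0] = w[-1] = 0.5      # composite trapezoidal weights
integral = (w @ vals @ w) * h * h / np.pi**2
assert abs(integral / un - 1) < 1e-3
```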
This double integral is the one on which the asymptotic analysis will be performed. Note that the argument of $P_n(\cdot)$ is $\ge 1$, so the well-known asymptotic expansion below should be used:
$$ P_n(\cosh{t}) \sim \Big(\frac{t}{\sinh(t)}\Big)^{1/2} I_0\big((n+1/2)t\big)$$
For our problem, $t=\cosh^{-1}(5+4 \, \cos{\theta}+8\cos{(\theta/2)}\,\cos{\phi})$ (the inverse hyperbolic cosine).
$I_0\big(z\big)$ is the modified Bessel function, which for large real positive argument has the asymptotic expansion (first term only)
$$ I_0\big(z\big) \sim \frac{ \exp{(z)} }{\sqrt{2 \pi z}}\,. $$
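Both leading-order formulas are easy to sanity-check numerically; the sketch below does so with generous tolerances (the neglected corrections are of relative size $O(1/n)$ and $O(1/z)$, respectively):

```python
import math
import numpy as np

def legendre(n, x):
    # P_n(x) by Bonnet's recurrence.
    p0, p1 = 1.0, x
    for k in range(1, n):
        p0, p1 = p1, ((2*k + 1)*x*p1 - k*p0) / (k + 1)
    return p1 if n > 0 else p0

# Leading-order check of P_n(cosh t) ~ sqrt(t/sinh t) * I_0((n+1/2) t)
n, t = 100, 0.5
exact = legendre(n, math.cosh(t))
approx = math.sqrt(t / math.sinh(t)) * float(np.i0((n + 0.5) * t))
assert abs(approx / exact - 1) < 0.02

# Leading-order check of I_0(z) ~ e^z / sqrt(2 pi z)
z = 40.0
assert abs(math.exp(z) / math.sqrt(2*math.pi*z) / float(np.i0(z)) - 1) < 0.01
```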
This expansion is very useful. Looking at plots, the integrand (as a function of $\theta$) is peaked at the origin. As with many asymptotic problems, one wants an integrand that looks like a Gaussian: find the Gaussian's amplitude and width, then extend the limits from a finite range ($\pi$ in our case) to $\infty$ so that an analytic result can be obtained. That's exactly the strategy here, but in 2D.
The factor raised to the $1/2$ power in the penultimate equation is slowly varying, so we bring it outside the integrals, evaluating it at $\theta=0$ and $\phi=0$ (the factor in big parentheses below).
Thus,
$$ u_n \sim \frac{1}{\pi^2} \frac{1}{\sqrt{2\pi(n+1/2)}}\big( \frac{2^{-5/4}}{\sqrt{3}}\big) \cdot $$
$$ \cdot \int_0^\pi d\theta \int_0^\pi d\phi \,
\exp{\big((n+1/2)\cosh^{-1}(5+4\,\cos{\theta} + 8\cos{(\theta/2)} \cos{\phi} )\big) } $$
Expand the argument of the exponential to order $\theta^2$ and $\phi^2.$ Extend the limits of the integrals to $\infty.$ Do the integrals explicitly. One finds that
$$u_n \sim \frac{2^{-7/4}}{\sqrt{3(n+1/2)}\pi^{5/2}} \int_0^\infty d\theta \int_0^\infty d\phi \cdot$$
$$\cdot \exp{\big( (n+1/2)(\cosh^{-1}(17) - \frac{ \phi^2}{3\sqrt{2}}
- \frac{ \theta^2}{4\sqrt{2}}-5\frac{\theta^2 \phi^2 }{288 \sqrt{2} } \big)}
$$
$$ = \sqrt{\frac{3}{5}}\frac{\exp{\big( (n+1/2)(\cosh^{-1}(17)
+ 6\sqrt{2}/5 ) \big)}}{2 \pi^2 (n+1/2)} K_0\big( (n+1/2)6\sqrt{2}/5 \big) $$
where $K_0(\cdot)$ is the Macdonald function.
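As a numerical sanity check of the quadratic expansion used above (both the $\theta^2$ and $\phi^2$ coefficients and the $\theta^2\phi^2$ cross term), the sketch below compares the exact $\cosh^{-1}$ argument to its Taylor expansion:

```python
import math

def t_of(theta, phi):
    # Argument of P_n in the double integral above
    return math.acosh(5 + 4*math.cos(theta) + 8*math.cos(theta/2)*math.cos(phi))

s2 = math.sqrt(2)

# Quadratic terms: t ~ acosh(17) - theta^2/(4 sqrt 2) - phi^2/(3 sqrt 2)
a = 1e-3
quad = math.acosh(17) - a**2/(4*s2) - a**2/(3*s2)
assert abs(t_of(a, a) - quad) < 1e-10

# Cross term: coefficient of theta^2 phi^2 should be -5/(288 sqrt 2)
def g(theta, phi):
    return t_of(theta, phi) - (math.acosh(17) - theta**2/(4*s2) - phi**2/(3*s2))

a = 0.05
c3 = (g(a, a) - g(a, 0) - g(0, a)) / a**4   # isolates the mixed quartic term
assert abs(c3 / (-5/(288*s2)) - 1) < 0.05
```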
Use the asymptotic formula
$$ K_0(z) \sim e^{-z}\sqrt{ \frac{\pi}{2z} } .$$
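This asymptotic can be checked without any special-function library by computing $K_0$ from its integral representation $K_0(z)=\int_0^\infty e^{-z\cosh t}\,dt$ (a standard identity, used here only for illustration):

```python
import numpy as np

z = 25.0
# K_0(z) from its integral representation K_0(z) = int_0^inf e^{-z cosh t} dt
t = np.linspace(0.0, 8.0, 20001)
f = np.exp(-z*np.cosh(t))
h = t[1] - t[0]
k0 = h * (f.sum() - 0.5*(f[0] + f[-1]))     # composite trapezoidal rule

approx = np.exp(-z) * np.sqrt(np.pi/(2*z))  # leading asymptotic term
assert abs(approx / k0 - 1) < 0.01          # next correction is ~ 1/(8z)
```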
Algebra (including hyperbolic trig identities) completes the proof.
That the Legendre polynomials are used is actually quite natural. One proof of
the irrationality of $\zeta(3)$ uses them explicitly.
For comparison: with $n=30$, the difference between the sum and its asymptotic approximation is about 1%. For $n=301,$ about 0.1%.
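The comparison above is easy to reproduce. The sketch below computes $u_n$ exactly in integer arithmetic and compares logarithms against the asymptotic formula (logs avoid overflow, since $(17+12\sqrt2)^{301}$ far exceeds float range):

```python
import math

def apery(n):
    # u_n = sum_k binom(n,k)^2 binom(n+k,k)^2, exact integer arithmetic
    return sum(math.comb(n, k)**2 * math.comb(n + k, k)**2 for k in range(n + 1))

A = (2*math.sqrt(2)*math.pi)**(-1.5) * (3 + 2*math.sqrt(2))
logr = math.log(17 + 12*math.sqrt(2))

def log_asymp(n):
    return math.log(A) - 1.5*math.log(n + 0.5) + n*logr

def rel_err(n):
    # math.log handles arbitrarily large Python ints
    return abs(math.exp(log_asymp(n) - math.log(apery(n))) - 1)

assert rel_err(30) < 0.02    # about 1%, as stated
assert rel_err(301) < 0.005  # about 0.1%, as stated
```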
Best Answer
It's not correct. A counterexample: $$x_n = \frac43x_{n-1}-\frac43x_{n-2}\quad\text{or}\quad x_n = -\frac43x_{n-1}-\frac43x_{n-2}$$ $$x_n = 1,2,-4,-8,16,32,-64,\dots$$ (at each step the sequence satisfies one of the two recurrences). The sequence grows as $2^n$, yet the roots of the two characteristic equations are $\dfrac{\pm 2\pm 2\sqrt{2}i}{3}$, each of absolute value $\dfrac{2}{\sqrt{3}} < 2$.
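The counterexample can be verified mechanically; the sketch below generates the sequence (its terms are $(-1)^{\lfloor n/2\rfloor}2^n$) and checks, in exact rational arithmetic, that every term satisfies one of the two recurrences while $|x_n|=2^n$:

```python
from fractions import Fraction

c = Fraction(4, 3)
x = [(-1)**(n // 2) * 2**n for n in range(12)]   # 1, 2, -4, -8, 16, 32, ...
assert x[:7] == [1, 2, -4, -8, 16, 32, -64]

for n in range(2, 12):
    a = c*x[n-1] - c*x[n-2]      # x_n =  (4/3)x_{n-1} - (4/3)x_{n-2}
    b = -c*x[n-1] - c*x[n-2]     # x_n = -(4/3)x_{n-1} - (4/3)x_{n-2}
    assert x[n] in (a, b)        # each term satisfies one of the two recurrences
    assert abs(x[n]) == 2**n     # growth 2^n beats the root modulus 2/sqrt(3)
```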
Now, your application to complexity problems should come with some positivity requirements, which would rule out this example. It might be enough to salvage things, but I can't say without further study.
OK, I've come back to it. Assume the sequence $x_n$ is positive (for $n\ge 0$) and the recurrences $x_n^{(j)}=c_1^{(j)}x_{n-1}+c_2^{(j)}x_{n-2}+\cdots+c_{k}^{(j)}x_{n-k}$ have nonnegative coefficients $c_i^{(j)}$.
In this form, each of the characteristic equations $t^k-c_1^{(j)}t^{k-1}-c_2^{(j)}t^{k-2}-\cdots - c_k^{(j)}$ has exactly one positive real root $b_j$, and this $b_j$ has the largest absolute value of any root. Let $B$ be the maximum of the $b_j$.

Now, normalize: let $y_n = B^{-n}x_n$, and transform the recurrences to fit the new sequences. The new recurrences are \begin{align*}y_n^{(j)} &= B^{-1}c_1^{(j)}y_{n-1}+B^{-2}c_2^{(j)}y_{n-2}+\cdots+B^{-k}c_k^{(j)}y_{n-k}\\ y_n^{(j)} &= d_1^{(j)}y_{n-1}+d_2^{(j)}y_{n-2}+\cdots+d_k^{(j)}y_{n-k}\end{align*} with $y_n\in \{y_n^{(j)}\}$, naming the new coefficients $d_i^{(j)}$.

What's the point? Since $t^k-c_1^{(j)}t^{k-1}-c_2^{(j)}t^{k-2}-\cdots-c_k^{(j)} \ge 0$ for $t\ge b_j$, it's nonnegative for $t = B$ and all $j$. After scaling by those powers of $B$, $$1-d_1^{(j)}-d_2^{(j)}-\cdots-d_k^{(j)}\ge 0$$ for all $j$.

Consequently, if each of $y_{n-1},y_{n-2},\dots,y_{n-k}$ is $\le P$ for some constant $P$, so is $y_n^{(j)}$. If we take enough initial conditions to cover all of the recurrences and let $P$ be the maximum of those initial conditions, then, by induction, $y_n\le P$ for all $n$.
Multiply by $B^n$ to go back to the $x_n$, and that's it: $x_n\le PB^n$, where $P$ is some constant depending on initial conditions and $B$ is the largest root of the characteristic equations for the recurrences.
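As a concrete illustration (a made-up example, not from the original question): take the two nonnegative-coefficient recurrences $x_n=x_{n-1}+x_{n-2}$ (positive root the golden ratio) and $x_n=2x_{n-1}$ (root $2$), so $B=2$. Even if the sequence greedily picks whichever recurrence gives the larger value at each step, it stays below $PB^n$:

```python
B = 2   # max of the positive characteristic roots: 2 and the golden ratio
x = [1, 1]
for n in range(2, 40):
    x.append(max(x[-1] + x[-2], 2*x[-1]))   # adversarial choice at each step
P = max(x[0], x[1] / B)                     # P = max initial y_n = x_n / B^n
assert all(x[n] <= P * B**n for n in range(40))
```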