[Math] Show that 2n “1” digits subtract n “2” digits is a perfect square.

elementary-number-theory, sequences-and-series

While revising for my Sequences and Series test, I came across a difficult question in my IB HL Math book that I wanted to share. I can't seem to figure it out.

The Question

Show that, for every positive integer $n$,
$$
\underbrace{11\ldots1}_{2n \text{ digits}} - \underbrace{22\ldots2}_{n \text{ digits}}
$$
is a perfect square.

Here's how I approached it so far:

  1. I found a series to generate 2n "1" digits and n "2" digits respectively (see the numeric check after this list):
    $$
    \sum_{k=1}^{2n}10^{k-1}
    $$
    $$
    \sum_{k=1}^{n}2\cdot10^{k-1}
    $$
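To see that these series produce the intended numbers, here is a minimal numeric check (a Python sketch for illustration only; the names `ones` and `twos` are mine, not from the book):

```python
# Evaluate both series term by term for a small n and compare them
# against the literal repdigit numbers they should produce.
n = 3
ones = sum(10**(k - 1) for k in range(1, 2 * n + 1))  # 2n "1" digits
twos = sum(2 * 10**(k - 1) for k in range(1, n + 1))  # n "2" digits
assert ones == int("1" * (2 * n))  # 111111
assert twos == int("2" * n)        # 222
```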

Then, I used the geometric series sum formula:

$$
S_{n} = \frac{U_{1}(1-r^{n})}{1 - r}
$$
For generating 2n digits of 1:
$$
S_{2n} = \frac{1(1-10^{2n})}{-9}
$$
For generating n digits of 2:
$$
S_{n} = \frac{2(1-10^{n})}{-9}
$$
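These closed forms can be sanity-checked against the term-by-term sums (again a quick Python sketch, not part of the argument):

```python
# Confirm the geometric sum formula was applied correctly for a few n.
for n in range(1, 6):
    s_2n = (1 - 10**(2 * n)) // -9  # closed form for the "1" digits
    s_n = 2 * (1 - 10**n) // -9     # closed form for the "2" digits
    assert s_2n == sum(10**(k - 1) for k in range(1, 2 * n + 1))
    assert s_n == sum(2 * 10**(k - 1) for k in range(1, n + 1))
```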

At this point, the question asks you to show that subtracting the second sum from the first leaves a perfect square.

If you do subtract these algebraically, you are left with:
$$
\frac{1+10^{2n} - 2\cdot10^{n}}{9}
$$
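Before attacking it algebraically, one can at least confirm numerically that this expression is a perfect square for small $n$ (a minimal sketch using only Python's standard library):

```python
from math import isqrt

# Check that (1 + 10**(2n) - 2*10**n) / 9 is a perfect square for small n.
for n in range(1, 8):
    value = (1 + 10**(2 * n) - 2 * 10**n) // 9
    root = isqrt(value)
    assert root * root == value
    print(n, root)  # the roots come out as 3, 33, 333, ... (n threes)
```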

What I thought of doing was taking the square root of this expression. The denominator gives $\sqrt{9} = 3$, but there doesn't seem to be a way to simplify the square root of the numerator and show that the whole thing is an integer.

Can somebody help me on a method to approach this?

Thanks!

Best Answer

Hint: You are almost done. Observe that $$1 + 10^{2n} - 2\cdot10^n = (10^n - 1)^2,$$ so your expression equals $\left(\frac{10^n - 1}{3}\right)^2$. Since $10 \equiv 1 \pmod{3}$, $3$ divides $10^n - 1$, so the base is an integer; in fact $\frac{10^n - 1}{3} = \underbrace{33\ldots3}_{n}$.
