How can $x=1$ be a solution to this equation?

logarithms, quadratics

This is the equation I am supposed to solve:
$$\log_{3}(2x^{2}-x-1) - \log_{3}(x-1) = 2$$
The textbook gives the solutions, $x=1, x=4$, with the working out as shown below:

Working out of equation

While I understand what they are doing, I don't completely understand why they work it out as they do. Specifically, why isn't the term on the left side in the second row simplified? In my working out, $$\log_{3}\left(\frac{2x^{2}-x-1}{x-1}\right) = \log_{3}\left(\frac{(2x+1)(x-1)}{x-1}\right) = \log_{3}(2x+1)$$
This equals $2$, allowing it to be simplified further,
\begin{align*}
\: & \log_{3}(2x+1) = 2 \\
\implies \: & 2x+1 = 3^2 = 9 \\
\implies \: & 2x = 8 \\
\implies \: & x = 4 \\
\end{align*}

What's more, $x$ can't even equal $1$, because if it does, both logarithms become $\log_{3}(0)$, which is undefined. I am wondering if there is some other concept I am not aware of, or is the book making a mistake?
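To back up the domain argument numerically, here is a minimal sketch (names like `lhs` are my own, not from the textbook) that evaluates the left side of the equation and shows $x=4$ works while $x=1$ falls outside the domain:

```python
import math

def lhs(x):
    # Left side of log_3(2x^2 - x - 1) - log_3(x - 1).
    # math.log raises ValueError when its argument is <= 0,
    # i.e. when x is outside the equation's domain.
    return math.log(2 * x**2 - x - 1, 3) - math.log(x - 1, 3)

print(lhs(4))  # approximately 2: x = 4 satisfies the equation

try:
    lhs(1)
except ValueError:
    print("x = 1 is outside the domain: log_3(0) is undefined")
```

Up to floating-point rounding, `lhs(4)` equals $2$, and evaluating at $x=1$ raises a domain error, matching the reasoning above.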

Best Answer

$x=1$ cannot be a solution because the equation is undefined for $x=1$. Over the reals, the equation is only meaningful if $x-1 > 0$ and $2x^2-x-1>0$, which means $x>1$. In that case one may rewrite the equation as follows:

$$\begin{align} \log_3(2x^2-x-1) - \log_3(x-1) &= \log_3\big((2x+1)(x-1)\big) - \log_3(x-1) \\ &= \log_3(2x+1) + \log_3(x-1) - \log_3(x-1) \\ &= \log_3(2x+1) \\ &\stackrel{!}{=} 2 \end{align}$$ and thus $2x+1=9$, which has only one solution, $x=4$.

As you can see, the culprit is not a division by $x-1$, as claimed in the comments: $x=1$ remains invalid even when, as above, no division is performed at all.
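As for where the spurious $x=1$ plausibly comes from (this is my guess at the textbook's route, not a quote of it): combining the logarithms and then clearing them without tracking the domain gives

$$\begin{align} \log_3\left(\frac{2x^2-x-1}{x-1}\right) = 2 \;&\Longrightarrow\; 2x^2-x-1 = 9(x-1) \\ &\Longrightarrow\; 2x^2-10x+8 = 0 \\ &\Longrightarrow\; (x-1)(x-4) = 0. \end{align}$$

The polynomial step is not reversible at $x=1$, where the original logarithms are undefined, so $x=1$ is an extraneous root that must be discarded against the domain condition $x>1$.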