It is possible that this question is homework, but I felt this classical elementary probability question was still lacking a complete answer after several months, so I'll give one here.
From the problem statement, we want the distribution of
$$Y = \max \{ X_1, ..., X_n \}$$
where $X_1, ..., X_n$ are iid ${\rm Uniform}(a,b)$. Note that $Y \leq x$ if and only if every element of the sample is at most $x$. As indicated in @varty's hint, this fact, combined with the independence of the $X_i$'s, allows us to deduce
$$ P(Y \leq x) = P(X_1 \leq x, ..., X_n \leq x) = \prod_{i=1}^{n} P(X_i \leq x) = F_{X}(x)^n$$
where $F_{X}$ is the CDF of the ${\rm Uniform}(a,b)$ distribution, i.e. $F_{X}(x) = \frac{x-a}{b-a}$ for $x \in (a,b)$. Therefore the CDF of $Y$ is
$$F_{Y}(y) = P(Y \leq y) = \begin{cases}
0 & y \leq a \\
\left( \frac{y-a}{b-a} \right)^n & y\in(a,b) \\
1 & y \geq b \\
\end{cases}$$
Since $Y$ has an absolutely continuous distribution, we can derive its density by differentiating the CDF. Therefore the density of $Y$ is
$$ p_{Y}(y) = \frac{n(y-a)^{n-1}}{(b-a)^{n}}, \qquad y \in (a,b), $$
and $p_{Y}(y) = 0$ otherwise.
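As a quick sanity check, here is a short simulation sketch in Python (standard library only; the choices $a=2$, $b=5$, $n=4$, the trial count, and the seed are arbitrary illustrative values) comparing the closed-form CDF of the maximum with its empirical counterpart:

```python
import random

def max_uniform_cdf(x, a, b, n):
    """Closed-form CDF of Y = max of n iid Uniform(a, b) draws: ((x-a)/(b-a))^n."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return ((x - a) / (b - a)) ** n

# Monte Carlo comparison at a single point x = 4.
random.seed(0)
a, b, n, trials = 2.0, 5.0, 4, 200_000
samples = [max(random.uniform(a, b) for _ in range(n)) for _ in range(trials)]

x = 4.0
empirical = sum(s <= x for s in samples) / trials
theoretical = max_uniform_cdf(x, a, b, n)
print(abs(empirical - theoretical))  # small Monte Carlo error
```

The empirical proportion of maxima at or below $x$ should agree with $\left(\frac{x-a}{b-a}\right)^n$ up to sampling noise.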
In the special case where $a=0, b=1$, we have $p_{Y}(y)=ny^{n-1}$, which is the density of a Beta distribution with $\alpha=n$ and $\beta=1$: the ${\rm Beta}(n,1)$ normalizing constant is $\frac{\Gamma(n+1)}{\Gamma(n)\Gamma(1)}=\frac{n!}{(n-1)!} = n$, so its density is $n y^{n-1}$.
As a note, the sequence you get if you sort your sample in increasing order, $X_{(1)}, ..., X_{(n)}$, is called the order statistics. A generalization of this answer is that every order statistic of a ${\rm Uniform}(0,1)$ distributed sample has a Beta distribution, namely $X_{(k)} \sim {\rm Beta}(k, n-k+1)$, as noted in @bnaul's answer.
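To illustrate the order-statistics fact, here is a small simulation sketch (standard library only; $n=5$, the trial count, and the seed are arbitrary) comparing the empirical mean of each order statistic of a ${\rm Uniform}(0,1)$ sample with the ${\rm Beta}(k, n-k+1)$ mean $k/(n+1)$:

```python
import random

# The k-th order statistic of n iid Uniform(0,1) draws is Beta(k, n-k+1),
# whose mean is k / (n + 1).  We check the means by simulation.
random.seed(1)
n, trials = 5, 100_000
sums = [0.0] * n
for _ in range(trials):
    ordered = sorted(random.random() for _ in range(n))
    for k in range(n):
        sums[k] += ordered[k]

for k in range(n):
    empirical_mean = sums[k] / trials
    beta_mean = (k + 1) / (n + 1)
    print(k + 1, round(empirical_mean, 3), round(beta_mean, 3))
```

The printed pairs should agree to roughly two or three decimal places, consistent with Monte Carlo error.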
What your textbook says is actually not entirely true. If two random variables have the same cumulative distribution function, then their density functions are equal almost everywhere. For instance, a perfectly valid density function for a random variable uniformly distributed over the open interval $(0, 1)$ is a function which equals one over $(0, 1)$ and zero otherwise. Or we could just as easily define the density to equal one at the endpoints if we wanted. Being even more extreme, we could have it take on completely arbitrary values on any countable subset of $\mathbb{R}$; the only requirement is that the density functions integrate to the same value over every set of the form $(-\infty, a]$, in which case we clearly arrive at the same distribution function.
In general though we take the density to be the derivative of the cumulative distribution function wherever the latter is differentiable and just set it to zero everywhere else.
Best Answer
Problems like this, where you want to differentiate the product of a bunch of functions that depend on your variable of interest, can be dealt with by logarithmic differentiation. Let $\Phi$ and $\phi$ denote the CDF and PDF of the standard normal distribution (respectively). Since the normal random variables in your question have the same variance, you get:
$$\prod_{i=1}^n F_i(y) = \prod_{i=1}^n \Phi \Big( \frac{y-\mu_i}{\sigma} \Big) = \exp \Bigg( \sum_{i=1}^n \ln \Phi \Big( \frac{y-\mu_i}{\sigma} \Big) \Bigg).$$
Differentiating with respect to $y$ and applying the chain rule gives:
$$\begin{equation} \begin{aligned} f_Y(y) = \frac{d F_Y}{dy}(y) &= \Bigg( \frac{1}{\sigma} \sum_{i=1}^n \frac{\phi ( (y-\mu_i)/\sigma ) }{\Phi ( (y-\mu_i)/\sigma )} \Bigg) \exp \Bigg( \sum_{i=1}^n \ln \Phi \Big( \frac{y-\mu_i}{\sigma} \Big) \Bigg) \\[6pt] &= \Bigg( \frac{1}{\sigma} \sum_{i=1}^n \frac{\phi ( (y-\mu_i)/\sigma ) }{\Phi ( (y-\mu_i)/\sigma )} \Bigg) \Bigg( \prod_{i=1}^n \Phi \Big( \frac{y-\mu_i}{\sigma} \Big) \Bigg). \\[6pt] \end{aligned} \end{equation}$$
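As a sketch of how this formula can be sanity-checked numerically, the following Python snippet (standard library only; the means, $\sigma$, and the integration grid are arbitrary illustrative choices) implements the density above and verifies that it integrates to approximately one:

```python
import math

def std_normal_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def std_normal_cdf(z):
    """Standard normal CDF Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def max_normal_pdf(y, mus, sigma):
    """Density of Y = max of independent N(mu_i, sigma^2), per the formula above."""
    ratio_sum = sum(
        std_normal_pdf((y - mu) / sigma) / std_normal_cdf((y - mu) / sigma)
        for mu in mus
    ) / sigma
    prod = 1.0
    for mu in mus:
        prod *= std_normal_cdf((y - mu) / sigma)
    return ratio_sum * prod

# Midpoint-rule integration over a range wide enough to capture essentially
# all of the probability mass for these parameters.
mus, sigma = [0.0, 1.0, 2.5], 1.5
lo, hi, steps = -5.0, 12.0, 5000
h = (hi - lo) / steps
total = sum(max_normal_pdf(lo + (i + 0.5) * h, mus, sigma) * h for i in range(steps))
print(round(total, 4))  # close to 1
```

The integration range is chosen so that $\Phi((y-\mu_i)/\sigma)$ stays safely away from zero; far in the left tail the ratio $\phi/\Phi$ would otherwise hit floating-point underflow.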
In the special case where $\mu = \mu_1 = \cdots = \mu_n$ this reduces to the well-known formula:
$$f_Y(y) = \frac{n}{\sigma} \cdot \phi \Big( \frac{y-\mu}{\sigma} \Big) \cdot \Phi \Big( \frac{y-\mu}{\sigma} \Big)^{n-1}.$$
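As a hedged check of this special case (standard library only; $\mu$, $\sigma$, $n$, and the evaluation point are arbitrary), the snippet below compares the formula against a central-difference derivative of the CDF $\Phi\big((y-\mu)/\sigma\big)^n$:

```python
import math

def std_normal_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def std_normal_cdf(z):
    """Standard normal CDF Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def max_cdf(y, mu, sigma, n):
    """CDF of the max of n iid N(mu, sigma^2): Phi((y - mu)/sigma)^n."""
    return std_normal_cdf((y - mu) / sigma) ** n

def max_pdf(y, mu, sigma, n):
    """The special-case density n/sigma * phi(z) * Phi(z)^(n-1)."""
    z = (y - mu) / sigma
    return (n / sigma) * std_normal_pdf(z) * std_normal_cdf(z) ** (n - 1)

# Central difference approximates dF/dy with O(h^2) truncation error,
# so the two values should agree very closely.
mu, sigma, n, y, h = 1.0, 2.0, 6, 2.3, 1e-5
numeric = (max_cdf(y + h, mu, sigma, n) - max_cdf(y - h, mu, sigma, n)) / (2 * h)
print(abs(numeric - max_pdf(y, mu, sigma, n)))  # tiny
```

Agreement here confirms that the differentiation step above was carried out correctly for the equal-means case.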