Of course $\hat\theta=\max\{x_i\}$ is biased, simulations or not: since $\hat\theta<\theta$ with probability one, one can be sure that $E_\theta(\hat\theta)<\theta$ for every $\theta$.
One usually considers instead $\hat\theta_N=\frac{N+1}N\cdot\max\{x_i\}$; then $E_\theta(\hat\theta_N)=\theta$ for every $\theta$.
To show that $\hat\theta_N$ is unbiased, one must compute the expectation of $M_N=\max\{x_i\}$, and the simplest way to do that might be to note that for every $x$ in $(0,\theta)$, $P_\theta(M_N\le x)=(x/\theta)^N$, hence
$$
E_\theta(M_N)=\int\limits_0^{\theta}P_\theta(M_N\ge x)\text{d}x=\int\limits_0^{\theta}(1-(x/\theta)^N)\text{d}x=\theta-\theta/(N+1)=\theta N/(N+1).
$$
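The unbiasedness of $\hat\theta_N$ is easy to check by Monte Carlo. Below is a minimal simulation sketch; the values of $\theta$, $N$, and the replication count are arbitrary choices for illustration:

```python
import numpy as np

# Arbitrary illustrative parameters: theta = 2.0, N = 5, 200_000 replications
rng = np.random.default_rng(0)
theta, N, reps = 2.0, 5, 200_000

samples = rng.uniform(0.0, theta, size=(reps, N))
M = samples.max(axis=1)                  # M_N = max of each sample

print(M.mean())                          # near theta*N/(N+1), i.e. below theta
print(((N + 1) / N * M).mean())          # near theta: the corrected estimator
```

The first average falls short of $\theta$ by roughly the factor $N/(N+1)$ derived above, while the rescaled estimator is centered at $\theta$.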
Edit: below is a remark due to @cardinal, which nicely completes this post.
It may be worth noting that the maximum $M_N=\max\limits_{1\le k\le N}X_k$ of an i.i.d. Uniform$(0,\theta)$ random sample is a sufficient statistic for $\theta$, and that it is one of the few complete sufficient statistics arising for distributions outside the exponential family.
An immediate consequence is that $\hat\theta_N=(N+1)M_N/N$ is the uniformly minimum variance unbiased estimator (UMVUE) of $\theta$; that is, any other unbiased estimator of $\theta$ is a worse estimator in the $L^2$ sense.
I propose $\hat{\theta}_N=\frac{N+1}{N}\max(X_1,\ldots,X_N)$.
Indeed, let $Y=\max(X_1,\ldots,X_N)$; then, for $0\le t\le\theta$,
$$\Bbb{P}(Y\leq t)=\left(\frac{t}{\theta}\right)^N.$$
Hence the density of $Y$ is given by $f_Y(t)=\frac{N}{\theta^N}t^{N-1}{\bf 1}_{[0,\theta]}(t)$,
and
$$\Bbb{E}(\hat\theta_N)=\frac{N+1}{N}\Bbb{E}(Y)=\frac{N+1}{N}\cdot\frac{N}{\theta^N}\int_0^\theta t^N\,dt=\frac{N+1}{N}\cdot\frac{N\theta}{N+1}=\theta,$$
and we are done.
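As a quick sanity check on the density computation, one can integrate $f_Y$ numerically and recover both the total mass and the expectation. A sketch with arbitrary illustrative values of $\theta$ and $N$, using SciPy's `quad`:

```python
from scipy.integrate import quad

# Arbitrary illustrative values: theta = 2.0, N = 5
theta, N = 2.0, 5

# Density of Y = max(X_1, ..., X_N) on [0, theta]
f_Y = lambda t: N / theta**N * t**(N - 1)

mass, _ = quad(f_Y, 0.0, theta)                  # should be 1: f_Y is a density
E_Y, _ = quad(lambda t: t * f_Y(t), 0.0, theta)  # E(Y) = theta*N/(N+1)

print(mass)
print((N + 1) / N * E_Y)                         # near theta
```

The rescaled expectation reproduces $\theta$, matching the closed-form computation above.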
Best Answer
What you did is wrong. You found that if $c=\sqrt{\dfrac n {n-1}}$ then $c^2 S^2$ is an unbiased estimator of $\sigma^2.$ It does not follow that $cS$ is an unbiased estimator of $\sigma.$ The expected value of a square root of a random variable is not the same as the square root of the expected value of the random variable.
Here I will use the fact that $$ \frac 1 {\sigma^2} \sum_{i=1}^n (X_i - \overline X)^2 \sim \chi^2_{n-1}. $$ This has been proved a number of times in answers posted here.
This probability distribution is
$$ \frac 1 {\Gamma(\frac{n-1}2)} \left( \frac x 2 \right)^{(n-1)/2-1} e^{-x/2} \, \frac{dx} 2 \quad \text{for } x>0. $$
So you need the expected value of the square root of a random variable with this distribution. That expected value is
\begin{align} & \int_0^\infty \sqrt x \frac 1 {\Gamma(\frac{n-1}2)} \left( \frac x 2 \right)^{(n-1)/2-1} e^{-x/2} \, \frac{dx} 2 \\[10pt] = {} & \sqrt 2 \int_0^\infty \sqrt{\frac x 2} \cdot \frac 1 {\Gamma(\frac{n-1}2)} \left( \frac x 2 \right)^{(n-1)/2-1} e^{-x/2} \, \frac{dx} 2 \\[10pt] = {} & \sqrt 2 \cdot \frac 1 {\Gamma\left( \frac{n-1} 2 \right)} \int_0^\infty \sqrt u \cdot u^{(n-1)/2-1} e^{-u} \, du \\[8pt] = {} & \sqrt 2 \cdot \frac 1 {\Gamma\left( \frac{n-1} 2 \right)} \int_0^\infty u^{(n/2) - 1} e^{-u} \, du = \sqrt 2 \cdot \frac {\Gamma\left( \frac n 2 \right)} {\Gamma\left( \frac{n-1} 2 \right)}. \end{align}
Now apply some identities involving the gamma function, including the fact that $\Gamma\left( \frac 1 2 \right) = \sqrt\pi.$
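The derivation above implies that, for the usual $(n-1)$-denominator sample standard deviation $S$, one has $E(S)=\sigma\,\sqrt{2/(n-1)}\,\Gamma(\frac n2)/\Gamma(\frac{n-1}2)$, so dividing $S$ by that constant yields an unbiased estimator of $\sigma$. A simulation sketch, where the values of $\sigma$, $n$, and the replication count are arbitrary illustrative choices:

```python
import numpy as np
from scipy.special import gamma as Gamma

# Arbitrary illustrative parameters: sigma = 3.0, n = 10, 200_000 replications.
# Since (n-1)*S^2/sigma^2 ~ chi^2_{n-1}, we get
#   E(S) = sigma * sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2),
# where S^2 is the (n-1)-denominator sample variance.
rng = np.random.default_rng(1)
sigma, n, reps = 3.0, 10, 200_000

c4 = np.sqrt(2.0 / (n - 1)) * Gamma(n / 2) / Gamma((n - 1) / 2)

x = rng.normal(0.0, sigma, size=(reps, n))
S = x.std(axis=1, ddof=1)               # sample std; biased as an estimator of sigma

print(S.mean())                         # near sigma * c4, i.e. below sigma
print((S / c4).mean())                  # near sigma: bias removed
```

The constant `c4` here is the textbook bias-correction factor for the normal sample standard deviation; the simulation confirms that $S$ underestimates $\sigma$ and that $S/c_4$ does not.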