Yes. The mean value property (a consequence of Cauchy's integral formula) states that $$f(z) = \frac{1}{2\pi} \int_0^{2\pi} f(z + re^{i \theta})\, d\theta,$$
which gives an expression for $f(z)$ in terms of the average around a circle.
Integrating over the radius, it follows that
$$ f(z) = \frac{1}{\pi r_0^2} \int_0^{2\pi}\!\!\int_0^{r_0} f(z+re^{i \theta})\, r\, dr\, d\theta,$$
which implies that if the square integral of $f$ is bounded, then (by Cauchy–Schwarz) $f$ is locally bounded, with a bound depending only on the square integral of $f$ and on the disk in question. From this your claim follows.
Basically, the point is that $f$ is the average of its values in a neighborhood. Note that this implies that the space of holomorphic functions in an open set $U$ such that $\int_{U} |f|^2$ is finite is actually a Hilbert space (i.e., complete) under the usual inner product.
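As a quick numerical sanity check (not part of the original answer, and using $f(z)=e^z$ purely as an illustrative choice), the mean value property above can be verified directly: the average of a holomorphic $f$ over a circle around $z_0$ matches $f(z_0)$ to machine precision.

```python
import numpy as np

# Check the mean value property for the holomorphic function f(z) = exp(z):
# the average of f over a circle of radius r around z0 should equal f(z0).
z0 = 0.3 + 0.7j
r = 1.5
theta = np.linspace(0.0, 2.0 * np.pi, 20000, endpoint=False)

# Average of f(z0 + r e^{i theta}) over theta in [0, 2*pi).
average = np.mean(np.exp(z0 + r * np.exp(1j * theta)))

print(abs(average - np.exp(z0)))  # tiny: agreement up to roundoff
```

The equispaced average is the trapezoidal rule on a periodic integrand, which converges extremely fast here, so the discrepancy is essentially roundoff.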
If $w = f(z) = \sigma +i\gamma$ is holomorphic where $\sigma$ and $\gamma$ are (real-valued) harmonic functions on a region $D$ in $\mathbb{C}$, then $w^2$ will also be holomorphic on $D$. But $w^2 = \sigma^2 - \gamma^2 +2i\sigma\gamma$, and the real and imaginary parts of any holomorphic function are certainly harmonic. Thus we can conclude in this case that $\sigma\gamma$ is a (real) harmonic function in $D$.
Additionally, (changing your notation slightly), to show that $\sigma\gamma$ being harmonic does not necessarily imply that $\sigma + i\gamma$ is holomorphic in $D$, consider $\sigma(x,y) = x$ and $\gamma(x,y)=-y$. Both of these are harmonic, and $\sigma\gamma = -xy$ is also harmonic, as can be seen by finding $\Delta \sigma, \Delta\gamma, \Delta(\sigma\gamma)$. However $\sigma+i\gamma = x-iy$ which is a standard example of a non-analytic function.
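The counterexample above can be checked mechanically (this is an added verification, not part of the original answer): both Laplacians and the product's Laplacian vanish, yet the Cauchy–Riemann equations fail for $\sigma + i\gamma = x - iy$.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

# sigma = x and gamma = -y are each harmonic, and so is sigma*gamma = -x*y,
# yet sigma + i*gamma = x - i*y (the conjugate) is not holomorphic.
sigma = x
gamma = -y

def laplacian(u):
    return sp.diff(u, x, 2) + sp.diff(u, y, 2)

print(laplacian(sigma), laplacian(gamma), laplacian(sigma * gamma))  # 0 0 0

# Cauchy-Riemann: u_x = v_y and u_y = -v_x must both hold for holomorphy.
cr1 = sp.simplify(sp.diff(sigma, x) - sp.diff(gamma, y))  # 1 - (-1) = 2, fails
cr2 = sp.simplify(sp.diff(sigma, y) + sp.diff(gamma, x))  # 0, holds
print(cr1, cr2)
```

The nonzero value of the first Cauchy–Riemann residual confirms $x - iy$ is nowhere holomorphic.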
For the conditions for $\sigma\gamma$ to be harmonic, given that $\sigma$ and $\gamma$ are harmonic without using Maple, you can do the following (I am simply expanding the answer by user64494 here!):
$$\frac{\partial}{\partial x}(\sigma\gamma) = \frac{\partial\sigma}{\partial x}\gamma+\sigma\frac{\partial\gamma}{\partial x}$$
so that
$$\frac{\partial^2}{\partial x^2}(\sigma\gamma) = \frac{\partial^2\sigma}{\partial x^2}\gamma+\sigma\frac{\partial^2\gamma}{\partial x^2} + 2\frac{\partial\sigma}{\partial x}\frac{\partial\gamma}{\partial x}$$
Now do the same for $y$ and add the two results and use the fact that $\sigma$ and $\gamma$ are harmonic to get the result that $\sigma\gamma$ will be harmonic if
$$\frac{\partial\sigma}{\partial x}\frac{\partial\gamma}{\partial x} + \frac{\partial\sigma}{\partial y}\frac{\partial\gamma}{\partial y} = 0,$$
i.e. precisely when the gradients $\nabla\sigma$ and $\nabla\gamma$ are orthogonal at every point of $D$.
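The product rule for the Laplacian used in this derivation can be confirmed symbolically for arbitrary smooth $\sigma, \gamma$ (an added check, not part of the original answer):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
s = sp.Function('sigma')(x, y)   # arbitrary smooth sigma(x, y)
g = sp.Function('gamma')(x, y)   # arbitrary smooth gamma(x, y)

def laplacian(u):
    return sp.diff(u, x, 2) + sp.diff(u, y, 2)

# Verify: Delta(sigma*gamma)
#   = gamma*Delta(sigma) + sigma*Delta(gamma) + 2*grad(sigma).grad(gamma)
identity = sp.simplify(
    laplacian(s * g)
    - (g * laplacian(s) + s * laplacian(g)
       + 2 * (sp.diff(s, x) * sp.diff(g, x) + sp.diff(s, y) * sp.diff(g, y)))
)
print(identity)  # 0
```

Since the identity holds identically, when $\Delta\sigma = \Delta\gamma = 0$ the harmonicity of $\sigma\gamma$ reduces exactly to the gradient condition displayed above.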
Best Answer
First, $g$ can have only finitely many zeros, say $a_1,\ldots, a_n$. More importantly, $g(z)=(z-a_1)\cdots(z-a_n)\cdot h(z)$, where $h$ is an entire function with no zeros. But then $\left|\frac{f(z)}{h(z)}\right|<|(z-a_1)\cdots(z-a_n)|$ for every $|z|>1$; that is, $\frac{f}{h}$ is bounded by a polynomial. Thus $\frac{f}{h}$ is a polynomial, which shows both that $h$ must be a constant and that $f$ is a polynomial. Hence both $f$ and $g$ are polynomials.
Theorem. Let $f$ be an entire function and $n\in\mathbb{N}$ such that $|f|\le |z|^n$. Then $f$ must be a polynomial.
Proof. Let $f(z)=\sum_{k=0}^\infty a_k z^k$. We will show that the $a_k$ vanish eventually; in fact they vanish for $k>n$. We perform the following calculation ($n$ is fixed, $m\ge 1$ is arbitrary):
$$|a_{n+m}|=\left|\frac{f^{(n+m)}(0)}{(n+m)!}\right|=\lim_{R\to\infty} \left|\frac{1}{2\pi i} \oint_{|z|=R} \frac{f(z)}{z^{n+m+1}}\, dz\right|\le\lim_{R\to\infty} \frac{1}{2\pi} \max_{|z|=R}\left|\frac{f(z)}{z^{n+m+1}} \right|\cdot 2\pi R \le\lim_{R\to\infty} \frac{R^n}{R^{n+m+1}} \cdot R = \lim_{R\to\infty} R^{-m} = 0$$
Thus, $a_k=0$ for $k>n$ and so $f$ must be a polynomial.
For the general result, notice that any polynomial $p$ satisfies $|p(z)|\le M\cdot |z|^n$ for $|z|\ge 1$, for some $M\in\mathbb{R}$ and $n\in\mathbb{N}$.
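The coefficient formula driving the estimate above can be illustrated numerically (an added sketch, not part of the original answer; the degree-3 polynomial is an arbitrary example): computing $a_k = \frac{1}{2\pi i}\oint_{|z|=R} \frac{f(z)}{z^{k+1}}\,dz$ recovers the Taylor coefficients, and every coefficient beyond the degree comes out as zero.

```python
import numpy as np

# Recover Taylor coefficients a_k = (1/2*pi*i) \oint f(z)/z^{k+1} dz
# by discretizing the contour |z| = R. Substituting dz = i*z*dtheta turns
# the integral into a plain average of f(z)/z^k over theta.
def coefficient(f, k, R=2.0, N=4096):
    theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
    z = R * np.exp(1j * theta)
    return np.mean(f(z) / z**k)

f = lambda z: 3.0 + 2.0 * z + z**3   # example polynomial of degree 3
coeffs = [coefficient(f, k) for k in range(6)]
print(np.round(coeffs, 10))  # approximately [3, 2, 0, 1, 0, 0]
```

Coefficients $a_4, a_5, \ldots$ vanish (up to roundoff), mirroring the proof's conclusion that polynomial growth kills all high-order coefficients.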
Interestingly, using the real or imaginary version of the Cauchy integral formula for Taylor coefficients, we can extend this result to the real or imaginary part of a function. That is, if $f=u+iv$ and $|u|\le |p|$ for some polynomial $p$, then $f$ must be a polynomial. We can even (shockingly) drop the absolute values.
Theorem (Markushevich, Volume 2, Page 265). Let $f=u+iv$. Suppose that $u(z)\le|z|^n$ for some $n\in\mathbb{N}$; then $f$ must be a polynomial.