Newton's method applied directly to the optimality conditions may not converge on its own, so you need a merit function and a globalization mechanism (a line search or trust regions), but it's the right approach. Look up SQP methods in any good optimization textbook. In essence, what you'll do is write the KKT conditions
$$
\nabla f(x) + J(x)^T y = 0, \quad g(x) = 0,
$$
(where $J$ is the Jacobian of $g$ and $y$ collects the Lagrange multipliers), apply Newton's method:
$$
\begin{bmatrix}
H(x,y) & J(x)^T \\
J(x) & 0
\end{bmatrix}
\begin{bmatrix}
\Delta x \\ \Delta y
\end{bmatrix}
=
-
\begin{bmatrix}
\nabla f(x) + J(x)^T y \\ g(x)
\end{bmatrix},
$$
(here $H(x,y)$ denotes the Hessian of the Lagrangian with respect to $x$), and perform, say, a line search on a suitable merit function. There are a few more tricks, but this is the heart of the method, and it's often very successful. You could implement your own toy version of this, but I'd recommend using a solid implementation. For instance, IPOPT reduces to the above if your problem only has equality constraints.
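For concreteness, here is a toy Python version of that iteration on a tiny example problem. The quadratic objective, the $\ell_1$ merit function and its penalty weight `mu`, and the crude backtracking rule are all illustrative assumptions; a serious implementation would choose these much more carefully.

```python
import numpy as np

# Toy example (assumed): minimize x1^2 + x2^2  subject to  x1 + x2 = 1.
def f(x):
    return x @ x

def grad_f(x):
    return 2.0 * x

def hess_L(x, y):
    # Hessian of the Lagrangian w.r.t. x; constant for this quadratic toy.
    return 2.0 * np.eye(2)

def g(x):
    return np.array([x[0] + x[1] - 1.0])

def J(x):
    return np.array([[1.0, 1.0]])

def merit(x, mu=10.0):
    # l1 merit function: objective plus a penalty on constraint violation.
    return f(x) + mu * np.abs(g(x)).sum()

x, y = np.array([2.0, -3.0]), np.zeros(1)
for _ in range(50):
    r = np.concatenate([grad_f(x) + J(x).T @ y, g(x)])  # KKT residual
    if np.linalg.norm(r) < 1e-10:
        break
    # Assemble and solve the Newton (KKT) system displayed above.
    K = np.block([[hess_L(x, y), J(x).T], [J(x), np.zeros((1, 1))]])
    step = np.linalg.solve(K, -r)
    dx, dy = step[:2], step[2:]
    # Crude backtracking line search on the merit function.
    t = 1.0
    while merit(x + t * dx) > merit(x) - 1e-4 * t * np.linalg.norm(dx) ** 2:
        t *= 0.5
        if t < 1e-12:
            break
    x, y = x + t * dx, y + t * dy

print(x)  # converges to [0.5, 0.5]
```

On this quadratic example the full Newton step is accepted immediately and the iterate lands on the solution $(0.5, 0.5)$ in one step; the line search only earns its keep on genuinely nonlinear problems.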
Consider the optimization problem
$$\begin{array}{ll} \text{minimize} & f(x) + g(y)\\ \text{subject to} & xy \ge a\\ & x \ge 0\\ & y \ge 0\end{array}$$
where both $f$ and $g$ are convex quadratic functions and $a > 0$. The feasible region is convex and, since $a > 0$, also LMI-representable:
$$\{ (x,y) \in \mathbb R^2 : x \geq 0 \land y \geq 0 \land x y \geq a \} = \left\{ (x,y) \in \mathbb R^2 : \begin{bmatrix} x & \sqrt{a}\\ \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_2 \right\}$$
Indeed, a symmetric $2 \times 2$ matrix is positive semidefinite if and only if its diagonal entries and its determinant are nonnegative, which here yields exactly $x \geq 0$, $y \geq 0$, and $xy - a \geq 0$.
Hence, the original optimization problem can be rewritten as follows
$$\begin{array}{ll} \text{minimize} & f(x) + g(y)\\ \text{subject to} & \begin{bmatrix} x & \sqrt{a}\\ \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_2\end{array}$$
Introducing optimization variables $s, t \in \mathbb R$, we rewrite the optimization problem in epigraph form
$$\begin{array}{ll} \text{minimize} & s + t\\ \text{subject to} & f(x) \leq s\\ & g(y) \leq t\\ & \begin{bmatrix} x & \sqrt{a}\\ \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_2\end{array}$$
Let $f$ and $g$ be
$$f (x) := f_0 + f_1 x + f_2 x^2 \qquad\qquad\qquad g (y) := g_0 + g_1 y + g_2 y^2$$
where $f_2, g_2 > 0$ (to ensure convexity). The inequality constraints $f(x) \leq s$ and $g(y) \leq t$ can be written in LMI form as follows:
$$\begin{bmatrix} 1 & \sqrt{f_2} \, x\\ \sqrt{f_2} \, x & s - f_0 - f_1 x\end{bmatrix} \succeq \mathrm O_2$$
$$\begin{bmatrix} 1 & \sqrt{g_2} \, y\\ \sqrt{g_2} \, y & t - g_0 - g_1 y\end{bmatrix} \succeq \mathrm O_2$$
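To verify these, apply the same $2 \times 2$ criterion (nonnegative diagonal and determinant): the determinant condition for the first matrix reads
$$(s - f_0 - f_1 x) - \left(\sqrt{f_2}\, x\right)^2 \geq 0 \iff f_0 + f_1 x + f_2 x^2 \leq s \iff f(x) \leq s,$$
and analogously for the second.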
Besides the determinant conditions, these LMIs impose the diagonal inequalities $s - f_0 - f_1 x \geq 0$ and $t - g_0 - g_1 y \geq 0$, but these are redundant: each is implied by the corresponding determinant condition. Geometrically, the lines $s = f_0 + f_1 x$ and $t = g_0 + g_1 y$ are tangent to the graphs of $f$ and $g$ at $x = 0$ and $y = 0$, respectively.
Hence, we obtain a semidefinite program (SDP) in variables $x, y, s, t \in \mathbb R$
$$\begin{array}{ll} \text{minimize} & s + t\\ \text{subject to} & \begin{bmatrix} 1 & \sqrt{f_2} \, x & & & & \\ \sqrt{f_2} \, x & s - f_0 - f_1 x & & & & \\ & & 1 & \sqrt{g_2} \, y & & \\ & & \sqrt{g_2} \, y & t - g_0 - g_1 y & & \\ & & & & x & \sqrt{a}\\ & & & & \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_6\end{array}$$
which can be solved numerically using any SDP solver.
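For illustration, here is how this might look in CVXPY; the coefficient values below are made-up assumptions, and the three $2 \times 2$ LMIs are jointly equivalent to the block-diagonal $6 \times 6$ LMI above.

```python
from math import sqrt
import cvxpy as cp

# Illustrative coefficients (assumptions, not from the derivation above):
f0, f1, f2 = 1.0, -2.0, 1.0   # f(x) = f0 + f1*x + f2*x^2, f2 > 0
g0, g1, g2 = 0.0, 1.0, 2.0    # g(y) = g0 + g1*y + g2*y^2, g2 > 0
a = 4.0                       # xy >= a with a > 0

x, y, s, t = cp.Variable(), cp.Variable(), cp.Variable(), cp.Variable()

constraints = [
    # f(x) <= s as a 2x2 LMI (the construction above).
    cp.bmat([[1.0, sqrt(f2) * x],
             [sqrt(f2) * x, s - f0 - f1 * x]]) >> 0,
    # g(y) <= t, same construction.
    cp.bmat([[1.0, sqrt(g2) * y],
             [sqrt(g2) * y, t - g0 - g1 * y]]) >> 0,
    # x >= 0, y >= 0, xy >= a.
    cp.bmat([[x, sqrt(a)],
             [sqrt(a), y]]) >> 0,
]

prob = cp.Problem(cp.Minimize(s + t), constraints)
prob.solve()  # needs an SDP-capable solver, e.g. the bundled SCS
print(prob.status, x.value, y.value, prob.value)
```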
Best Answer
The methods to use depend heavily on the character of $f$. If it is non-convex, there are many algorithms to choose from; see this MO post, for instance.
You might also wish to look into stochastic search methods, such as genetic algorithms and simulated annealing. These algorithms are often much slower, but they can sometimes "bump" you out of local extrema. They are also fairly easy to implement.
You can also hybridize approaches: combine an evolutionary algorithm with a standard convex optimization approach on a locally convex subdomain.
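As one concrete sketch of this hybrid idea, SciPy's `basinhopping` alternates random perturbations (the "bump") with a standard local minimizer; the multimodal objective below is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import basinhopping

def f(x):
    # Illustrative multimodal objective: a convex bowl plus ripples,
    # so a plain local method gets stuck in local minima.
    return np.sum(x ** 2) + 3.0 * np.sum(np.sin(5.0 * x) ** 2)

x0 = np.full(4, 2.0)
# Each basinhopping iteration takes a random step, then runs a local
# gradient-based minimization (here L-BFGS-B) from the perturbed point.
result = basinhopping(f, x0, niter=200,
                      minimizer_kwargs={"method": "L-BFGS-B"})
print(result.x, result.fun)
```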
Finally, an equality constraint essentially reduces the dimensionality of your problem by one: one variable is completely determined by the others. For instance, if the constraint is $\sum_{i=1}^n x_i = B$, then
$$x_k = B - \sum_{i=1,\ i\neq k}^n x_i.$$
Then, depending on the character of your function, you can apply essentially any unconstrained algorithm to the remaining $n - 1$ variables; a sketch follows.
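A minimal sketch of this elimination in Python, assuming the equality constraint $\sum_{i=1}^n x_i = B$ and an illustrative stand-in objective:

```python
import numpy as np
from scipy.optimize import minimize

B, n = 10.0, 5   # illustrative values for the constraint sum(x) = B

def f(x):
    # Illustrative objective over the full vector x (an assumption).
    return np.sum((x - 1.0) ** 2) + np.prod(np.cos(x))

def f_reduced(z):
    # Eliminate the last variable via the constraint: x_n = B - sum(z).
    x = np.append(z, B - np.sum(z))
    return f(x)

res = minimize(f_reduced, x0=np.full(n - 1, B / n), method="BFGS")
x_opt = np.append(res.x, B - np.sum(res.x))
print(x_opt, x_opt.sum())  # the sum equals B by construction
```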
But to answer your questions: (1) there are many common and smart approaches, but they depend on your function $f$; I would start with the simplest, most conventional one and see whether it is effective. (2) I don't think you're missing anything. Numerical optimization is a big topic, and there are many ways to go about it.