Just an initial remark: if you want computational speed, you usually have to sacrifice accuracy; "more accuracy" = "more time" in general. Anyway, here is a second-order approximation, which should improve on the "crude" approximation you suggested in your comment above:
$$E\Bigg(\frac{X_{j}}{\sum_{i}X_{i}}\Bigg)\approx
\frac{E[X_{j}]}{E\big[\sum_{i}X_{i}\big]}
-\frac{\operatorname{cov}\big[\sum_{i}X_{i},X_{j}\big]}{E\big[\sum_{i}X_{i}\big]^2}
+\frac{E[X_{j}]\operatorname{Var}\big[\sum_{i}X_{i}\big]}{E\big[\sum_{i}X_{i}\big]^3}
$$
For independent $X_{i}\sim\Gamma(\alpha_{i},\beta_{i})$ (rate parameterization, so $E[X_{i}]=\alpha_{i}/\beta_{i}$ and $\operatorname{Var}[X_{i}]=\alpha_{i}/\beta_{i}^{2}$, and $\operatorname{cov}[\sum_{i}X_{i},X_{j}]=\operatorname{Var}[X_{j}]$), this becomes
$$= \frac{\alpha_{j}}{\sum_{i} \frac{\beta_{j}}{\beta_{i}}\alpha_{i}}\times\Bigg[1 - \frac{1}{\sum_{i} \frac{\beta_{j}}{\beta_{i}}\alpha_{i}}
+ \frac{\sum_{i} \frac{\alpha_{i}}{\beta_{i}^{2}}}{\Big(\sum_{i} \frac{\alpha_{i}}{\beta_{i}}\Big)^{2}}\Bigg]
$$
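As a quick sanity check (not part of the derivation), here is a small Monte Carlo comparison in Python; the parameter values are arbitrary illustrations, and the $X_i$ are assumed independent with rate parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([2.0, 3.0, 5.0])   # shape parameters alpha_i (illustrative)
beta = np.array([1.0, 2.0, 0.5])    # rate parameters beta_i (illustrative)
j = 0                                # index of the numerator variable

# Monte Carlo estimate of E[X_j / sum_i X_i].
X = rng.gamma(shape=alpha, scale=1.0 / beta, size=(1_000_000, 3))
mc = np.mean(X[:, j] / X.sum(axis=1))

# Second-order approximation: by independence, cov[sum_i X_i, X_j] = Var[X_j].
m = alpha / beta          # E[X_i] = alpha_i / beta_i
v = alpha / beta**2       # Var[X_i] = alpha_i / beta_i^2
S, V = m.sum(), v.sum()   # E[sum X_i] and Var[sum X_i]
approx = m[j] / S - v[j] / S**2 + m[j] * V / S**3

print(mc, approx)         # the two numbers should agree to a few decimals
```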
EDIT: An explanation for the above expansion was requested. The short answer is Wikipedia. The long answer is given below.
Write $f(x,y)=\frac{x}{y}$. Now we need all the second-order derivatives of $f$. The first-order derivatives will "cancel" because they all involve multiples of $X-E(X)$ and $Y-E(Y)$, which both have expectation zero.
$$\frac{\partial^2 f}{\partial x^2}=0$$
$$\frac{\partial^2 f}{\partial x \partial y}=-\frac{1}{y^2}$$
$$\frac{\partial^2 f}{\partial y^2}=2\frac{x}{y^3}$$
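If you would rather not grind these out by hand, a short symbolic check (sympy assumed available, purely for verification) confirms all three:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x / y

print(sp.diff(f, x, 2))   # -> 0
print(sp.diff(f, x, y))   # -> -1/y**2
print(sp.diff(f, y, 2))   # -> 2*x/y**3
```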
And so the Taylor series up to second order is given by:
$$\frac{x}{y} \approx \frac{\mu_x}{\mu_y}+\frac{1}{2}\Bigg(-\frac{1}{\mu_y^2}2(x-\mu_x)(y-\mu_y) + 2\frac{\mu_x}{\mu_y^3}(y-\mu_y)^2 \Bigg)$$
Taking expectations yields:
$$E\Big[\frac{x}{y}\Big] \approx \frac{\mu_x}{\mu_y}-\frac{1}{\mu_y^2}E\Big[(x-\mu_x)(y-\mu_y)\Big] + \frac{\mu_x}{\mu_y^3}E\Big[(y-\mu_y)^2\Big]$$
This is the answer I gave (although I initially forgot the minus sign in the second term).
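For what it's worth, the generic expansion can also be sanity-checked without any Gamma assumptions; here is a toy example with a correlated bivariate normal pair, with the means chosen so the denominator stays safely away from zero:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([3.0, 10.0])            # means (mu_x, mu_y); mu_y far from 0
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])        # covariance matrix (toy values)

xy = rng.multivariate_normal(mu, Sigma, size=1_000_000)
mc = np.mean(xy[:, 0] / xy[:, 1])     # Monte Carlo estimate of E[x/y]

mux, muy = mu
approx = mux / muy - Sigma[0, 1] / muy**2 + mux * Sigma[1, 1] / muy**3
print(mc, approx)                     # should agree closely
```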
The sum of $n$ independent Gamma random variables $\sim \Gamma(t_i, \lambda)$ is a Gamma random variable $\sim \Gamma\left(\sum_i t_i, \lambda\right)$. It does not matter whether the second parameter is the scale or its inverse (the rate), as long as all $n$ random variables share the same second parameter. This idea extends readily to $\chi^2$ random variables, which are a special case of Gamma random variables.
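This closure property is easy to check empirically; a minimal sketch with illustrative parameters, using scipy's Kolmogorov-Smirnov test only as a convenience:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
t = np.array([1.5, 2.0, 0.7])   # shape parameters t_i (illustrative)
lam = 3.0                        # common rate parameter lambda

# Sum of three independent Gamma(t_i, lam) draws, rate parameterization.
s = rng.gamma(shape=t, scale=1.0 / lam, size=(100_000, 3)).sum(axis=1)

# KS test against Gamma(sum_i t_i, lam): a large p-value is expected.
print(stats.kstest(s, stats.gamma(a=t.sum(), scale=1.0 / lam).cdf))
```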
Best Answer
This one (maybe surprisingly) can be done with easy elementary operations (employing Richard Feynman's favorite trick of differentiating under the integral sign with respect to a parameter).
We are supposing $X$ has a $\Gamma(\alpha,\beta)$ distribution and we wish to find the expectation of $Y=\log(X).$ First, because $\beta$ is a scale parameter, its effect will be to shift the logarithm by $\log\beta.$ (If you use $\beta$ as a rate parameter, as in the question, it will shift the logarithm by $-\log\beta.$) This permits us to work with the case $\beta=1.$
After this simplification, the probability element of $X$ is
$$f_X(x) = \frac{1}{\Gamma(\alpha)} x^\alpha e^{-x} \frac{\mathrm{d}x}{x}$$
where $\Gamma(\alpha)$ is the normalizing constant
$$\Gamma(\alpha) = \int_0^\infty x^\alpha e^{-x} \frac{\mathrm{d}x}{x}.$$
Substituting $x=e^y,$ which entails $\mathrm{d}x/x = \mathrm{d}y,$ gives the probability element of $Y$,
$$f_Y(y) = \frac{1}{\Gamma(\alpha)} e^{\alpha y - e^y} \mathrm{d}y.$$
The possible values of $Y$ now range over all the real numbers $\mathbb{R}.$
Because $f_Y$ must integrate to unity, we obtain (trivially)
$$\Gamma(\alpha) = \int_\mathbb{R} e^{\alpha y - e^y} \mathrm{d}y.\tag{1}$$
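Identity $(1)$ is also easy to confirm numerically if you want reassurance; a throwaway check with scipy (assumed available), using an arbitrary test value of $\alpha$:

```python
import numpy as np
from scipy import integrate, special

alpha = 2.5  # arbitrary test value

# Integrate exp(alpha*y - e^y) over the whole real line and compare to Gamma(alpha).
val, err = integrate.quad(lambda y: np.exp(alpha * y - np.exp(y)), -np.inf, np.inf)
print(val, special.gamma(alpha))  # both approximately 1.32934
```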
Notice $f_Y(y)$ is a differentiable function of $\alpha.$ An easy calculation gives
$$\frac{\mathrm{d}}{\mathrm{d}\alpha}e^{\alpha y - e^y} \mathrm{d}y = y\, e^{\alpha y - e^y} \mathrm{d}y = \Gamma(\alpha) y\,f_Y(y).$$
The next step exploits the relation obtained by dividing both sides of this identity by $\Gamma(\alpha),$ thereby exposing the very object we need to integrate to find the expectation; namely, $y f_Y(y):$
$$\eqalign{ \mathbb{E}(Y) &= \int_\mathbb{R} y\, f_Y(y) = \frac{1}{\Gamma(\alpha)} \int_\mathbb{R} \frac{\mathrm{d}}{\mathrm{d}\alpha}e^{\alpha y - e^y} \mathrm{d}y \\ &= \frac{1}{\Gamma(\alpha)} \frac{\mathrm{d}}{\mathrm{d}\alpha}\int_\mathbb{R} e^{\alpha y - e^y} \mathrm{d}y\\ &= \frac{1}{\Gamma(\alpha)} \frac{\mathrm{d}}{\mathrm{d}\alpha}\Gamma(\alpha)\\ &= \frac{\mathrm{d}}{\mathrm{d}\alpha}\log\Gamma(\alpha)\\ &=\psi(\alpha), }$$
the logarithmic derivative of the gamma function (aka the "digamma" function). The integral was computed using identity $(1).$
Re-introducing the factor $\beta$ shows the general result is
$$\mathbb{E}(\log(X)) = \log\beta + \psi(\alpha)$$
for a scale parameterization (where the density function depends on $x/\beta$) or
$$\mathbb{E}(\log(X)) = -\log\beta + \psi(\alpha)$$
for a rate parameterization (where the density function depends on $x\beta$).
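Both versions are easy to verify by simulation; here is a minimal check of the rate parameterization with arbitrary shape and rate values (numpy and scipy assumed available):

```python
import numpy as np
from scipy import special

rng = np.random.default_rng(3)
alpha, beta = 4.0, 2.0  # shape and rate (arbitrary test values)

# Draw from Gamma(alpha) with rate beta, i.e. scale 1/beta.
x = rng.gamma(shape=alpha, scale=1.0 / beta, size=1_000_000)

print(np.mean(np.log(x)))                      # Monte Carlo estimate of E[log X]
print(special.digamma(alpha) - np.log(beta))   # -log(beta) + psi(alpha)
```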