OK, now that you have updated your question to include the two formulas:
$$P(A\mid B) = \frac{P(A\cap B)}{P(B)} ~~ \text{provided that }
P(B) > 0, \tag{1}$$
is the definition of the conditional probability of $A$ given that
$B$ occurred. Similarly,
$$P(B\mid A) = \frac{P(B\cap A)}{P(A)} = \frac{P(A\cap B)}{P(A)} ~~ \text{provided that }
P(A) > 0, \tag{2}$$
is the definition of the conditional probability of $B$ given that
$A$ occurred. Now, it is true that it is a trivial matter to
substitute the value of $P(A\cap B)$ from $(2)$ into $(1)$ to
arrive at
$$P(A\mid B) = \frac{P(B\mid A)P(A)}{P(B)} ~~ \text{provided that }
P(A), P(B) > 0, \tag{3}$$
which is Bayes' formula. But notice that Bayes' formula actually connects two different conditional probabilities $P(A\mid B)$
and $P(B\mid A)$, and is essentially a formula for "turning the
conditioning around". The Reverend Thomas Bayes referred to this
in terms of "inverse probability", and even today there is
vigorous debate as to whether statistical inference should be
based on $P(B\mid A)$ or the inverse probability $P(A\mid B)$ (called
the a posteriori or posterior probability).
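If you want to see the "turning around" in $(3)$ concretely, here is a quick numerical check in Python; all of the probabilities are made-up numbers chosen only for illustration:

```python
# Quick numerical check of formulas (1)-(3) with made-up numbers:
# suppose P(A) = 0.3, P(B) = 0.4, and P(A and B) = 0.12.
p_a, p_b, p_ab = 0.3, 0.4, 0.12

p_a_given_b = p_ab / p_b              # formula (1)
p_b_given_a = p_ab / p_a              # formula (2)

# formula (3): recover P(A|B) from P(B|A) by "turning the conditioning around"
p_a_given_b_via_bayes = p_b_given_a * p_a / p_b

print(p_a_given_b, p_a_given_b_via_bayes)  # both 0.3 (up to floating point)
```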
It is undoubtedly as galling to you as it was to me when I first
discovered that Bayes' formula was just a trivial substitution of
$(2)$ into $(1)$. Perhaps if you had been born 250 years ago,
you (note: the OP was posting under the username AlphaBetaGamma when I wrote this answer but has since changed it) could have made the substitution, and then
people today would be talking about the AlphaBetaGamma formula, the
AlphaBetaGammian heresy, and the Naive AlphaBetaGamma method$^*$ instead
of invoking Bayes' name everywhere. So
let me console you on your loss of fame by pointing out a different
version of Bayes' formula. The Law of Total Probability says
that
$$P(B) = P(B\mid A)P(A) + P(B\mid A^c)P(A^c) \tag{4}$$
and using this, we can write $(3)$ as
$$P(A\mid B) = \frac{P(B\mid A)P(A)}{P(B\mid A)P(A) + P(B\mid A^c)P(A^c)}, \tag{5}$$
or more generally as
$$P(A_i\mid B) = \frac{P(B\mid A_i)P(A_i)}{P(B\mid A_1)P(A_1) + P(B\mid A_2)P(A_2) + \cdots + P(B\mid A_n)P(A_n)}, \tag{6}$$
where $A_1, A_2, \ldots, A_n$ are mutually exclusive events whose
union is the entire sample space, and the posterior probability of a possible "cause" $A_i$ of a
"datum" $B$ is related to $P(B\mid A_i)$, the likelihood of the
observation $B$ when $A_i$ is the true hypothesis, and $P(A_i)$, the prior probability
(horrors!) of the hypothesis $A_i$.
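As a concrete illustration of $(6)$, here is a short Python sketch; the priors and likelihoods for the three mutually exclusive causes are hypothetical numbers, not taken from any real problem:

```python
# Posterior over three mutually exclusive "causes" A_1, A_2, A_3 of a datum B,
# computed via formula (6). All numbers are made up for illustration.
priors = [0.5, 0.3, 0.2]        # P(A_i), the prior probabilities
likelihoods = [0.1, 0.6, 0.9]   # P(B | A_i), the likelihoods

# denominator of (6): the Law of Total Probability, as in (4)
p_b = sum(l * p for l, p in zip(likelihoods, priors))

posteriors = [l * p / p_b for l, p in zip(likelihoods, priors)]
print(posteriors)               # P(A_i | B); the three values sum to 1
```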
$^*$ There is a famous paper, R. Alpher, H. Bethe, and
G. Gamow, "The Origin of Chemical Elements", Physical Review, April 1, 1948, that is commonly referred to as
the $\alpha\beta\gamma$ paper.
In general, you use Bayes factors to determine the odds of one hypothesis relative to another. Computing each of the two probabilities requires calculating an integral (or a sum, if you put prior probability on a finite number of points).
You say you do not have priors; actually, you do, but they aren't showing up because they are constants with respect to the parameter(s) you place priors over, so the prior in the second integration is uniform on $(0,1)$.
The integrals you have numerically calculated integrate out the parameter. In general, a Bayes factor integrates out the uncertainty in the parameter, and the priors quantify that uncertainty. In the code you have written, you marginalize the binomial probability by placing a prior on the parameter $p$ and integrating over that parameter.
Both priors that you have written are priors conditional on hypotheses; call these hypotheses $M_1$ and $M_2$. Now consider the formula:
$$K = \frac{\Pr(D|M_1)}{\Pr(D|M_2)} = \frac{\int \Pr(p_1|M_1)\Pr(D|p_1,M_1)\,dp_1}{\int \Pr(p_2|M_2)\Pr(D|p_2,M_2)\,dp_2}.$$
The first entries in the numerator and the denominator are your priors, and these priors are conditional on a hypothesis (a Beta distribution with specified hyperparameters). Then, to get a posterior ratio of the two hypotheses, you specify priors on $M_1$ and $M_2$; if these are the same number ($1/2$ each), the posterior odds equal your Bayes factor $K$.
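To make this concrete, here is a Python sketch of $K$ for binomial data. The two Beta priors and the data are hypothetical choices of mine, not the OP's actual code:

```python
from scipy import integrate, stats

# Hypothetical setup: under M1, p ~ Beta(2, 2); under M2, p ~ Beta(1, 1)
# (uniform). The data D are k successes in n binomial trials (made up).
k, n = 7, 10

def marginal_likelihood(a, b):
    """Pr(D | M): integrate Pr(D | p) against the Beta(a, b) prior on p."""
    integrand = lambda p: stats.binom.pmf(k, n, p) * stats.beta.pdf(p, a, b)
    value, _ = integrate.quad(integrand, 0.0, 1.0)
    return value

K = marginal_likelihood(2, 2) / marginal_likelihood(1, 1)
print(K)  # Bayes factor of M1 against M2
```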
Best Answer
I think you are making a mistake in the denominator of the former: $P(A) = \sum_B P(A|B) P(B)$.
Since rushing made me write something kind of terrible, I'll pay some penance by writing out a fuller solution!
The Bayes Approach to the Monty Hall Problem
In general I think you are making the Monty Hall problem a little more confusing by omitting the player's choice. The player chooses an arbitrary door, so without loss of generality let's call it door 1. I believe this is what you did, but let's be a bit more explicit. Let $H$ be the door hiding the grand prize and $D$ the door Monty opens. Next we write out the probability:
$P(H=1|D=3) = \frac{P(H=1) P(D=3|H=1)}{P(D=3)} $
We can calculate the numerator: $P(H=1) = \frac{1}{3}$ since each door is equally likely to hide the prize, and $P(D=3|H=1) = \frac{1}{2}$ since if $H=1$ Monty can open either of doors 2 and 3 without showing the grand prize. Moving on:
$P(H=1|D=3) = \frac{\frac{1}{3} \cdot \frac{1}{2}}{P(D=3|H=1)P(H=1)+P(D=3|H=2)P(H=2)+P(D=3|H=3)P(H=3)}$
Next we have a set of terms to consider: $P(H=i) = \frac{1}{3}$ for the same reason as above. Now for the conditionals: $P(D=3|H=1)=\frac{1}{2}$ as above; $P(D=3|H=2) = 1$ because Monty can't show door 1 (the player picked it) and can't show door 2 (it hides the grand prize), so he must show door 3; and $P(D=3|H=3) = 0$ because Monty can't show the prize. Thus:
$P(H=1|D=3)= \frac{\frac{1}{3} \cdot \frac{1}{2}}{\frac{1}{3}\left(\frac{1}{2}+ 1+ 0\right)} = \frac{\frac{1}{6}}{\frac{1}{2}}=\frac{1}{3}$
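If you'd like to double-check this by brute force, here is a quick Monte Carlo sketch in Python (my own simulation, not part of the original argument):

```python
import random

# Simulate the standard game: the player always picks door 1, Monty opens a
# door that is neither the player's pick nor the prize door, and we condition
# on Monty opening door 3.
trials, shown3, hits = 100_000, 0, 0
for _ in range(trials):
    prize = random.choice([1, 2, 3])
    monty = random.choice([d for d in (2, 3) if d != prize])
    if monty == 3:
        shown3 += 1
        hits += (prize == 1)

print(hits / shown3)  # approximately 1/3, matching P(H=1|D=3) above
```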
Lonely Monty Hall Problem
This is not to be interpreted as the standard Monty Hall problem. Say that we revisit the problem without a player. There are again three doors, one of which has a grand prize behind it. Monty will choose and open a door that doesn't have the grand prize behind it, and we are tasked with evaluating the probability that the grand prize is behind door 1 given that he opens door 3.
The original write-up of the question doesn't explicitly include the player, so this is a fair interpretation.
Calculation via Bayes Theorem:
Let's start by just writing out the expression:
$P(H=1|D=3) = \frac{P(H=1) P(D=3|H=1)}{P(D=3)}$
Working from here, $P(H=1) = \frac{1}{3}$ and $P(D=3|H=1)=\frac{1}{2}$ as above. We then expand the denominator:
$P(D=3) = \sum_i P(D=3|H=i) P(H=i)$
Now what's different is that the player has not made a choice of door, so we need to recalculate the conditional probabilities: $P(D=3|H=1) = \frac{1}{2}$ since Monty has two choices, open door 2 or door 3; similarly, $P(D=3|H=2) = \frac{1}{2}$ since he can open door 1 or door 3; finally, $P(D=3|H=3) = 0$ since Monty cannot reveal the prize.
Overall this gives:
$P(H=1|D=3) = \frac{\frac{1}{3} \cdot \frac{1}{2}}{\frac{1}{3} \cdot \frac{1}{2}+\frac{1}{3} \cdot \frac{1}{2}+\frac{1}{3} \cdot 0} = \frac{1}{2}$
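The corresponding simulation for the lonely variant, again just a sketch to confirm the arithmetic: Monty now chooses uniformly among all doors that don't hide the prize.

```python
import random

# Lonely variant: no player, so Monty opens any non-prize door at random.
# We again condition on him opening door 3.
trials, shown3, hits = 100_000, 0, 0
for _ in range(trials):
    prize = random.choice([1, 2, 3])
    monty = random.choice([d for d in (1, 2, 3) if d != prize])
    if monty == 3:
        shown3 += 1
        hits += (prize == 1)

print(hits / shown3)  # approximately 1/2
```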