Using Bayes' rule.
$F$: the event that you picked the fair coin
$B$: the event that you picked the biased coin
$D$: the data collected, i.e., observing 3 heads in 3 tosses
We want to calculate $P(F|D)$ and $P(B|D)$.
Assume that the tosses are independent and that the biased coin always lands heads. This gives:
$P(D|F)=(1/2)^3=1/8$
$P(D|B)=1$
Let us also assume that $P(F)=P(B)=0.5$, i.e., you pick a coin at random with equal chances of picking either one.
Apply Bayes' theorem, which starts from:
$P(B|D)P(D)=P(D|B)P(B)$
Divide both sides by $P(D)$ and expand $P(D)$ using the law of total probability:
$P(D)=P(D|F)P(F)+P(D|B)P(B) = \frac{1}{8}\cdot\frac{1}{2}+1\cdot\frac{1}{2} = \frac{9}{16}$
$P(B|D) = \dfrac{P(D|B)P(B)}{P(D|F)P(F)+P(D|B)P(B)}$
This gives $P(B|D)=\dfrac{8}{9}$ for the biased coin.
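To check the arithmetic, here is a minimal Python sketch of the calculation above (the variable names are my own, not from the answer):

```python
# Posterior probability of the biased coin after observing 3 heads,
# assuming a fair coin and a two-headed biased coin picked with equal priors.
p_D_given_F = (1 / 2) ** 3  # three heads in three independent fair tosses
p_D_given_B = 1.0           # the biased coin always shows heads
p_F = p_B = 0.5             # equal prior chance of picking either coin

p_D = p_D_given_F * p_F + p_D_given_B * p_B  # law of total probability
p_B_given_D = p_D_given_B * p_B / p_D        # Bayes' theorem

print(p_B_given_D)  # 0.888... = 8/9
```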
By constant do you mean nonrandom? If there is a deterministic bound $n$ on the number of flips you need, then your coin is a random variable $X$ defined on $\{0,1\}^n$, and necessarily $p = P(X=1) = 2^{-n}\,\mathrm{Card}\{\omega \in \{0,1\}^n : X(\omega) = 1\}$ is a multiple of $2^{-n}$, so the binary representation of $p$ terminates.
Regarding your randomized algorithm, when $p$ is non-dyadic it is optimal in expectation, as explained by Lumbroso in Appendix B of this paper: https://arxiv.org/pdf/1304.1916.pdf (Lumbroso makes a small mistake: his computation does not work when $p$ is dyadic, in which case you can stop the procedure after at most $n$ steps for a small gain).
His optimality bound comes from the wonderful results of Knuth and Yao on the optimal generation of general discrete random variables from coin tosses (which they call DDG trees). I put the reference below; you can find it on libgen, and it is a real gem and a great read. Lumbroso's paper also describes a neat optimal way to generate discrete uniform variables.
If you want many independent samples of your biased coin, you can reduce the cost to as little as $H(p) = p \log_2(1/p) + (1-p)\log_2(1/(1-p))$ expected bits per sample by reusing randomness from one sample to the next: https://arxiv.org/pdf/1502.02539.pdf. Here $H(p)$ is the Shannon entropy of the distribution and is a hard lower bound. Lumbroso also does this in his paper for the discrete uniform case.
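For reference, here is a small Python sketch of the entropy bound (my own illustration; base-2 logarithms, since the bound is measured in expected bits):

```python
from math import log2

def shannon_entropy(p: float) -> float:
    """H(p) in bits: the hard lower bound on expected flips per sample."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic coin needs no randomness
    return p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))

print(shannon_entropy(0.3))  # ~0.881 bits per sample
```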
Knuth, Donald E.; Yao, Andrew C., The complexity of nonuniform random number generation, Algorithms and Complexity, Proc. Symp., Pittsburgh 1976, 357-428 (1976). ZBL0395.65004.
Best Answer
Yes there is. Suppose you flip a coin repeatedly, generating an infinite sequence of heads and tails $f_1,f_2,\ldots$, representing tails as $0$ and heads as $1$. If you use those flips as binary digits and let $$ f=0.f_1f_2f_3\ldots $$ then the probability that $f<p$ is exactly $p$.
In principle this could take infinitely many flips to determine, but if you are satisfied with finding an event of probability $p$ when $p=\frac{k}{2^n}$, then deciding whether $f<p$ takes at most $n$ flips.
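As a concrete sketch of this procedure (my own illustration, not part of the answer; the function name and the use of `random.getrandbits` are my choices), the following Python compares $f$ against the binary digits of $p$ one flip at a time and stops as soon as $f<p$ is decided:

```python
import random

def biased_coin(p: float) -> bool:
    """Return True with probability p (0 < p < 1) using only fair coin flips.

    Compares the random binary fraction f = 0.f1 f2 f3 ... against the
    binary expansion of p, digit by digit, stopping at the first digit
    where they differ.
    """
    while True:
        # Extract the next binary digit of p: double it, take the integer part.
        p *= 2
        digit = 1 if p >= 1 else 0
        p -= digit
        flip = random.getrandbits(1)  # one fair coin flip
        if flip != digit:
            # The first differing digit decides: f < p iff flip < digit.
            return flip < digit
        if p == 0:
            # p is dyadic and its remaining digits are all 0, so f >= p.
            return False

# Quick check: the empirical frequency should be close to 0.3.
print(sum(biased_coin(0.3) for _ in range(100_000)) / 100_000)
```

Each flip decides the comparison with probability $1/2$, so the expected number of flips is at most $2$ regardless of $p$; for dyadic $p$ the `p == 0` test stops after at most $n$ flips, matching the remark above. (A floating-point $p$ is itself dyadic, so the loop always terminates.)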