If a random variable $x$ has a distribution $X$, then a function $y=f(x)$ will, generally speaking, have a different distribution $Y$ derived from $X$. You are confusing distributions with variables.
So a positive-valued $x$ has a log-normal distribution iff the new variable $y=\ln(x)$ has a normal distribution $N(\mu,\sigma)$.
The log-normal density itself has a closed analytical form:
$$
Y(x;\mu,\sigma) = \frac{1}{x \sigma \sqrt{2 \pi}}\, e^{-\frac{(\ln x - \mu)^2}{2\sigma^2}},\ \ x>0
$$
where $\mu$ and $\sigma^2$ are, respectively, the mean and variance of the corresponding normal distribution (see the change-of-variables rule for densities).
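As a quick numerical sanity check, here is a minimal NumPy sketch (the parameters $\mu=0$, $\sigma=0.5$ are my own illustrative choice, not from the question): it draws normal samples $y$, exponentiates them, and compares an empirical histogram of $x=e^y$ against the analytical density above.

```python
import numpy as np

# Illustrative check: if y = ln(x) ~ N(mu, sigma), then x should follow
# the log-normal density Y(x; mu, sigma) given above.
# (mu = 0.0 and sigma = 0.5 are assumed example values.)
rng = np.random.default_rng(0)
mu, sigma = 0.0, 0.5

y = rng.normal(mu, sigma, size=1_000_000)  # normal samples
x = np.exp(y)                              # log-normal samples

def lognormal_pdf(x, mu, sigma):
    """Analytical log-normal density for x > 0."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (
        x * sigma * np.sqrt(2 * np.pi)
    )

# Compare an empirical histogram with the analytical density.
hist, edges = np.histogram(x, bins=200, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - lognormal_pdf(centers, mu, sigma))))  # small
```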
Using Bayes' rule, the posterior distribution of the variance is given by the product of the likelihood and the prior on the variance, divided by the marginal likelihood:
$$p(\sigma^2|t_1,t_2,\cdots,t_n)=\frac{p(t_1,t_2,\cdots,t_n|\sigma^2)p(\sigma^2)}{ p(t_1,t_2,\cdots,t_n)}=\frac{p(t_1,t_2,\cdots,t_n|\sigma^2)p(\sigma^2)}{\int p(t_1,t_2,\cdots,t_n|\sigma^2)p(\sigma^2)d(\sigma^2)} $$
As the observed values $t_i$ are i.i.d. zero-mean Gaussian, the likelihood function $p(t_1,t_2,\cdots,t_n|\sigma^2)$ is given by
$$p(t_1,t_2,...,t_n|\sigma^2)=\prod_{i=1}^n\frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{t_i^2}{2\sigma^2}\right)=\frac{1}{\sigma^n(2\pi)^{n/2}}\exp\left(-\frac{1}{2\sigma^2}\sum_{i=1}^nt_i^2\right)$$
while the prior on the variance is simply the following probability mass function
$$p(\sigma^2=1)=\frac{1}{2},\ p(\sigma^2=4)=\frac{1}{2}$$
Given the discrete nature of the prior of the variance, the marginal likelihood simplifies to
$$\begin{align}p(t_1,t_2,\cdots,t_n)&=p(t_1,t_2,\cdots,t_n|\sigma^2=1)p(\sigma^2=1)+p(t_1,t_2,\cdots,t_n|\sigma^2=4)p(\sigma^2=4)\\&=\frac{1}{2}\left[\frac{1}{(2\pi)^{n/2}}\exp\left(-\frac{1}{2}\sum_{i=1}^nt_i^2\right)+\frac{1}{2^n(2\pi)^{n/2}}\exp\left(-\frac{1}{8}\sum_{i=1}^nt_i^2\right)\right]\end{align}$$
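For concreteness, here is a small Python sketch of this posterior computation; the synthetic sample `t` (drawn with $\sigma^2=4$) and its size are my own illustrative assumptions, not part of the original problem.

```python
import numpy as np

# Sketch of the two-point posterior above; the data t and its size
# are assumed example values (drawn here with true sigma^2 = 4).
rng = np.random.default_rng(1)
t = rng.normal(0.0, 2.0, size=20)

def log_likelihood(t, var):
    """log p(t_1,...,t_n | sigma^2) for i.i.d. zero-mean Gaussians."""
    n = len(t)
    return -0.5 * n * np.log(2 * np.pi * var) - np.sum(t ** 2) / (2 * var)

# Discrete prior: p(sigma^2 = 1) = p(sigma^2 = 4) = 1/2.
log_joint = {v: log_likelihood(t, v) + np.log(0.5) for v in (1.0, 4.0)}

# Marginal likelihood is the two-term sum; normalize in log space
# for numerical stability.
m = max(log_joint.values())
evidence = sum(np.exp(lj - m) for lj in log_joint.values())
posterior = {v: np.exp(lj - m) / evidence for v, lj in log_joint.items()}
print(posterior)  # posterior mass on sigma^2 = 1 and sigma^2 = 4
```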
To use the MAP rule to decide that the samples were drawn from the distribution with variance $1$, the following inequality must hold (see Equations $3.4$ and $3.5$ here):
$$\frac{p(t_1,t_2\cdots,t_n|\sigma^2=4)p(\sigma^2=4)}{p(t_1,t_2,\cdots,t_n)}<\frac{p(t_1,t_2\cdots,t_n|\sigma^2=1)p(\sigma^2=1)}{p(t_1,t_2,\cdots,t_n)}$$
which, since the prior puts equal mass on both values and the marginal likelihood is common to both sides, simplifies to
$$p(t_1,t_2\cdots,t_n|\sigma^2=4)<p(t_1,t_2\cdots,t_n|\sigma^2=1)$$
which is the condition that
$$\frac{1}{2^n(2\pi)^{n/2}}\exp\left(-\frac{1}{8}\sum_{i=1}^nt_i^2\right)<\frac{1}{(2\pi)^{n/2}}\exp\left(-\frac{1}{2}\sum_{i=1}^nt_i^2\right)$$
Taking the natural logarithm of both sides and simplifying yields the following inequality
$$\frac{3}{8n\log 2}\sum_{i=1}^nt_i^2<1$$
Thus, we have $c_1=\frac{3}{8n\log 2}$ and $c_2=0$.
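The derived threshold can be sanity-checked numerically: the following sketch (with synthetic data of my own choosing) verifies that the closed-form rule $c_1\sum_i t_i^2 < 1$ agrees with directly comparing the two likelihoods.

```python
import numpy as np

# Check (my own sketch): the closed-form rule
#   decide sigma^2 = 1  iff  (3 / (8 n ln 2)) * sum(t_i^2) < 1
# should agree with a direct comparison of the two likelihoods.
rng = np.random.default_rng(2)

def log_likelihood(t, var):
    n = len(t)
    return -0.5 * n * np.log(2 * np.pi * var) - np.sum(t ** 2) / (2 * var)

for true_var in (1.0, 4.0):
    t = rng.normal(0.0, np.sqrt(true_var), size=50)
    n = len(t)
    threshold_rule = (3 / (8 * n * np.log(2))) * np.sum(t ** 2) < 1
    direct_rule = log_likelihood(t, 1.0) > log_likelihood(t, 4.0)
    assert threshold_rule == direct_rule
    print(true_var, "-> decide sigma^2 = 1:", threshold_rule)
```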
Let us denote by $A_F, A_S$ a failure/success by Anne and by $B_F, B_S$ a failure/success by Betty.
If infinitely many attempts are allowed and Anne starts first, the event that Anne clears the measure before Betty is the disjoint union of the events $$ A_S,\quad A_F B_F A_S,\quad A_F B_F A_F B_F A_S,\quad \ldots $$ whose total probability is the geometric series $$ \frac{1}{3}+\frac{2}{3}\cdot\frac{3}{4}\cdot\frac{1}{3}+\ldots = \frac{1}{3}\sum_{k\geq 0}\left(\frac{1}{2}\right)^k = \frac{2}{3}. $$ I leave it to you to work out the case in which Betty starts first, by a similar argument.
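If you want to convince yourself numerically, here is a short Monte Carlo sketch of the Anne-starts-first case; the success probabilities $1/3$ for Anne and $1/4$ for Betty are read off the series above.

```python
import random

# Monte Carlo check of the 2/3 result; success probabilities 1/3 (Anne)
# and 1/4 (Betty) are taken from the geometric series above.
random.seed(0)

def anne_wins_first():
    """Alternate attempts, Anne first; True if Anne succeeds first."""
    while True:
        if random.random() < 1 / 3:   # Anne's attempt
            return True
        if random.random() < 1 / 4:   # Betty's attempt
            return False

trials = 200_000
wins = sum(anne_wins_first() for _ in range(trials))
print(wins / trials)  # close to 2/3 ≈ 0.6667
```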