Stating your OP generically:
1. Logistic regression without explanatory variables:
$\color{red}{\text{log}} \,\left[\color{blue}{\text{ODDS(p(Y=1))}}\right]=\color{red}{\text{log}}\left(\frac{\hat p\,(Y=1)}{1-\hat p\,(Y=1)}\right) = \hat\beta_0$

$\hat\beta_0$ is the estimated log odds.

This is an intercept-only model. Exponentiating, we get
$$\color{blue}{\text{ODDS(Y=1)}} = \frac{p\,(Y=1)}{1\,-\,p\,(Y=1)} = e^{\,\hat\beta_0}$$
$\color{blue}{\large e^{\hat\beta_0}}$ are the $\color{blue}{\text{ODDS}}$.
Translating into probabilities:
$\color{green}{\Pr(Y = 1)} = \frac{\color{blue}{\text{odds(Y=1)}}}{1\,+\,\color{blue}{\text{odds(Y=1)}}}=\frac{e^{\,\hat\beta_0 }}{1 \,+\,e^{\,\hat\beta_0}}$
This is the second calculation in the OP (i.e. the one containing $-1.12546$).
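Numerically, the conversion from the fitted intercept to odds and then to a probability can be sketched as follows (using the OP's intercept value $-1.12546$):

```python
import math

# Intercept from an intercept-only logistic regression
# (the value -1.12546 quoted from the OP's output)
beta_0 = -1.12546

odds = math.exp(beta_0)        # ODDS(Y=1) = e^{beta_0}
prob = odds / (1 + odds)       # Pr(Y=1) = odds / (1 + odds)

print(f"odds = {odds:.5f}, Pr(Y=1) = {prob:.5f}")
```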
2. Logistic regression with explanatory variable:
$\color{red}{\text{log}} \,\left[\color{blue}{\text{ODDS(p(Y=1))}}\right]=\color{red}{\text{log}}\left(\frac{\hat p\,(Y=1)}{1-\hat p\,(Y=1)}\right) = \hat\beta_0+\hat\beta_1x_1$
$\color{blue}{\text{ODDS(Y=1)}} = \frac{p\,(Y=1)}{1\,-\,p\,(Y=1)} = e^{\,\hat\beta_0+\hat\beta_1x_1} \tag{*}$
Introducing the odds ratio:
If instead of $x_1$ in $(*)$ we have $x_1+1$ (a one-unit increase):
$$\color{blue}{\text{ODDS(Y=1)}} = \frac{p\,(Y=1)}{1\,-\,p\,(Y=1)} = e^{\,\hat\beta_0+\hat\beta_1x_1+ \hat \beta_1}$$
and
$$\color{green}{\text{ODDS RATIO}} = \frac{\color{blue}{\text{odds|}x_1+1}} {\color{blue}{\text{odds|}x_1}}= \frac{e^{\,\hat\beta_0+\hat\beta_1x_1+ \hat \beta_1}}{e^{\,\hat\beta_0+\hat\beta_1x_1}}= e^{\hat\beta_1}$$
$\color{green}{\large e^{\hat\beta_1}}$ is the $\color{green}{\text{ODDS RATIO}}$.
This is the first calculation in the OP.
For every unit increase in $x_1$, the odds are multiplied by $e^{\hat\beta_1}$.
Hence,
$\color{red}{\log}\,[\color{green}{\text{ODDS RATIO}}] = \hat\beta_1$
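The cancellation in the odds ratio above can be checked numerically. This sketch uses made-up coefficient values (not from the OP's output) to show that the ratio of the odds at $x_1+1$ to the odds at $x_1$ equals $e^{\hat\beta_1}$ regardless of $x_1$:

```python
import math

# Hypothetical coefficients, for illustration only
beta_0, beta_1 = -1.5, 0.4
x1 = 2.0

odds_at_x = math.exp(beta_0 + beta_1 * x1)
odds_at_x_plus_1 = math.exp(beta_0 + beta_1 * (x1 + 1))

# The intercept and the beta_1 * x1 term cancel in the ratio
odds_ratio = odds_at_x_plus_1 / odds_at_x

print(odds_ratio, math.exp(beta_1))  # both equal e^{beta_1}
```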
Best Answer
I'm a little confused by your notion of coefficients interacting (I think you mean covariates, $x$'s). But I'll add some clarity around what it means to add coefficients.
We don't really say the "probability of $\beta_5 + \beta_6$", since these are coefficients. We can however ask about the change in log odds associated with a 1 unit increase in covariates $x_5$ and $x_6$, for example.
Let's say you have the following logistic regression model:
$$\text{logit} (P(Y=1|X=x)) = \alpha + \beta_1x_1 + \beta_2x_2 + \beta_3x_1x_2$$
You want to know whether you can add $\beta_2 + \beta_3$. To answer that, it's important to understand what the coefficients represent:

Suppose $X_2$ increases by 1 unit; the new log odds of $P(Y=1|X)$ are:
$$\text{logit} (P(Y=1|X_1=x_1, X_2=x_2+1)) = \alpha + \beta_1x_1 + \beta_2(x_2+1) + \beta_3x_1(x_2 +1)$$
So the change in log odds associated with a 1 unit increase in $X_2$ is:
$$\text{logit} (P(Y=1|X_1=x_1, X_2=x_2+1)) - \text{logit} (P(Y=1|X_1=x_1, X_2=x_2))$$
$$= \left[\alpha + \beta_1x_1 + \beta_2(x_2+1) + \beta_3x_1(x_2+1)\right] - \left[\alpha + \beta_1x_1 + \beta_2x_2 + \beta_3x_1x_2\right] = \beta_2 + \beta_3x_1$$
So you can add the coefficients if you know your $x_1$ doesn't change. Of course, if it never changed then you should be modeling it as part of the intercept, $\alpha$.
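The difference-of-logits calculation can be verified numerically. The coefficient values below are hypothetical, chosen only to illustrate that the change in log odds from a one-unit increase in $x_2$ equals $\beta_2 + \beta_3 x_1$:

```python
# Hypothetical coefficients for the interaction model (illustration only)
alpha, b1, b2, b3 = -0.5, 0.3, 0.8, -0.2
x1, x2 = 1.0, 2.0

logit_before = alpha + b1 * x1 + b2 * x2 + b3 * x1 * x2
logit_after = alpha + b1 * x1 + b2 * (x2 + 1) + b3 * x1 * (x2 + 1)

# Change in log odds is a difference of logits, not a ratio
change = logit_after - logit_before

print(change, b2 + b3 * x1)  # both equal beta_2 + beta_3 * x1
```

Note that the result depends on $x_1$ through the interaction term, which is why the coefficients can only be added for a fixed value of $x_1$.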