With $A = \delta_1 s$ and $B = \delta_2 s$ for a single standard normal variable $s$ (and $\delta_1, \delta_2 > 0$), the joint distribution of $A \sim N(0,\delta_1^2)$ and $B \sim N(0,\delta_2^2)$ is degenerate: all of the mass lies along a straight line through the origin instead of being spread over the plane, so no joint density exists. Problems involving $A$ and $B$ are best solved in terms of $s$ alone. For example, $$\begin{align}P\{A\leq a, B\leq b\}=F_{A,B}(a,b)&=P\left\{s\leq\frac{a}{\delta_1},s\leq\frac{b}{\delta_2}\right\}\\&=P\left\{s\leq\min\left(\frac{a}{\delta_1},\frac{b}{\delta_2}\right)\right\}\\&=\Phi\left(\min\left(\frac{a}{\delta_1},\frac{b}{\delta_2}\right)\right).\end{align}$$
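As a sanity check, the identity above can be verified by simulation. This is a minimal sketch; the particular values of $\delta_1$, $\delta_2$, $a$, $b$ below are arbitrary choices for illustration, not from the text:

```python
import math
import random

def phi(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Arbitrary illustrative values (assumed, not from the original); d1, d2 > 0.
d1, d2 = 1.0, 2.0
a, b = 0.5, 0.4

random.seed(0)
n = 200_000
hits = 0
for _ in range(n):
    s = random.gauss(0.0, 1.0)
    # A = d1*s and B = d2*s, so every sample (A, B) lies on one line.
    if d1 * s <= a and d2 * s <= b:
        hits += 1

empirical = hits / n
exact = phi(min(a / d1, b / d2))
print(round(empirical, 3), round(exact, 3))  # the two should agree closely
```

With these values the exact answer is $\Phi(\min(0.5,\,0.2)) = \Phi(0.2) \approx 0.579$, and the empirical frequency matches it to within simulation error.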
Is the distribution of $(X_1,X_2)$ determined by the distributions of $X_1$, $X_2$ and $X_1+X_2$?
In the discrete case, a simple count of parameters (a "dimensional analysis") shows that the answer is "No" in general.
Assume that $(X_1,X_2)$ takes values in $\{0,1,\ldots,n\}\times\{0,1,\ldots,m\}$; then $X_1+X_2$ takes values in $\{0,1,\ldots,n+m\}$. Since a distribution on $k+1$ points has $k$ free parameters, the distributions of $X_1$, $X_2$, $X_1+X_2$ and $(X_1,X_2)$ depend on $n$, $m$, $n+m$ and $nm+n+m$ parameters respectively. Hence the first three distributions cannot determine the last one when $nm+n+m\gt n+m+(n+m)$, that is, when $(n-1)(m-1)\geqslant2$, for example if $\{n,m\}=\{2,3\}$.
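The parameter count can be sketched in a few lines (the function name is mine, for illustration only):

```python
def params(n, m):
    """Free parameters: a distribution on k+1 points has k of them.
    Returns (count for X1, X2 and X1+X2 together, count for the joint law)."""
    known = n + m + (n + m)           # X1, X2, and X1+X2
    joint = (n + 1) * (m + 1) - 1     # (X1, X2) on an (n+1)x(m+1) grid
    return known, joint

# The joint law outruns the three one-dimensional laws once (n-1)(m-1) >= 2:
print(params(1, 1))  # (4, 3): joint still determined, no counterexample here
print(params(2, 3))  # (10, 11): joint has more parameters than are pinned down
```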
A fortiori, the answer is also "No" when the distributions of $X_1$, $X_2$ and $X_1+X_2$ are absolutely continuous. For example, let $X_1$ and $X_2$ each be Cauchy with scale parameter $a$ and let $X_1+X_2$ be Cauchy with scale parameter $2a$. These three distributions are compatible both with $X_1$ and $X_2$ independent and with $X_1=X_2$ with probability $1$, yet the two joint distributions are different.
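Both dependence structures can be compared by simulation: in each case the empirical CDF of $X_1+X_2$ at a point should match the Cauchy CDF with scale $2a$, namely $\tfrac12 + \tfrac{1}{\pi}\arctan\!\big(\tfrac{x}{2a}\big)$. A rough sketch, with $a$ and the evaluation point chosen arbitrarily:

```python
import math
import random

def cauchy(a):
    """Sample from a Cauchy distribution with scale a (location 0),
    by the inverse-CDF method."""
    return a * math.tan(math.pi * (random.random() - 0.5))

random.seed(1)
a, x, n = 1.0, 1.5, 200_000  # arbitrary illustrative values

# Case 1: X1 and X2 independent Cauchy(a).
indep = sum(cauchy(a) + cauchy(a) <= x for _ in range(n)) / n
# Case 2: X1 = X2 with probability 1, so X1 + X2 = 2*X1.
equal = sum(2 * cauchy(a) <= x for _ in range(n)) / n
# Either way the sum is Cauchy(2a), whose CDF at x is:
exact = 0.5 + math.atan(x / (2 * a)) / math.pi

print(round(indep, 3), round(equal, 3), round(exact, 3))
```

The two empirical frequencies agree with each other and with the exact value, even though the joint laws in the two cases are entirely different.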
Best Answer
Based on your most recent comment, I think you should consider a 2-state Markov chain to produce a sequence of random variables $X_i,$ taking values in $\{0, 1\},$ roughly as follows:
Start with a deterministic or random $X_1.$ Then
(i) $P\{X_{i+1} = 1|X_i = 0\} = \alpha,$ and (ii) $P\{X_{i+1} = 0|X_i = 1\} = \beta.$
The parameters $\alpha$ and $\beta$ are the respective probabilities of 'changing state' from one $X_i$ to the next. To avoid certain kinds of deterministic sequences, you may want to use $0 < \alpha, \beta < 1.$ If $\alpha = 1 - \beta,$ then the distribution of $X_{i+1}$ does not depend on the value of $X_i,$ so the sequence is independent.
By induction, one can show that $$P\{X_{1+r} = 0|X_1 = 0\} = \frac{\beta}{\alpha+\beta} + \frac{\alpha(1-\alpha - \beta)^r}{\alpha+\beta}.$$ If $|1-\alpha - \beta| < 1$, then in the long run $P\{X_n = 0\} \approx \beta/(\alpha+\beta),$ regardless of the value of $X_1.$
Moreover, there are similar formulas for the '$r$-step transitions' from 0 to 1, 1 to 0, and 1 to 1. Of course, I am skipping over a lot of detail here.
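For instance, the closed form for $P\{X_{1+r} = 0 \mid X_1 = 0\}$ above can be checked against a direct simulation of the chain. A minimal sketch, with $\alpha$, $\beta$ and $r$ chosen arbitrarily:

```python
import random

def p00(r, alpha, beta):
    """Closed form for the r-step probability P{X_{1+r} = 0 | X_1 = 0}."""
    s = alpha + beta
    return beta / s + alpha * (1 - s) ** r / s

def run_chain(r, alpha, beta):
    """Run the two-state chain for r steps, starting from state 0."""
    x = 0
    for _ in range(r):
        if x == 0:
            x = 1 if random.random() < alpha else 0  # change state w.p. alpha
        else:
            x = 0 if random.random() < beta else 1   # change state w.p. beta
    return x

random.seed(2)
alpha, beta, r, n = 0.3, 0.2, 5, 100_000  # arbitrary illustrative values
empirical = sum(run_chain(r, alpha, beta) == 0 for _ in range(n)) / n
exact = p00(r, alpha, beta)
print(round(empirical, 3), round(exact, 3))  # should agree closely
```

Here $\beta/(\alpha+\beta) = 0.4$, and as $r$ grows `p00(r, alpha, beta)` converges to that long-run value, illustrating the limit mentioned above.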
Perhaps there is a rich enough variety of models here to satisfy your curiosity as to what happens when independence fails in this way.
Later chapters in many probability books give a complete development of the theory of 2-state Markov chains, and there are several good elementary books just on Markov chains. [Google '2-state Markov chain'. One reference among many is Chapter 6 of Suess and Trumbo (2010), Springer, in which I have a personal interest.]