Consider the quotient map $q : (Y \times I, Y \times \partial I) \to (Y \times S^1, Y \times s_0)$ induced by the equivalence relation $\sim$ on $Y \times I$ that identifies $(y, 0) \sim (y, 1)$. One can prove that the induced map $q_*$ on cohomology is an isomorphism as follows.
$$\require{AMScd}
\begin{CD}
H^k(Y \times I, Y \times \partial I) @<{q_*}<< H^k(Y \times S^1, Y \times s_0)\\
@A{\cong}AA @A{\cong}AA \\
H^k(Y \times I/Y \times \partial I, pt) @<{\cong}<< H^k(Y \times S^1/Y \times s_0, pt)
\end{CD}$$
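For concreteness, the homeomorphism inducing the bottom map can be written down; here $S^1$ is identified with the unit circle in $\mathbb{C}$ (a choice made just for this sketch):

```latex
% q descends to a continuous bijection on the quotients:
\[
  \bar q \colon (Y \times I)/(Y \times \partial I)
  \;\longrightarrow\;
  (Y \times S^1)/(Y \times s_0),
  \qquad
  \bar q\,[y, t] = [y, e^{2\pi i t}].
\]
% Since q is a quotient map, so is \bar q, and a bijective
% quotient map is a homeomorphism.
```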
The vertical maps are isomorphisms because $(Y \times I, Y \times \partial I)$ and $(Y \times S^1, Y \times s_0)$ are both good pairs, as are $(I, \partial I)$ and $(S^1, pt)$. The bottom horizontal map is induced by the obvious homeomorphism between $Y \times I/Y \times \partial I$ and $Y \times S^1/Y \times s_0$, hence is also an isomorphism. The diagram commutes, so $q_*$ is an isomorphism as well. By naturality of the cross product, we have the following commutative diagram:
$$\require{AMScd}
\begin{CD}
H^n(Y; R) @>{\times \alpha}>> H^{n+1}(Y \times I, Y \times \partial I; R)\\
@A\text{id}AA @A{q_*}AA \\
H^n(Y; R) @>{\times \alpha'}>> H^{n+1}(Y \times S^1, Y \times s_0; R)
\end{CD}$$
where $\alpha$ is the generator of $H^1(I, \partial I; R)$ and $\alpha'$ is the generator of $H^1(S^1, s_0; R)$. As the top map and the two vertical maps are isomorphisms, so is the bottom map.
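As a quick sanity check (my addition, not part of the argument), take $Y$ to be a point: the bottom map then reads

```latex
\[
  H^n(\mathrm{pt}; R) \xrightarrow{\ \times \alpha'\ } H^{n+1}(S^1, s_0; R),
\]
% which for n = 0 is R -> H^1(S^1, s_0; R) = R, and is 0 -> 0 for n > 0,
% matching H^k(S^1, s_0; R) = R for k = 1 and 0 otherwise.
```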
For the second statement, note that we have the following split short exact sequence
$$0 \to H^{n+1}(Y \times S^1, Y \times \{s_0\}; R) \stackrel{\pi^*}{\to} H^{n+1}(Y \times S^1; R) \to H^{n+1}(Y \times \{s_0\}; R) \to 0$$
obtained from the long exact sequence of the pair $(Y \times S^1, Y \times \{s_0\})$. It is split by the map $r^*$ induced by the retraction $r : Y \times S^1 \to Y$ (projection onto the first factor), which is a section of the restriction map.
By the splitting lemma, $H^{n+1}(Y \times S^1, Y \times s_0; R) \times H^{n+1}(Y \times s_0; R) \cong H^{n+1}(Y \times S^1; R)$, where the isomorphism is given by $(\beta, \beta') \mapsto \pi^*(\beta) + r^*(\beta')$. Combining this with the previous isomorphism, we get an isomorphism $$H^n(Y; R) \times H^{n+1}(Y; R) \to H^{n+1}(Y \times S^1; R)$$ given by $(\beta_1, \beta_2) \mapsto \pi^*(\beta_1 \times \alpha') + r^*(\beta_2) = \beta_1 \times \alpha' + \beta_2 \times 1$. Here we may drop $\pi^*$ because it is induced by an actual inclusion of pairs, so it carries the relative cross product $\beta_1 \times \alpha'$ to the absolute one; and $r^*(\beta_2) = \beta_2 \times 1$ since the retraction $r$ is nothing but projection onto the first factor.
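Rewriting the split as a direct sum (a reformulation of the above, with the torus as a test case):

```latex
\[
  H^{n+1}(Y \times S^1; R) \;\cong\; H^{n}(Y; R) \oplus H^{n+1}(Y; R).
\]
% Sanity check with Y = S^1, so that Y x S^1 is the torus T:
%   H^0(T; R) = R,   H^1(T; R) = R \oplus R,   H^2(T; R) = R,
% as expected.
```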
Note that $T = S^1\times S^1$, and the Künneth formula gives you an algebra isomorphism
$$H^\bullet (T) \cong H^\bullet (S^1) \otimes H^\bullet (S^1).$$
(In general you should be careful about that, but in your case everything is fine, since each $H^n(S^1)$ is finitely generated and free.) See e.g. Chapter 3 of Hatcher, the section "A Künneth Formula", for details.
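For reference, the hypothesis being used is the one in Hatcher's Theorem 3.16 (paraphrased from the section cited above):

```latex
% The cross product
\[
  H^\bullet(X; R) \otimes_R H^\bullet(Y; R)
  \;\longrightarrow\;
  H^\bullet(X \times Y; R)
\]
% is a ring isomorphism when X and Y are CW complexes and each H^k(Y; R)
% is a finitely generated free R-module. Here Y = S^1 with
% H^0(S^1; R) = H^1(S^1; R) = R and H^k(S^1; R) = 0 for k > 1.
```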
Now calculate the cohomology ring of the circle.
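Carrying out that calculation (a sketch; the generator names $\alpha, \beta$ are my choice):

```latex
% H^\bullet(S^1) is an exterior algebra \Lambda[a] on one degree-1
% generator, since a^2 lives in H^2(S^1) = 0. Setting \alpha = a \otimes 1
% and \beta = 1 \otimes a, the Künneth isomorphism gives
\[
  H^\bullet(T) \;\cong\; \Lambda[\alpha, \beta], \qquad
  \alpha^2 = \beta^2 = 0, \quad \alpha \cup \beta = -\,\beta \cup \alpha,
\]
% with H^2(T) freely generated by \alpha \cup \beta.
```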
Best Answer
That looks right to me (up to a sign convention). The way I checked it is to use Poincaré duality, which relates the cup product to the signed intersection number: look at the vertex $v \in X$ that results from gluing the eight corners of the octagon, then look at the four oriented loops $L_a, L_b, L_c, L_d \subset X$ that pass through $v$ and come from gluing each of the four side pairs $a, b, c, d$, and then check the right-hand rule. It looks to me like you are using a sign convention where the right-hand rule yields coefficient $-1$ and the left-hand rule yields coefficient $+1$. So, for example, the pair $L_a$, $L_b$ passes through $v$ obeying the right-hand rule, which gives coefficient $-1$, hence $[f_a] \cup [f_b] = [-f_T]$ (I'm using square brackets to denote cohomology classes of cocycles).
Regarding what may or may not look weird, I should add that different gluing patterns are going to give different matrices for the cup product. Your gluing pattern gives this matrix $$\begin{pmatrix} 0 & -1 & -1 & -1 \\ 1 & 0 & -1 & -1 \\ 1 & 1 & 0 & -1 \\ 1 & 1 & 1 & 0 \end{pmatrix} $$ whereas the diagram on the right of the post in your question will give this matrix (which perhaps looks less weird): $$\begin{pmatrix} 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 0 & 0 & 1 & 0 \end{pmatrix} $$
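As a consistency check (my addition): both matrices represent the cup-product pairing $H^1(X) \times H^1(X) \to H^2(X) \cong \mathbb{Z}$ on a closed oriented genus-2 surface, so each must be skew-symmetric with determinant $1$, which both are; over $\mathbb{Z}$ any such form is equivalent to the standard symplectic form

```latex
\[
  \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}
  \oplus
  \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},
\]
% so the two gluing patterns give isomorphic cup-product structures on H^1;
% only the chosen basis of cocycles differs.
```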