I assume the absorption law $a + ab = a$ is known (and true here).
Assuming this holds, let $x = a+(b+c)$ and $y = (a+b)+c$. We want to show that $x = y$, and following the hint we reduce to showing $x = xy = y$.
I claim that $ax = a, \ bx = b, \ cx = c$, and likewise for $y$. We check for $ax$:
$$ ax = aa + a(b+c) = a + a(b+c) = a$$
Likewise, for $bx$:
$$ bx = ba + b(b+c) = ba + (bb+bc) = ba + (b+bc) = ba + b = b$$
The remaining checks are analogous.
Using these identities, you can check that any term built from $a,b,c$ with $+$ and $.$ is unchanged when multiplied by $x$; in particular $yx = y$:
$$ yx = ((a+b)+c)x = (a+b)x+cx = (ax+bx)+cx = (a+b)+c = y$$
A symmetric argument (using $ay = a$, $by = b$, $cy = c$) gives $xy = x$; since multiplication is commutative, $y = yx = xy = x$, and the claim follows.
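As a quick sanity check (not a substitute for the axiomatic argument), the identities $ax=a$, $bx=b$, $cx=c$, and $yx=y$ can be verified by brute force over the two-element Boolean algebra, reading $+$ as OR and $.$ as AND; the helper names below are my own:

```python
from itertools import product

plus  = lambda u, v: u | v   # "+" as OR on {0, 1}
times = lambda u, v: u & v   # "." as AND on {0, 1}

for a, b, c in product([0, 1], repeat=3):
    x = plus(a, plus(b, c))   # a + (b + c)
    y = plus(plus(a, b), c)   # (a + b) + c
    # The claimed identities: ax = a, bx = b, cx = c, and yx = y.
    assert times(a, x) == a
    assert times(b, x) == b
    assert times(c, x) == c
    assert times(y, x) == y
    assert x == y             # the associativity being proved
print("all identities hold on {0,1}")
```

This only checks the two-element algebra, of course; the proof above is what makes the identities hold from the axioms alone.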
For products, you can use a similar trick. Let $x = a.(b.c)$ and $y = (a.b).c$. I claim that $ x = x + y = y$. To see this, first note that $x + a = a$ (because $x+a = a.(b.c) + a = a$ by absorption). Secondly, $x+b = b$, because
$$ x+b = a.(b.c) + b = a.(b.c) + a.b + a'.b = a.(b.c+b) + a'.b = a.b + a'.b = b$$
(I hope this is legit). Likewise, $x+c = c$. Finally, $x + y = y$, because:
$$ y = (a.b).c = ((a+x).(b+x)).(c+x) = (a.b + x).(c+x) = (a.b).c + x = y + x$$
(I used the identity $(u+t).(v+t) = u.v + t.u + t.v + t.t = u.v +t$). The proof that $y+x = x$ is symmetric.
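The product argument admits the same kind of brute-force sanity check over $\{0,1\}$ (again my own addition, not part of the proof), including the auxiliary identity $(u+t).(v+t) = u.v + t$:

```python
from itertools import product

OR  = lambda u, v: u | v    # "+"
AND = lambda u, v: u & v    # "."

for a, b, c in product([0, 1], repeat=3):
    x = AND(a, AND(b, c))   # a.(b.c)
    y = AND(AND(a, b), c)   # (a.b).c
    # x + a = a,  x + b = b,  x + c = c
    assert OR(x, a) == a and OR(x, b) == b and OR(x, c) == c
    # x + y = y and y + x = x, hence x = y
    assert OR(x, y) == y and OR(y, x) == x
    assert x == y

# The auxiliary identity (u+t).(v+t) = u.v + t
for u, v, t in product([0, 1], repeat=3):
    assert AND(OR(u, t), OR(v, t)) == OR(AND(u, v), t)
print("product identities hold on {0,1}")
```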
Generally speaking, associativity means that instead of computing the union over the main index set directly, we can break the main index set into a family of smaller index sets, compute the union over each smaller index set, and then take the union of all those partial unions.
Actually there is no inherent order in computing $\bigcup\limits_{k\in K} A_k$ or $\bigcup\limits_{j \in J}\bigg(\bigcup\limits_{i \in I_j} A_i\bigg)$.
The only constraint is that, in the second formula, you must first compute the inner unions. Let me explain what the formula says with an example.
Suppose $K=\{a, 1, *\}$
$A_a = \{p,q,r\}$
$A_1=\{q,r,s\}$
$A_*=\{s,t\} $
Then by definition $\bigcup\limits_{k\in K} A_k = \bigcup\{A_a,A_1,A_*\}$, and you get $\{p,q,r,s,t\}$ as the union of the collection $\{A_a,A_1,A_*\}$; obviously no order is involved. But you can compute the union another way, by breaking the set $K$ into a collection of sets:
Suppose $J=\{11, 12\}$
$I_{11} = \{a, 1\}$
$I_{12} = \{*,1\}$
So now $K = \bigcup_{j \in J} I_j=\bigcup\{I_{11},I_{12}\}=\{a,1,*\}$
and we can compute $\bigcup\limits_{j \in J}\bigg(\bigcup\limits_{i \in I_j} A_i\bigg) = \bigcup \bigg\{\bigcup\limits_{i \in I_{11}}A_i,\bigcup\limits_{i \in I_{12}}A_i\bigg\} = \bigcup\bigg\{\bigcup\{A_a,A_1\},\bigcup\{A_*,A_1\}\bigg\} = \bigcup\bigg\{\{p,q,r,s\},\{q,r,s,t\}\bigg\}=\{p,q,r,s,t\}$
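The whole computation above can be replayed with Python sets (the dictionary encoding of the families $A$ and $I$ is my own choice of representation):

```python
# The example sets, indexed by K = {a, 1, *}.
A = {'a': {'p', 'q', 'r'}, 1: {'q', 'r', 's'}, '*': {'s', 't'}}
K = set(A)

# Direct union over K.
direct = set().union(*(A[k] for k in K))

# Break K into the smaller index sets I_11 and I_12,
# union within each, then union the partial unions.
I = {11: {'a', 1}, 12: {'*', 1}}
partial = [set().union(*(A[i] for i in I[j])) for j in I]
grouped = set().union(*partial)

assert set().union(*(I[j] for j in I)) == K   # the I_j cover K
assert direct == grouped == {'p', 'q', 'r', 's', 't'}
print(sorted(direct))
```

Note that the $I_j$ are even allowed to overlap (both contain the index $1$), and the result is still the same.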
As for the question about commutativity, I think general commutativity means that in a family of sets, if we permute the function that associates a set to each index, the union is not altered. For example:
$I = \{0,1,2\}$ is the index set
$B_0=\{a,b\}$
$B_1=\{b,c\}$
$B_2=\{e,f\}$
Now $\bigcup\limits_{i\in I}B_i = \bigcup\bigg\{B_0,B_1,B_2\bigg\}$
If we change the index function, for example by swapping the sets assigned to $B_0$ and $B_1$, we get:
$B_0=\{b,c\}$
$B_1=\{a,b\}$
$B_2=\{e,f\}$
and the corresponding union does not change. Mathematically, if $\sigma$ is a permutation of the index set $I$, then $$\bigcup\limits_{i\in I} B_i=\bigcup\limits_{i\in I} B_{\sigma(i)}$$
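The permutation invariance can be illustrated in Python on the example family $B$ (the encoding is mine), checking every permutation $\sigma$ of the index set rather than just the one swap above:

```python
from itertools import permutations

I = [0, 1, 2]
B = {0: {'a', 'b'}, 1: {'b', 'c'}, 2: {'e', 'f'}}

base = set().union(*(B[i] for i in I))

# For every permutation sigma of I, the union of B[sigma(i)] is unchanged.
for perm in permutations(I):
    sigma = dict(zip(I, perm))
    assert set().union(*(B[sigma[i]] for i in I)) == base
print(sorted(base))
```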
I wrote the part about commutativity with help from question number 4 in the list of questions about this topic that I have studied on this website.
Finally, I think it only makes sense to talk about associativity and commutativity when we have a family, that is, an indexed collection. As Halmos says, every collection can be indexed: the index set can simply be the collection itself, with the identity function as the indexing. So formulating and proving associativity and commutativity is not irrelevant.
Let us calculate those terms:
\begin{align} a + L &= a + ( (a * b) * c )\\ &= (a + (a * b)) * (a + c)\tag{distributivity}\\ &= a * (a + c)\tag{absorption}\\ &= a.\tag{absorption} \end{align}
\begin{align} a + R &= a + ( a * (b*c) )\\ &= a. \tag{absorption} \end{align} Thus, $a+L=a+R$.
\begin{align} a' + R &= a'+ (a * (b*c))\\ &= (a' + a) * (a' + (b*c))\tag{distributivity}\\ &= 1 * ((a' + b) * (a' + c))\tag{distributivity, complement}\\ &= (a' + b) * (a' + c).\tag{identities} \end{align}
\begin{align} a' + L &= a' + ((a*b)*c)\\ &= (a' + (a*b)) * (a' + c)\tag{distributivity}\\ &= ((a' + a) * (a' + b))*(a' + c)\tag{distributivity}\\ &= ( 1 * (a' + b) ) * (a' + c)\tag{complement}\\ &= (a' + b) * (a' + c), \end{align} and so $a' + R = a' +L$.
Now given that $a+L=a+R$ and $a' + R = a' +L$,
\begin{align} L &= (a * a') + L \\ &= (a + L) * (a' + L) \\ &= (a + R) * (a' + R)\\ &= (a * a') + R\\ &= R, \end{align} where by now, I suppose you can justify the above equalities.
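The structure of this proof can also be traced numerically on the two-element Boolean algebra (a sketch of my own, with $*$ as AND, $+$ as OR, and $'$ as complement), checking both halves $a+L=a+R$ and $a'+L=a'+R$ before concluding $L=R$:

```python
from itertools import product

OR  = lambda u, v: u | v
AND = lambda u, v: u & v
NOT = lambda u: 1 - u

for a, b, c in product([0, 1], repeat=3):
    L = AND(AND(a, b), c)   # (a * b) * c
    R = AND(a, AND(b, c))   # a * (b * c)
    # The two halves of the argument:
    assert OR(a, L) == OR(a, R) == a
    assert OR(NOT(a), L) == OR(NOT(a), R)
    # ...which together force L = R:
    assert L == R
print("associativity verified on {0,1}")
```

Again, this is only a consistency check; the derivation above is what establishes the result from the Boolean algebra axioms.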