The set of real numbers with standard addition and scalar multiplication can be expressed as $(\Bbb{R},+,\times). $
This can also be viewed as a subspace of $\Bbb{R}^n$, namely $\Bbb{R}^1$. Thus, if we prove the ten axioms for $\Bbb{R}^n$, then we only need to show that $\Bbb{R}^1$ is closed under addition and scalar multiplication; this is the advantage of working with subspaces.
Let $\mathbf{u,v,w} \in \Bbb{R}^n$ be given by $\mathbf{u}=(u_1,\dots,u_n)$, $\mathbf{v}=(v_1,\dots,v_n)$, and $\mathbf{w}=(w_1,\dots,w_n)$. Now,
(1) $\mathbf{u+v}=(u_1+v_1,...,u_n+v_n)$.
(2) $k\cdot \mathbf{u} =(ku_1,...,ku_n)$.
These two new vectors are clearly $n$-tuples of real numbers, i.e., elements of $\Bbb{R}^n$. Thus $\Bbb{R}^n$ is closed under addition and scalar multiplication. Now, to prove $\mathbf{u+(v+w)=(u+v)+w}$ we perform the following:
$$\mathbf{u+(v+w)}=(u_1,...,u_n)+((v_1,...,v_n)+(w_1,...,w_n))$$
$$=(u_1+(v_1+w_1),\dots,u_n+(v_n+w_n))$$
$$=((u_1+v_1)+w_1,\dots,(u_n+v_n)+w_n)$$
$$=\mathbf{(u+v)+w}.$$
The rest can be proven in a similar manner. Once the ten axioms for $\Bbb{R}^n$ are proven, to show that $\Bbb{R}^1$ is a subspace of $\Bbb{R}^n$, and hence a vector space itself, you just need to prove closure under addition and scalar multiplication, which was done above for $\Bbb{R}^n$.
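The componentwise argument above can also be sanity-checked numerically. Here is a minimal Python sketch (the helper names `vadd` and `smul` are my own, not from the question); note that the associativity check is exact for integers, while floating-point addition is only approximately associative:

```python
# Componentwise operations on R^n, with vectors modeled as tuples.
def vadd(u, v):
    """Vector addition in R^n: (u_1 + v_1, ..., u_n + v_n)."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def smul(k, u):
    """Scalar multiplication in R^n: (k*u_1, ..., k*u_n)."""
    return tuple(k * ui for ui in u)

u, v, w = (1, 2, 3), (4, 5, 6), (7, 8, 9)

# Closure: both results are again n-tuples of numbers.
print(vadd(u, v))   # (5, 7, 9)
print(smul(2, u))   # (2, 4, 6)

# Associativity: u + (v + w) == (u + v) + w.
assert vadd(u, vadd(v, w)) == vadd(vadd(u, v), w)
```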
I think (although I might be wrong) that you're confused about what the word "additive" refers to. In the context of this question, "additive" refers not to the usual addition of vectors, but to the operation defined in this question. In a vector space $V$, the definition of the inverse element (which, by the way, can be given only after the identity element has been defined) states:
For an element $v\in V$, its inverse element is the element $v'\in V$ such that $v+v'=v'+v=\mathrm{Id}$.
In this example we've already established that $(1,1)\in\mathbb{R}^2$ would have to be the identity element. But then for any $v=(x,y)\in\mathbb{R}^2$ we have
$$(0,0)+(x,y)=(0x,0y)=(0,0)\neq(1,1)=\mathrm{Id}.$$
This computation shows two things for us:
- $(0,0)$ does not have an additive inverse with respect to this operation because adding any element to it never results in the identity element;
- The claim that "$(0,0)$ in this question is the additive inverse of all vectors in the set $\mathbb{R}^2$" is not true for the same reason.
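Both points can be seen concretely in code. This sketch assumes, based on the computation above, that the question's "addition" on $\mathbb{R}^2$ is componentwise multiplication, $(a,b)+(c,d)=(ac,bd)$:

```python
# Assumed operation from the question: (a,b) + (c,d) = (a*c, b*d).
def oplus(p, q):
    return (p[0] * q[0], p[1] * q[1])

identity = (1.0, 1.0)  # (1,1) acts as the identity under this operation
assert oplus(identity, (3.0, 4.0)) == (3.0, 4.0)

# (0,0) absorbs every vector, so no q can satisfy (0,0) + q = (1,1):
assert oplus((0.0, 0.0), (3.0, 4.0)) == (0.0, 0.0)
```

Since any product with $(0,0)$ is again $(0,0)$, the identity $(1,1)$ is never reached, which is exactly why $(0,0)$ has no inverse here.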
EDIT. I forgot to answer your last question. The word "unique" means that there's only one such element: if $v'$ and $v''$ are two additive inverses of the same element $v\in V$, then necessarily $v'=v''$. This uniqueness property is not required in the axioms of a vector space, but it's easily deduced from them, so the bottom line is that it always holds in a vector space. I don't feel like this property is directly related to your task here, since you're asked to verify the axioms, not other properties of vector spaces.
Best Answer
Choose another notation: $x \oplus y := xy$ and $c \otimes x := x^c$. Then the exponential map gives an isomorphism of structures $(\mathbb{R},+,\cdot) \cong (\mathbb{R}^+,\oplus,\otimes)$. Since the former is a vector space, so is the latter. And this is how the creator of this "exercise" presumably came up with this artificial vector space (wanting you to spend your time on computations ...).
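The two homomorphism identities behind this isomorphism, $e^{x+y}=e^x\oplus e^y$ and $e^{cx}=c\otimes e^x$, can be verified numerically (up to floating-point tolerance):

```python
import math

# Operations on R^+ from the answer: x (+) y = x*y and c (x) x = x**c.
def oplus(x, y):
    return x * y

def otimes(c, x):
    return x ** c

x, y, c = 0.3, 1.7, 2.5

# exp carries + to (+):  exp(x + y) = exp(x) * exp(y)
assert math.isclose(math.exp(x + y), oplus(math.exp(x), math.exp(y)))

# exp carries scalar multiplication to (x):  exp(c*x) = exp(x)**c
assert math.isclose(math.exp(c * x), otimes(c, math.exp(x)))
```

Because the vector-space axioms are preserved under such a bijection, checking them for $(\mathbb{R}^+,\oplus,\otimes)$ reduces to the familiar axioms for $(\mathbb{R},+,\cdot)$.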