For any commutative ring $R,$ there is a *unique* homomorphism $f : \Bbb Z\to R,$ as specifying that $1\mapsto 1$ tells you where each element of $\Bbb Z$ must map to. Concretely, for any positive $n\in\Bbb Z,$ you have $n = 1 + \dots + 1$ ($n$ times), so that $$f(n) = f(1 + \dots + 1) = f(1) + \dots + f(1) = n\cdot f(1).$$
You also know that $f(0) = 0,$ and if $n\in\Bbb Z$ is negative, then $f(n) = f(-(-n)) = -f(-n).$ Thus, there is exactly one morphism $f : \Bbb Z\to R,$ because the image of any element is determined by where $1$ is sent and the ring homomorphism rules.
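This determination of $f$ by $f(1)$ can be sanity-checked computationally. Below is a minimal Python sketch, where the target ring $R = \Bbb Z/6\Bbb Z$ is just an illustrative choice (any modulus would do): the map is built using only $f(1) = 1,$ additivity, and $f(-n) = -f(n).$

```python
# Sketch: the unique ring homomorphism Z -> Z/6Z, built only from f(1) = 1.
# Z/6Z is an illustrative choice of target ring R, not forced by anything.
MODULUS = 6

def f(n: int) -> int:
    """Send n to 1 + 1 + ... + 1 (n times) in Z/6Z; negatives via f(-n) = -f(n)."""
    one = 1 % MODULUS              # the multiplicative identity of R
    if n == 0:
        return 0
    if n < 0:
        return (-f(-n)) % MODULUS
    return sum(one for _ in range(n)) % MODULUS

# The resulting map automatically respects addition and multiplication:
assert f(4 + 5) == (f(4) + f(5)) % MODULUS
assert f(4 * 5) == (f(4) * f(5)) % MODULUS
assert f(-3) == (-f(3)) % MODULUS
```

Of course this is just reduction mod $6$; the point is that nothing beyond $1\mapsto 1$ and the homomorphism rules was used to define it.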

Now, let's examine what it takes to define a morphism $\Bbb Z[x]\to R.$ We already know that we don't have a choice for where $\Bbb Z\subseteq\Bbb Z[x]$ is sent. What about $x$? Well, it turns out we can send $x$ to *any element of $R$ that we want.*

Suppose that $r\in R.$ If $f : \Bbb Z[x]\to R$ is a ring homomorphism sending $x$ to $r,$ then the ring homomorphism properties imply that we must have
\begin{align*}
f\left(\sum_{i = 0}^n a_i x^i\right) &= \sum_{i = 0}^nf\left( a_i x^i\right)\\
&= \sum_{i = 0}^n f(a_i) f(x^i)\\
&= \sum_{i = 0}^n f(a_i) f(x)^i\\
&= \sum_{i = 0}^n f(a_i) r^i.
\end{align*}
Every element of $\Bbb Z[x]$ is of the form $\sum_{i = 0}^n a_i x^i$ for some $n$ and some collection of integers $a_i,$ so we see that specifying where $x$ is sent determines the entire homomorphism. In particular, it is given by
$$
p(x) = \sum_{i = 0}^n a_i x^i \mapsto \sum_{i = 0}^n f(a_i) r^i = p(r).
$$
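The evaluation map $p(x)\mapsto p(r)$ is easy to realize concretely. Here is a small Python sketch, with $R = \Bbb Z/7\Bbb Z$ and $r = 4$ as illustrative choices, representing a polynomial $\sum a_i x^i$ by its coefficient list $[a_0, a_1, \dots]$:

```python
# Sketch: the evaluation map Z[x] -> Z/7Z determined by the choice x -> r.
# The target ring Z/7Z and the value r = 4 are illustrative choices.
MOD = 7

def evaluate(coeffs, r):
    """f(p) = sum f(a_i) r^i = p(r), computed in Z/7Z."""
    return sum(a * pow(r, i, MOD) for i, a in enumerate(coeffs)) % MOD

# p(x) = 2 + 3x + x^2, sent to p(4):
p = [2, 3, 1]
print(evaluate(p, 4))   # 2 + 12 + 16 = 30, which is 2 mod 7
```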

Conversely, setting $f(x) = r$ and extending in the above way is always a ring homomorphism. That is, let $r\in R$ and define
\begin{align*}
f : \Bbb Z[x]&\to R\\
\sum_{i = 0}^n a_i x^i &\mapsto \sum_{i = 0}^n g(a_i) r^i,
\end{align*}
where $g : \Bbb Z\to R$ is the unique ring homomorphism from paragraph one. We still have $f(1) = g(1) = 1$ and for any $n,m\in\Bbb Z\subseteq\Bbb Z[x],$ we have $$f(n + m) = g(n + m) = g(n) + g(m) = f(n) + f(m).$$ Suppose we have two arbitrary polynomials $\sum_{i = 0}^n a_i x^i,$ $\sum_{i = 0}^m b_i x^i.$ Then without loss of generality $m\leq n,$ and we can say that $\sum_{i = 0}^m b_i x^i = \sum_{i = 0}^n b_i x^i,$ where we define $b_j = 0$ for $j > m.$ Then we have
\begin{align*}
f\left(\sum_{i = 0}^n a_i x^i + \sum_{i = 0}^n b_i x^i\right) &= f\left(\sum_{i = 0}^n (a_i + b_i) x^i\right)\\
&= \sum_{i = 0}^n g(a_i + b_i)r^i\qquad\textrm{(by definition)}\\
&= \sum_{i = 0}^n \left(g(a_i) + g(b_i)\right)r^i\\
&= \sum_{i = 0}^n (g(a_i)r^i + g(b_i)r^i)\\
&= \sum_{i = 0}^n g(a_i)r^i + \sum_{i = 0}^n g(b_i)r^i\\
&= f\left(\sum_{i = 0}^n a_i x^i\right) + f\left(\sum_{i = 0}^n b_i x^i\right).
\end{align*}
You can check similarly that $$f\left(\left(\sum_{i = 0}^n a_i x^i\right)\cdot\left(\sum_{i = 0}^n b_i x^i\right)\right) = f\left(\sum_{i = 0}^n a_i x^i\right)\cdot f\left(\sum_{i = 0}^n b_i x^i\right),$$
so that the map defined is indeed a ring homomorphism.
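The additivity computation above, and the analogous multiplicativity check, can be verified numerically for particular polynomials. The following Python sketch does exactly that, with an illustrative target ring $R = \Bbb Z/11\Bbb Z$ and arbitrary sample inputs:

```python
# Sketch: checking numerically that evaluation x -> r preserves + and *.
# Polynomials over Z are coefficient lists; R = Z/11Z is an illustrative choice.
MOD = 11

def f(coeffs, r):
    """The evaluation map: sum a_i x^i  ->  sum a_i r^i in Z/11Z."""
    return sum(a * pow(r, i, MOD) for i, a in enumerate(coeffs)) % MOD

def add(p, q):
    """Coefficient-wise sum, padding the shorter list with zeros."""
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
            for i in range(n)]

def mul(p, q):
    """Polynomial product via the convolution of coefficient lists."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

p, q, r = [1, -2, 3], [5, 0, 7], 6   # arbitrary sample inputs
assert f(add(p, q), r) == (f(p, r) + f(q, r)) % MOD
assert f(mul(p, q), r) == (f(p, r) * f(q, r)) % MOD
```

This is no substitute for the proof, but it makes the homomorphism property tangible.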

The case of two variables is similar - a ring homomorphism is completely determined by where each variable is sent, and any choice of $r,s\in R$ gives a ring homomorphism with $x\mapsto r$ and $y\mapsto s.$

What's happening is that $\Bbb Z[x,y]$ is the free commutative ring on two generators $x$ and $y,$ which means essentially what I stated above - a ring homomorphism from $\Bbb Z[x,y]$ is given by a choice of where $x$ and $y$ will be sent, and any choices will work. To make sure that this defines a map on the quotient you want, you need to further specify that the images of $x$ and $y$ satisfy the given relation - i.e., because $x^3 + y^2 - 1 = 0$ in $\Bbb Z[x,y]/(x^3 + y^2 - 1),$ we must have $f(x)^3 + f(y)^2 - 1 = 0$ as well.
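In other words, the homomorphisms out of the quotient correspond exactly to the solutions of the relation in $R.$ A short Python sketch, taking $R = \Bbb Z/5\Bbb Z$ as an illustrative target, enumerates the admissible choices:

```python
# Sketch: which choices (r, s) in R = Z/5Z give a well-defined homomorphism
# out of Z[x,y]/(x^3 + y^2 - 1)?  Exactly those with r^3 + s^2 - 1 = 0 in R.
MOD = 5   # illustrative choice of target ring

solutions = [(r, s) for r in range(MOD) for s in range(MOD)
             if (r**3 + s**2 - 1) % MOD == 0]
print(solutions)   # e.g. (1, 0) qualifies, since 1 + 0 - 1 = 0
```

Each pair printed determines one homomorphism $\Bbb Z[x,y]/(x^3 + y^2 - 1)\to\Bbb Z/5\Bbb Z,$ and these are all of them.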

In 5, the most important part is to check that addition and multiplication are well-defined on $R / I$ - that if $a + I = c + I$ and $b + I = d + I$, then $(a + I) + (b + I) = (c + I) + (d + I)$ and $(a + I) \cdot (b + I) = (c + I) \cdot (d + I)$.

For addition: if $x \in (a + I) + (b + I)$ then $x = a + u + b + v$ for some $u,v \in I$. If $a + I = c + I$ and $b + I = d + I$ then $a + p = c + q$ for some $p, q \in I$, and so $a = c + \alpha$ for some $\alpha \in I$; analogously $b = d + \beta$. Then $x = c + (\alpha + u) + d + (\beta + v)$. As $I$ is closed under addition, $\alpha + u \in I$ and $\beta + v \in I$, so $x \in (c + I) + (d + I)$. Can you give a similar proof for multiplication? (This is where you need $I$ to be an ideal and not just a subring.)
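The abstract argument can also be sanity-checked numerically in a concrete case, say $R = \Bbb Z$ and $I = 6\Bbb Z$ (an illustrative choice): replacing representatives by others in the same coset does not change the sum or product coset.

```python
# Sketch: well-definedness of + and * on R/I for R = Z, I = 6Z (illustrative).
I = 6
a, c = 2, 2 + 5 * I    # a + I = c + I: same coset, different representatives
b, d = 3, 3 - 4 * I    # b + I = d + I

assert (a + b) % I == (c + d) % I   # addition is well-defined
assert (a * b) % I == (c * d) % I   # multiplication is well-defined
```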

The other parts are correct, but can be shortened. I think it is also worth noting that you don't need to check separately that an ideal is a subring: any subset $I$ that is an additive subgroup such that $IR\subseteq I$ and $RI \subseteq I$ is automatically closed under multiplication, hence a subring.

You also don't actually need unity in $R$ in 3 and 4. For example, in 4, to show $RI \subseteq I$ we can simply use $R\cdot I = R\cdot (\cap I_j)$ (definition), $R\cdot (\cap I_j) \subseteq \cap (R \cdot I_j)$ (the general property $f(X, \cap Y_i) \subseteq \cap f(X, Y_i)$), and $\cap (R \cdot I_j) \subseteq \cap I_j$ (all $I_j$ are ideals).

## Best Answer

For the first property, use induction to show that $\phi(na) = n\phi(a)$ for all $a$ and $n\geq 0$. This is actually what you have done.

For negative $n$ one must be careful. For $n>0$ put $(-n)a = -(na)$, which is the additive inverse of $na$. This part is then true simply by definition.