The property that distinguishes addition from multiplication is the distributive law. This is part of the convention of calling the operations addition and multiplication. If the distributive law wasn't there, we wouldn't call the operations by those names.
The fact that multiplication distributes over addition implies $0\cdot a = 0$ for all $a$, where $0$ is the additive identity; there is no analogous statement for addition. This is one property that falls out of imposing the distributive law.
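For completeness, the standard derivation is short (note it also uses the additive inverse of $0\cdot a$, in order to cancel):
$$0\cdot a = (0+0)\cdot a = 0\cdot a + 0\cdot a,$$
and adding the inverse of $0\cdot a$ to both sides leaves $0 = 0\cdot a$.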
Another consequence: if every element has an additive inverse, then addition must be commutative. The commutativity of addition can be derived from the distributive law together with the existence of additive inverses:
$$\begin{align}(1+1)(a+b) &= 1(a+b) + 1(a+b) = a+b+a+b\\
(1+1)(a+b) &= (1+1)a + (1+1)b = a+a+b+b\end{align}$$
Equating the two expansions gives
$$a+b+a+b \quad=\quad a+a+b+b$$
Adding $-a$ on the left and $-b$ on the right of both sides (this is exactly where additive inverses are needed) cancels the outer terms, leaving
$$b+a\quad=\quad a+b$$
Besides this, the addition and multiplication operations on a set $S$ are simply functions:
$$\begin{align}
+&\;:\;S\times S \rightarrow S\\
\cdot&\;:\;S\times S \rightarrow S
\end{align}$$
They can be any mapping we choose so long as the distributive law is upheld.
The commutativity of multiplication can be relaxed, but as explained, commutativity of addition is required if every element has an additive inverse.
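This can be seen concretely in a familiar ring (a small stdlib-only Python sketch; `madd` and `mmul` are hand-rolled helper names, not any particular library): $2\times 2$ integer matrices multiply noncommutatively, yet their addition commutes and both distributive laws hold.

```python
# A minimal 2x2 integer matrix ring: multiplication need not commute,
# but addition must, and the distributive law ties the two operations together.

def madd(A, B):
    """Entrywise matrix addition."""
    return tuple(tuple(a + b for a, b in zip(ra, rb)) for ra, rb in zip(A, B))

def mmul(A, B):
    """Standard 2x2 matrix multiplication."""
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

A = ((1, 2), (3, 4))
B = ((0, 1), (1, 0))
C = ((2, 0), (0, 5))

assert mmul(A, B) != mmul(B, A)                              # multiplication does not commute
assert madd(A, B) == madd(B, A)                              # addition does commute
assert mmul(A, madd(B, C)) == madd(mmul(A, B), mmul(A, C))   # left distributivity
assert mmul(madd(A, B), C) == madd(mmul(A, C), mmul(B, C))   # right distributivity
```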
Let's start very fundamentally. A lot of mathematics is concerned with sets of objects and sets of functions mapping one or more of those objects to another.
Now let's look at the case where we have a set $X$ and a function $f$ that takes two elements of that set and gives a third one, that is, $f:X\times X\to X$. Note that this is just an arbitrary function acting on two elements of an arbitrary set. Now if we need such a function often, we like to write $f(x,y)$ in a slightly simpler form. This usually is something like $x*y$, $x\cdot y$ or simply $xy$, but especially if $f(x,y)=f(y,x)$ (and in some rare cases, even if not), it is not uncommon to write $f(x,y)=x+y$. Well, we usually want a few other conditions, but at the moment we don't even need those. So at this point, all we care about is that $x+y$ takes two elements from some set $X$, and gives another element of the same set $X$.
Now let's look at two sets $X$ and $Y$, both coming with their own $+_X$ and $+_Y$ (this includes the case that $X=Y$ and $+_X=+_Y$). Now as with any two sets, we can consider functions from $X$ to $Y$. However, there are certain functions that are special: namely those functions $\phi:X\to Y$ which respect our additions. That is, $\phi(x+y)=\phi(x)+\phi(y)$. Now if we do this a lot, we may like to omit the parentheses wherever possible, that is, write the function application as a product of the function and the argument. The most prominent example of this is operators in linear algebra. If we do that, the above law reads:
$$\phi(x+y) = \phi x + \phi y$$
Voila, a distributive law!
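As a quick illustration (a Python sketch; the matrix `M` and the helper names `phi` and `vadd` are made up for this example), a linear map written multiplicatively really does distribute over vector addition:

```python
# A linear map phi on pairs, written as "phi applied to x": phi(x+y) = phi x + phi y.
M = ((2, 1), (0, 3))  # an arbitrary 2x2 matrix standing in for the operator phi

def phi(v):
    """Apply the matrix M to the pair v."""
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

def vadd(u, v):
    """Componentwise addition of pairs."""
    return (u[0] + v[0], u[1] + v[1])

x, y = (1, 4), (5, -2)
assert phi(vadd(x, y)) == vadd(phi(x), phi(y))  # phi(x+y) == phi x + phi y
```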
But wait, there is more: Given two functions $\phi:X\to Y$ and $\chi:X\to Y$, it is natural to ask about a function $x\mapsto \phi(x)+\chi(x)$. Now this gives a function operating on functions from $X\to Y$, that takes two functions and returns a new function. That new function is exactly the function that applies $\phi$ and $\chi$ to its argument and then adds that result. It is natural to consider that function on functions also as addition (the technical term is "pointwise addition"), and again denote it with $+$. So, using the product notation for function application, we have, by definition,
$$(\phi + \chi)x = \phi x + \chi x$$
Voila, another distributive law!
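In code, this definition is a one-liner (a Python sketch; `padd` is my own name for pointwise addition):

```python
# Pointwise addition: (phi + chi)(x) is *defined* as phi(x) + chi(x).
def padd(phi, chi):
    """Return the pointwise sum of two functions."""
    return lambda x: phi(x) + chi(x)

phi = lambda x: 2 * x
chi = lambda x: x * x

s = padd(phi, chi)
assert s(5) == phi(5) + chi(5)  # (phi + chi) x == phi x + chi x
```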
OK, so now where do the numbers come in? Well, let's now consider one of the additional requirements which I didn't talk about above: Namely we also require of an operation we want to call $+$ that it is associative, that is, $(a+b)+c = a+(b+c)$. This means that when you add up many things, you can basically omit the parentheses.
This in particular means that if you repeatedly add something to itself, like $b+b+b+\dots+b$, all that matters is how many $b$s you've got in that sum. Therefore we again introduce a new multiplication, this time with a positive integer $n$:
$$nb = \underbrace{b+b+\dots+b}_{n\text{ terms}}$$
Note that this could also be seen as interpreting the integer $n$ as a function that takes an argument $b$ and returns a sum of $n$ $b$s. That is, we have functions
\begin{align}
1&:x\mapsto x\\
2&:x\mapsto x+x\\
3&:x\mapsto x+x+x\\
&\vdots
\end{align}
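These number functions are easy to write down concretely (a Python sketch; `times` is an illustrative name, not a standard function):

```python
from functools import reduce
from operator import add

def times(n):
    """Interpret the positive integer n as the function x -> x + x + ... + x (n terms)."""
    return lambda x: reduce(add, [x] * n)

assert times(1)(7) == 7
assert times(3)(7) == 7 + 7 + 7
# The same closure works for anything with a '+', e.g. tuples:
assert times(2)((1, 2)) == (1, 2, 1, 2)
```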
Now we can ask: Are these "number functions" functions that respect the addition structure? Well, let's try e.g. with $3$:
\begin{align}
3(x+y) &= (x+y)+(x+y)+(x+y) && \text{Definition of multiplication with $3$}\\
&= x+y+x+y+x+y && \text{because we require associativity}\\
&= x+x+x+y+y+y && \text{because we earlier required commutativity ($x+y=y+x$)}\\
&= 3x + 3y && \text{again, definition of multiplication}
\end{align}
The same works of course with any positive integer $n$ (a rigorous proof proceeds by induction on $n$). So we have:
$$n(x+y) = nx + ny$$
Again, a distributive law.
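The commutativity step above is doing real work. Python strings make a handy counterexample (concatenation is associative but not commutative, and repeated "addition" is written `n * s`):

```python
# For strings, + is concatenation: associative, but NOT commutative.
# Repeated addition n*b still makes sense, yet the distributive law fails:
x, y = "x", "y"
lhs = 3 * (x + y)    # (x+y) + (x+y) + (x+y)
rhs = 3 * x + 3 * y  # x+x+x + y+y+y
assert lhs == "xyxyxy"
assert rhs == "xxxyyy"
assert lhs != rhs    # n(x+y) != nx + ny without commutativity
```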
Now by interpreting numbers as functions as above, we get an addition, namely pointwise addition. And of course we have the normal number addition. But it is not hard to check that those two additions indeed give identical results, that is, they can be regarded as the same addition.
But for pointwise addition, we already know that there is a distributive law, which therefore carries over also to multiplications with numbers:
$$(m+n)x = mx + nx$$
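This too can be checked mechanically (same illustrative helpers as before, redefined here so the snippet stands alone):

```python
from functools import reduce
from operator import add

def times(n):
    """The positive integer n as the 'add x to itself n times' function."""
    return lambda x: reduce(add, [x] * n)

def padd(phi, chi):
    """Pointwise addition of functions."""
    return lambda x: phi(x) + chi(x)

m, n, x = 2, 3, 7
# (m + n)x computed two ways: directly, and via pointwise addition of number functions.
assert times(m + n)(x) == padd(times(m), times(n))(x) == m * x + n * x
```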
OK, now let's consider the case where $X$ is actually a set of numbers itself, and $+$ is the normal addition of numbers. Then quite obviously, you recover the usual multiplication, and the corresponding distributive law.
So distributive laws occur quite naturally in many contexts: by requiring only that a binary operation exists, without any further conditions, and asking about operations that respect it; and, for operations involving numbers, with only the additional requirement that addition is associative and commutative (two things we commonly demand of operations we call "addition").
If you mean, "for which real numbers $a$, $b$, and $c$ do we have $a+(bc) = (a+b)(a+c)$?" then you get $$a+bc = a^2 + ab+ac + bc,$$ or $a=a(a+b+c)$. This means that either $a=0$, or else $a+b+c=1$, which is probably what you were trying to say (though you forgot the possibility that $a=0$).
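The algebra here is a one-line polynomial identity, which can be sanity-checked on a grid of exact rationals (a Python sketch, nothing more):

```python
from fractions import Fraction
from itertools import product

# Check the identity (a+b)(a+c) - (a + b*c) == a*(a + b + c - 1)
# on a grid of rational points; this is just a sanity check of the expansion.
grid = [Fraction(k, 3) for k in range(-4, 5)]
for a, b, c in product(grid, repeat=3):
    assert (a + b) * (a + c) - (a + b * c) == a * (a + b + c - 1)
```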
But this is not a "set of real numbers for which addition distributes over multiplication". Such a set would be a collection of real numbers $X$, such that for all $r,s,t\in X$ you have $r+st = (r+s)(r+t)$, which would require that given any three elements in the set, either the first one you picked is $0$, or else they add up to $1$. In particular, any number you pick, picking it three times, would have to be either $0$ or $\frac{1}{3}$; but you cannot have both $\frac{1}{3}$ and $0$, because then picking $\frac{1}{3}$ for $r$ and $s$, and $0$ for $t$, you would not have $r+(st) = (r+s)(r+t)$. So the only collections $X$ that satisfy that condition are $X=\{0\}$ and $X=\{\frac{1}{3}\}$. Not very interesting at all...
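These claims are small enough to brute-force with exact rationals (a Python sketch; `distributes` is an ad hoc helper name):

```python
from fractions import Fraction
from itertools import product

def distributes(X):
    """Does r + s*t == (r+s)*(r+t) hold for all r, s, t in X?"""
    return all(r + s * t == (r + s) * (r + t) for r, s, t in product(X, repeat=3))

third = Fraction(1, 3)
assert distributes({Fraction(0)})             # X = {0} works
assert distributes({third})                   # X = {1/3} works
assert not distributes({Fraction(0), third})  # but not both together
```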
So it's probably better to think about what you are looking for as the collection of all $3$-tuples of real numbers $(a,b,c)$ such that $a+bc = (a+b)(a+c)$, which consists exactly of all $3$-tuples with either $a=0$ or $a+b+c=1$. Geometrically you get the union of two planes in $3$-space: the $yz$-plane, and the $x+y+z=1$ plane.
Nothing terribly exciting, I think, or particularly significant. But if you are interested in structures in which "addition" distributes over "multiplication" and vice-versa, then consider looking at boolean algebras, or more generally lattices, where $\wedge$ distributes over $\vee$ and vice-versa (like $\cap$ and $\cup$ do for sets, and the logical operators AND and OR do for logic).
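With Python's set operators standing in for $\vee$ and $\wedge$, both distributive laws of the lattice of sets can be checked directly:

```python
# In the lattice of sets, union and intersection distribute over each other.
A, B, C = {1, 2}, {2, 3}, {3, 4}

assert A | (B & C) == (A | B) & (A | C)  # union distributes over intersection
assert A & (B | C) == (A & B) | (A & C)  # ...and vice versa
```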