If you mean, "for which real numbers $a$, $b$, and $c$ do we have $a+(bc) = (a+b)(a+c)$?" then you get
$$a+bc = a^2 + ab+ac + bc,$$
or $a=a(a+b+c)$. This means that either $a=0$, or else $a+b+c=1$, which is probably what you were trying to say (though you forgot the possibility that $a=0$).
But this is not a "set of real numbers for which addition distributes over multiplication". Such a set would be a collection $X$ of real numbers such that for all $r,s,t\in X$ you have $r+st = (r+s)(r+t)$; by the computation above, this requires that for any three elements chosen from the set (repetitions allowed), either the first one you picked is $0$, or else they add up to $1$. In particular, any number you pick, picking it three times, would have to be either $0$ or $\frac{1}{3}$. But you cannot have both $\frac{1}{3}$ and $0$: picking $r=s=\frac{1}{3}$ and $t=0$, you get $r+st=\frac{1}{3}$ but $(r+s)(r+t)=\frac{2}{9}$. So the only collections $X$ that satisfy the condition are $X=\{0\}$ and $X=\{\frac{1}{3}\}$. Not very interesting at all...
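This claim is easy to brute-force with exact rational arithmetic; here is a small sketch (the helper name `distributes` is mine, not from the answer):

```python
from fractions import Fraction
from itertools import product

def distributes(X):
    """Check r + s*t == (r + s)*(r + t) for every triple drawn from X."""
    return all(r + s * t == (r + s) * (r + t)
               for r, s, t in product(X, repeat=3))

third = Fraction(1, 3)
print(distributes({Fraction(0)}))         # {0} satisfies the condition
print(distributes({third}))               # so does {1/3}
print(distributes({Fraction(0), third}))  # but mixing them fails
```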
So it's probably better to think about what you are looking for as the collection of all $3$-tuples of real numbers $(a,b,c)$ such that $a+bc = (a+b)(a+c)$, which consists exactly of all $3$-tuples with either $a=0$ or $a+b+c=1$. Geometrically you get the union of two planes in $3$-space: the $yz$-plane, and the $x+y+z=1$ plane.
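A quick sanity check of this two-plane characterization (the helper name `holds` is mine; the sample points are arbitrary):

```python
from fractions import Fraction as F

def holds(a, b, c):
    """Does a + bc == (a + b)(a + c)?"""
    return a + b * c == (a + b) * (a + c)

# A point on the plane a = 0:
assert holds(F(0), F(7), F(-2))
# A point on the plane a + b + c = 1:
a, b = F(2), F(-5)
assert holds(a, b, 1 - a - b)
# A generic point on neither plane fails:
assert not holds(F(1), F(1), F(1))
```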
Nothing terribly exciting, I think, or particularly significant. But if you are interested in structures in which "addition" distributes over "multiplication" and vice-versa, then consider looking at boolean algebras, or more generally lattices, where $\wedge$ distributes over $\vee$ and vice-versa (like $\cap$ and $\cup$ do for sets, and the logical operators AND and OR do for logic).
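For instance, with sets, $\cap$ distributes over $\cup$ and vice versa; a quick check on sample sets (chosen arbitrarily):

```python
A, B, C = {1, 2}, {2, 3}, {3, 4}

# Intersection distributes over union:
assert A & (B | C) == (A & B) | (A & C)
# ...and union distributes over intersection:
assert A | (B & C) == (A | B) & (A | C)
```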
Let's start very fundamentally. A lot of mathematics is concerned with sets of objects and with functions mapping one or more of those objects to another.
Now let's look at the case where we have a set $X$ and a function $f$ that takes two elements of that set and gives a third one, that is, $f:X\times X\to X$. Note that this is just an arbitrary function acting on two elements of an arbitrary set. If we use such a function often, we like to write $f(x,y)$ in a slightly simpler form. This is usually something like $x*y$, $x\cdot y$, or simply $xy$; but especially if $f(x,y)=f(y,x)$ (and in some rare cases even if not), it is not uncommon to write $f(x,y)=x+y$. Well, we usually want a few other conditions, but at the moment we don't even need those. So at this point, all we care about is that $x+y$ takes two elements from some set $X$ and gives another element of the same set $X$.
Now let's look at two sets $X$ and $Y$, each coming with its own addition $+_X$ and $+_Y$ (this includes the case where $X=Y$ and $+_X=+_Y$). As with any two sets, we can consider functions from $X$ to $Y$. However, certain functions are special: namely those functions $\phi:X\to Y$ which respect our additions, that is, $\phi(x+y)=\phi(x)+\phi(y)$. If we do this a lot, we may like to omit the parentheses wherever possible, that is, write the function application as a product of the function and the argument. The most prominent example of this is operators in linear algebra. If we do that, the above law reads:
$$\phi(x+y) = \phi x + \phi y$$
Voila, a distributive law!
But wait, there is more: given two functions $\phi:X\to Y$ and $\chi:X\to Y$, it is natural to ask about the function $x\mapsto \phi(x)+\chi(x)$. This gives an operation on functions from $X$ to $Y$ that takes two functions and returns a new one: the function that applies $\phi$ and $\chi$ to its argument and then adds the results. It is natural to consider this operation on functions as an addition as well (the technical term is "pointwise addition"), and again denote it by $+$. So, using the product notation for function application, we have, by definition,
$$(\phi + \chi)x = \phi x + \chi x$$
Voila, another distributive law!
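Both laws can be illustrated with a tiny Python sketch, writing function application as a call instead of a product (all names here are mine, chosen for illustration):

```python
def pointwise_add(phi, chi):
    """The pointwise sum: (phi + chi)(x) = phi(x) + chi(x)."""
    return lambda x: phi(x) + chi(x)

phi = lambda x: 2 * x  # a map respecting addition: phi(x+y) == phi(x) + phi(y)
chi = lambda x: 5 * x

x, y = 3, 4
# First law: phi respects addition.
assert phi(x + y) == phi(x) + phi(y)
# Second law, true by the very definition of pointwise addition.
assert pointwise_add(phi, chi)(x) == phi(x) + chi(x)
```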
OK, so now where do the numbers come in? Well, let's now consider one of the additional requirements which I didn't talk about above: Namely we also require of an operation we want to call $+$ that it is associative, that is, $(a+b)+c = a+(b+c)$. This means that when you add up many things, you can basically omit the parentheses.
This in particular means that if you repeatedly add something to itself, like $b+b+b+\dots+b$, all that matters is how many $b$s you've got in that sum. Therefore we again introduce a new multiplication, this time with a positive integer $n$:
$$nb = \underbrace{b+b+\dots+b}_{n\text{ terms}}$$
Note that this could also be seen as interpreting the integer $n$ as a function that takes an argument $b$ and returns a sum of $n$ $b$s. That is, we have functions
\begin{align}
1&:x\mapsto x\\
2&:x\mapsto x+x\\
3&:x\mapsto x+x+x\\
&\vdots
\end{align}
Now we can ask: Are these "number functions" functions that respect the addition structure? Well, let's try e.g. with $3$:
\begin{align}
3(x+y) &= (x+y)+(x+y)+(x+y) && \text{Definition of multiplication with $3$}\\
&= x+y+x+y+x+y && \text{because we require associativity}\\
&= x+x+x+y+y+y && \text{because we also require commutativity ($x+y=y+x$)}\\
&= 3x + 3y && \text{again, definition of multiplication}
\end{align}
The same works of course with any $n$ (a strict mathematical proof is slightly more involved). So we have:
$$n(x+y) = nx + ny$$
Again, a distributive law.
Now by interpreting numbers as functions as above, we get an addition, namely pointwise addition. And of course we have the normal number addition. But it is not hard to check that those two additions indeed give identical results, that is, they can be regarded as the same addition.
But for pointwise addition, we already know that there is a distributive law, which therefore carries over also to multiplications with numbers:
$$(m+n)x = mx + nx$$
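As a quick sanity check, here is a small Python sketch (the helper name `times` is mine) that models $nb$ as repeated addition and verifies both numeric distributive laws on sample integers:

```python
def times(n, b):
    """Interpret the integer n as repeated addition: b + b + ... + b (n terms)."""
    total = 0
    for _ in range(n):
        total = total + b
    return total

x, y, m, n = 5, 7, 2, 3
# n(x + y) = nx + ny, using associativity and commutativity of +
assert times(n, x + y) == times(n, x) + times(n, y)
# (m + n)x = mx + nx, i.e. pointwise addition of the "number functions"
assert times(m + n, x) == times(m, x) + times(n, x)
```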
OK, now let's consider the case where $X$ is actually a set of numbers itself, and $+$ is the normal addition of numbers. Then quite obviously, you recover the usual multiplication, and the corresponding distributive law.
So distributive laws occur quite naturally: in many contexts they arise just from requiring that a binary operation exists, without any further conditions, and asking about operations that respect it; and for operations involving numbers, the only additional requirements are that addition be associative and commutative (two things we commonly demand of operations we call "addition").
Best Answer
Yes, let $S = \{x \in \mathbb{R} \mid x \ge 0\}$ and $f(a,b)=\sqrt{ab}$. Then, since $c \ge 0$ gives $c = \sqrt{c^2}$, we have $f(a,b) \cdot c = \sqrt{ab} \cdot c = \sqrt{ab} \cdot \sqrt{c^2} = \sqrt{abc^2} = \sqrt{(ac)(bc)} = f(ac, bc)$.
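A quick numeric spot-check of this identity (sample values are arbitrary nonnegative reals):

```python
import math

def f(a, b):
    """f(a, b) = sqrt(a*b) on the nonnegative reals."""
    return math.sqrt(a * b)

a, b, c = 2.0, 8.0, 3.0
# c * f(a, b) == f(a*c, b*c), up to floating-point tolerance
assert math.isclose(c * f(a, b), f(a * c, b * c))
```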
Edit: I found a few more functions which satisfy $c\cdot f(a,b) = f(ac,bc)$.
Credit to the post you linked in the comments: any linear function will work. I.e. for $S = \mathbb{R}$ with any real $\alpha, \beta$, we can define $f_{\alpha, \beta}(x,y)=\alpha x + \beta y$, and then we have $c \cdot (\alpha x + \beta y) = \alpha(cx) + \beta(cy)$. This gives us a plethora of corollaries (e.g. subtraction, which I didn't mention earlier due to triviality).
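The linear case can be spot-checked the same way (the factory name `make_f` and the sample parameters are my own choices):

```python
def make_f(alpha, beta):
    """Build f_{alpha,beta}(x, y) = alpha*x + beta*y."""
    return lambda x, y: alpha * x + beta * y

f = make_f(4, -1)   # alpha = 4, beta = -1, chosen arbitrarily
x, y, c = 3, 5, 7
# c * f(x, y) == f(c*x, c*y)
assert c * f(x, y) == f(c * x, c * y)
```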
Also inspired by that post, the max and min functions satisfy the criterion over the nonnegative reals. This points to a family of statistical functions, including $n$-ary operations on the reals, that also satisfy the criterion (proof left as an exercise).
This line of thinking helped me realize that both the arithmetic and geometric means satisfy the distributive criterion, so I verified that the function below (the harmonic mean up to a constant factor of $2$) also satisfies it for $S = \mathbb{R} \setminus \{0\}$ with $a \ne -b$: \begin{align} f(a,b) &= \frac{1}{\frac{1}{a} + \frac{1}{b}} \\ f(ac,bc) &= \frac{1}{\frac{1}{ac} + \frac{1}{bc}} = \frac{1}{\frac{1}{c} \left(\frac{1}{a} + \frac{1}{b}\right)} = \frac{c}{\frac{1}{a} + \frac{1}{b}} = c\cdot f(a,b) \end{align}
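The same computation can be checked with exact rational arithmetic (a small sketch; the sample values are arbitrary nonzero rationals):

```python
from fractions import Fraction as F

def f(a, b):
    """1 / (1/a + 1/b): the harmonic mean of a and b, up to a factor of 2."""
    return 1 / (1 / a + 1 / b)

a, b, c = F(2), F(3), F(5)
# c * f(a, b) == f(a*c, b*c), exactly
assert c * f(a, b) == f(a * c, b * c)
```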
We can similarly define functions $f'_{\alpha,\beta}(x,y) = f(\alpha x, \beta y)$ for any of these means, and multiplication still distributes over the resulting functions. I suspect this will hold for some additional cases of generalized means, and perhaps in general, but I haven't looked into it too deeply.
Aside: If we allow $0\in S$, then we get $c\cdot f(0,0) = f(0,0)$, so $f(0,0) = 0$ if $c$ is arbitrary. This was an interesting observation, but it did not yield any fruitful examples for me.