Let $f(c) = \sum_i w_i (c - x_i) \cdot (c - x_i)$. Then the partial derivative of $f$ with respect to $c_j$ is
$$
2\sum_i w_i (c - x_i)\cdot e_j
$$
where $e_j$ is the $j$th standard basis vector. Setting this to zero gives
$$
\sum_i w_i (c_j - x_{i,j}) = 0 \\
c_j\sum_i w_i = \sum_i w_i x_{i,j} \\
c_j= \frac{\sum_i w_i x_{i,j}}{\sum_i w_i }
$$
where $x_{i, j}$ denotes the $j$th entry of vector $x_i$.
This shows there is a unique critical point. Since $f(c) \to \infty$ as $\|c\| \to \infty$, that critical point must be the global minimum. You're done.
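As a numerical sanity check of the closed form (a sketch only; the data and weights below are made up):

```python
import numpy as np

# Hypothetical data: n = 5 points in d = 3 dimensions, positive weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))      # rows are the vectors x_i
w = rng.uniform(1, 2, size=5)    # weights w_i > 0

def f(c):
    """Weighted sum of squared distances: sum_i w_i * ||c - x_i||^2."""
    return np.sum(w * np.sum((c - X) ** 2, axis=1))

# Closed-form critical point: the weighted mean.
c_star = (w[:, None] * X).sum(axis=0) / w.sum()

# Perturbing c_star in any direction should only increase f.
for _ in range(100):
    d = rng.normal(size=3)
    assert f(c_star) <= f(c_star + 0.1 * d)
```

Here `c_star` is exactly `np.average(X, axis=0, weights=w)`, the weighted mean derived above.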
Since the symbol $\mu$ is already in use in the question,
let's write $\bar x$ to denote the mean of the vectors in $S$; that is,
$$
\bar x = \frac1n \sum_{x\in S} x.
$$
Then by the linearity of the inner product,
\begin{align}
\sum_{x\in S} \langle x, \mu\rangle
&= \left\langle \sum_{x\in S} x, \mu\right\rangle \\
&= n \left\langle \frac1n\sum_{x\in S} x, \mu\right\rangle \\
&= n \left\langle \bar x, \mu\right\rangle.
\end{align}
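A quick numerical check of this linearity identity (made-up vectors, purely illustrative):

```python
import numpy as np

# A set S of n = 4 vectors and some fixed vector mu.
rng = np.random.default_rng(1)
S = rng.normal(size=(4, 3))
mu = rng.normal(size=3)

xbar = S.mean(axis=0)  # the mean of the vectors in S

lhs = sum(np.dot(x, mu) for x in S)   # sum of inner products <x, mu>
rhs = len(S) * np.dot(xbar, mu)       # n * <xbar, mu>
assert np.isclose(lhs, rhs)
```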
With this, you can eliminate the individual $x$s from your last
formula, leaving only $\mu$ and $\bar x$.
Now consider the quantity $\lVert \mu - \bar x\rVert_2^2$.
That quantity is clearly minimized when $\mu = \bar x$,
since the norm $\lVert \cdot\rVert_2$ is zero there and can never be negative.
That is,
$$
\arg\min_\mu \, \lVert \mu - \bar x\rVert_2^2 = \bar x \tag1
$$
So it would be really convenient if we could reduce your minimization
problem to something that looks like the left-hand side of Equation $(1)$.
Now consider some of the techniques you already used in your first attempt.
You know that
$\lVert \mu - \bar x\rVert_2^2 =
\lVert\mu\rVert_2^2 - 2 \langle\mu,\bar x\rangle + \lVert\bar x\rVert_2^2$,
and you know you can add or subtract a constant from the value inside the
$\arg\min$ without changing the $\mu$ that minimizes the value.
Also notice that in your second attempt, you found that
$$
\mu^* = \arg\min_\mu \, (- 2 \langle\bar x,\mu\rangle + \lVert\mu\rVert_2^2).
$$
At this point, you're just a couple of steps away from showing
that $\mu^* = \bar x$.
(I'm trying not to spoil this too much, because it's so much fun
when a problem resolves like this, especially when you get to make
the final "aha!" step yourself.)
Best Answer
The minimizer of your weighted sum is again a median! But not the median of $x_1, x_2, \dots$ alone.
A value $\mu$ minimizes your sum exactly when
$$ \sum_{i\,:\, x_i>\mu} w_i \le \frac {\sum_{i=1}^n w_i}{2}
\qquad\text{and}\qquad
\sum_{i\,:\, x_i<\mu} w_i \le \frac {\sum_{i=1}^n w_i}{2}. $$
(The inequalities must be non-strict: with $x = (0, 1)$ and equal weights, every $\mu \in [0,1]$ is a minimizer, yet no $\mu$ would satisfy strict versions of both conditions.)
Note that this $\mu$ is itself a median: it is a median of the random variable on the sample space $\{1,2,\dots,n\}$ that sends $i$ to $x_i$, where $i$ occurs with probability $\frac{w_i}{w_1+\cdots+w_n}$.
Oops! I just noticed that your $x_i$ can be vectors, so this answer only covers the case $d=1$.
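For $d = 1$, the characterization above can be turned into code. This is a sketch (the example data is made up), using the cumulative-weight construction of a weighted median:

```python
import numpy as np

def weighted_median(x, w):
    """Return a mu satisfying the two half-weight conditions (d = 1)."""
    order = np.argsort(x)
    x, w = np.asarray(x, float)[order], np.asarray(w, float)[order]
    cum = np.cumsum(w)
    half = w.sum() / 2
    # First sorted index where the cumulative weight reaches half the total.
    k = int(np.searchsorted(cum, half))
    return x[k]

# Made-up example.
x = [1.0, 2.0, 3.0, 8.0]
w = [1.0, 1.0, 3.0, 1.0]
mu = weighted_median(x, w)

def h(c):
    """Weighted sum of absolute deviations: sum_i w_i * |x_i - c|."""
    return sum(wi * abs(xi - c) for xi, wi in zip(x, w))

# mu should minimize h over a fine grid of candidate values.
assert all(h(mu) <= h(c) for c in np.linspace(0, 10, 1001))
```

Note that heavy weight on $x_3 = 3$ pulls the minimizer to $3$, even though the unweighted median of the four points would lie between $2$ and $3$.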