It suffices to show that in any closed convex set $E$ there is a unique closest point to $0$. If $0 \in E$ this is obvious, so assume $0 \notin E$. Let $d = \inf_{y \in E} |y|$; then $E \cap \{y : |y| \le d + 1\}$ is closed and bounded, hence compact. The norm on $\mathbb{R}^n$ is continuous, so it attains a minimum over this compact set at some point $y$, and in fact $|y| = d$: every point of $E$ outside this set has norm greater than $d + 1 \ge |y|$, so $y$ minimises the norm over all of $E$.
Let $y_1$, $y_2$ be two closest points to $0$. Since $E$ is convex, $(y_1 + y_2)/2 \in E$, so minimality of $|y_1| = |y_2| = d$ together with Cauchy-Schwarz gives the string of inequalities $$\left|{{y_1 + y_2}\over2}\right|^2 \ge |y_1|^2 = (|y_1|^2 + 2|y_1| \cdot |y_2| + |y_2|^2)/4 \ge (|y_1|^2 + 2y_1 \cdot y_2 + |y_2|^2)/4 = \left|{{y_1+y_2}\over2}\right|^2.$$Thus all inequalities above are equalities; in particular, equality holds in Cauchy-Schwarz, so $y_1$ is a scalar multiple of $y_2$, and since $|y_1| = |y_2|$ the scalar is $\pm 1$. But$${{y_1 + y_2}\over2} \neq 0,$$ because $0 \notin E$, so $y_1 \neq -y_2$, whence $y_1 = y_2$.
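To make the Euclidean argument concrete, here is a minimal numerical sketch; the particular set $E$ (a segment not containing the origin), the grid resolution, and the variable names are choices made only for this illustration, not part of the proof.

```python
import numpy as np

# E: the segment from (1, 0) to (0, 1), a closed convex subset of R^2
# that does not contain the origin, parametrised by t in [0, 1].
ts = np.linspace(0.0, 1.0, 10_001)
E = np.outer(1 - ts, [1.0, 0.0]) + np.outer(ts, [0.0, 1.0])

norms = np.linalg.norm(E, axis=1)  # Euclidean norm of every sampled point
i = norms.argmin()

print("d = inf |y| over E  ~", norms[i])  # about 1/sqrt(2) ~ 0.7071
print("closest point to 0  ~", E[i])      # about (0.5, 0.5); unique, as the argument above shows
```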
No. Counterexamples include the $L^p$ spaces with $p \in (1, 2) \cup (2, \infty)$.
A normed linear space $X$ satisfies this property if and only if every non-empty, closed, convex subset of $X$ is Chebyshev. That is, given any $x \in X$ and any non-empty, closed, convex subset $C$ of $X$, there is a unique point $c \in C$ such that
$$\|x - c\| = d(x, C) = \inf_{y \in C} \|x - y\|.$$
Note that, taking $x = 0$, this implies that the norm achieves a unique minimum over $C$.
Conversely, if $X$ satisfies the property you want, and $C \subseteq X$ is non-empty, closed, and convex, then given $x \in X$, the norm achieves a unique minimum on the set $C - x$ (which is also non-empty, closed, and convex); let this minimising point be $y$. Then $y = c - x$ for some $c \in C$, and
$$\|c - x\| \le \|z - x\|$$
for all $z \in C$; that is, $c$ is the unique closest point to $x$ in $C$, so $C$ is Chebyshev.
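As a sanity check on this translation argument, here is a small sketch in the Euclidean plane; the particular set $C$ (a closed disk), the point $x$, and the names below are assumptions made only for the example.

```python
import numpy as np

# Illustrative closed convex set C: the closed Euclidean disk of radius 1
# centred at (3, 4).  The metric projection onto a disk has a closed form.
centre, radius = np.array([3.0, 4.0]), 1.0
x = np.array([0.0, 0.0])

# Unique nearest point of C to x (valid because x lies outside C):
c = centre + radius * (x - centre) / np.linalg.norm(x - centre)

# The translated set C - x is the disk of radius 1 centred at centre - x;
# its unique point y of minimal norm should satisfy x + y = c.
shifted_centre = centre - x
y = shifted_centre - radius * shifted_centre / np.linalg.norm(shifted_centre)

print("nearest point c       =", c)                      # (2.4, 3.2)
print("x + (argmin over C-x) =", x + y)                  # the same point
print("d(x, C)               =", np.linalg.norm(x - c))  # |centre - x| - radius = 4
```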
Now, $X$ has this property if and only if $X$ is reflexive and strictly convex. In fact, the non-empty closed convex subsets of $X$ will admit at least one nearest point to each $x \in X$ if and only if $X$ is reflexive, and at most one nearest point if and only if $X$ is strictly convex.
To see why this is, suppose $X$ is reflexive. Then $B_X$ is weakly compact, as is any closed, bounded, convex set. We can express the set of nearest points to $x$ in $C$ as
$$\bigcap_{\varepsilon > 0} \bigl( B[x;\, d(x,C) + \varepsilon] \cap C \bigr),$$
where each term in the intersection is non-empty and weakly compact, and the terms are nested, hence the intersection is non-empty. That is, reflexivity implies the existence of nearest points.
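Here is a finite-dimensional sketch of this intersection description (in $\mathbb{R}^2$, where weak compactness is ordinary compactness); the half-plane $C$, the point $x = 0$, and the chosen values of $\varepsilon$ are illustrative assumptions.

```python
import numpy as np

# Take C = { (a, b) : a >= 1 }, a closed convex half-plane, and x = 0,
# so d(x, C) = 1.  For each eps > 0 the set B[x; 1 + eps] and C meet in a
# non-empty compact convex "lens", and the lenses are nested; their
# intersection is the nearest-point set, here the single point (1, 0).
for eps in [1.0, 0.1, 0.01, 0.001, 0.0001]:
    r = 1.0 + eps
    # The lens sits inside the box [1, r] x [-sqrt(r^2 - 1), sqrt(r^2 - 1)],
    # so its diameter is at most the diagonal of that box:
    bound = np.hypot(r - 1.0, 2.0 * np.sqrt(r**2 - 1.0))
    print(f"eps = {eps:<7}  diameter of the lens <= {bound:.5f}")
```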
Conversely, if $X$ is not reflexive, then James' theorem guarantees a bounded linear functional $f$ that fails to attain its supremum on the closed unit ball. It is not hard to see that $0$ then has no nearest point in the closed convex set $f^{-1}([1, \infty))$ (and in fact, no point outside the set has a nearest point inside it).
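A hedged illustration of the non-reflexive case, using the standard space $c_0$ with the sup norm and the non-attaining functional $f(x) = \sum_{n \ge 1} 2^{-n} x_n$; the truncation scheme and names below are choices made for the example.

```python
import numpy as np

# Work in c_0 (real sequences tending to 0, sup norm), which is not
# reflexive.  The functional f(x) = sum_{n>=1} 2^{-n} x_n has norm 1 but
# never attains it on the closed unit ball: equality would force every
# coordinate to equal the sup norm, contradicting x_n -> 0.  Hence the
# closed convex set C = f^{-1}([1, oo)) satisfies d(0, C) = 1, yet no
# point of C has norm 1: the origin has no nearest point in C.
def witness(N):
    """Sequence with first N coordinates 1 / (1 - 2^{-N}) and the rest 0
    (so it lies in c_0), chosen so that f(x) = 1 exactly."""
    x = np.full(N, 1.0 / (1.0 - 2.0 ** (-N)))
    weights = 2.0 ** (-np.arange(1, N + 1))
    return float(weights @ x), float(np.abs(x).max())

for N in [1, 2, 5, 10, 20]:
    fx, sup_norm = witness(N)
    print(f"N = {N:>2}:  f(x) = {fx:.6f},  ||x||_inf = {sup_norm:.6f}")
# The sup norms decrease toward d(0, C) = 1, but no element of C attains it.
```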
If $X$ is strictly convex, then the unit ball has no proper non-trivial faces, or equivalently, the only convex subsets of the unit sphere are the empty set and singletons. We can express the set of nearest points to $x$ from a set $C$ as
$$B[x; d(x, C)] \cap C = S[x; d(x, C)] \cap C,$$
since no point of $C$ lies in the open ball of radius $d(x, C)$ about $x$. If $C$ is convex, the left-hand side is convex, but it is also a subset of the sphere $S[x; d(x, C)]$, and hence (after translating and rescaling to the unit sphere) must be a singleton or the empty set. That is, convex sets admit at most one nearest point.
Conversely, if $X$ is not strictly convex, then we can find a hyperplane that supports the unit ball along a line segment. For such a hyperplane, every point of that segment is a nearest point of the hyperplane to the origin, so convex sets may admit more than one nearest point. This completes the proof.
Best Answer
How about $\mathbb R^2$ with $\|(x,y)\|=|x|+|y|$, $A=\{(1-t)(1,0)+t(0,1): 0\le t\le 1\}$, and $\vec x=(0,0)$? Every point of $A$ has norm $(1-t)+t=1$, so every point of the segment is a nearest point of $A$ to $\vec x$.
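A quick numerical check of this example (the sampled values of $t$ are arbitrary):

```python
import numpy as np

# Every point (1 - t, t) of A has 1-norm (1 - t) + t = 1, so in this norm
# every point of the segment is a nearest point of A to the origin.
for t in np.linspace(0.0, 1.0, 6):
    p = (1.0 - t) * np.array([1.0, 0.0]) + t * np.array([0.0, 1.0])
    print(f"t = {t:.1f}:  point = {p},  ||point||_1 = {np.abs(p).sum():.1f}")
```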