Sometimes you just have to deal with more complicated notation. Here, I shall use a capital letter like $A$ to denote an element of $X^*$; and since $A$ is an equivalence class of sequences in $X$, I shall use something like $(a_n)$ to denote an element of $A$.
Here are some preliminary observations which will help to simplify the proof of completeness:
If $(a_n)_{n=1}^{\infty}$ is a Cauchy sequence in $X$ then every subsequence $(a_{n_k})_{k=1}^{\infty}$ is related via $\sim$ to the main sequence (I leave it to you to verify).
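To sketch that verification (assuming, as usual, that $(a_n) \sim (b_n)$ means $d(a_n, b_n) \to 0$, and using that $n_k \geq k$ for any subsequence):

```latex
% Given \varepsilon > 0, pick N with d(a_m, a_n) < \varepsilon for all m, n \geq N.
% For k \geq N we have n_k \geq k \geq N, hence
d(a_k, a_{n_k}) < \varepsilon \quad \text{for all } k \geq N,
% so d(a_k, a_{n_k}) \to 0 as k \to \infty, i.e. (a_n) \sim (a_{n_k}).
```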
For any $\eta>0$, there is an $N\in \Bbb{N}$ such that for all $m,n\geq N$, we have $d(a_n,a_m) < \eta$. In other words, the subsequence $\{a_N, a_{N+1}, \dots\}$ has the property that each pair of terms is within $\eta$ of each other. To summarize: for every $\eta>0$, there is a subsequence $(a_{n_k})_{k=1}^{\infty}$ such that for all $k,l\in \Bbb{N}$, we have $d(a_{n_k}, a_{n_l})< \eta$.
Fix a sequence $(\zeta_n)$ of positive numbers which decrease to zero (such as $\zeta_n = \frac{1}{n}$). Now, to show completeness of $X^*$, we have to show every Cauchy sequence converges. So, let $(A_n)_{n=1}^{\infty}$ be a Cauchy sequence in $X^*$. For each $n\in \Bbb{N}$, choose a representative $(a_{n,k})_{k=1}^{\infty} \in A_n$ such that for all $k,l\in \Bbb{N}$, we have
\begin{align}
d(a_{n,k}, a_{n,l})< \zeta_n \tag{i}
\end{align}
Note that such a representative always exists by our remarks above.
Since $(A_n)$ is Cauchy, for every $j\in \Bbb{N}$, there is an $N_j\in \Bbb{N}$ such that for all $n,m,k \geq N_j$, we have
\begin{align}
d(a_{n,k}, a_{m,k}) < \zeta_j \tag{ii}
\end{align}
(just unwind the definition of $(A_n)$ being Cauchy and the definition of the metric $\delta$ on $X^*$ to see why this follows). Also observe that, by choosing the $N_j$ recursively, you can arrange that $j<N_j$ and $N_1< N_2< N_3<\cdots$
Now, put $\beta_j:= a_{N_j, N_j}$. We claim that $(\beta_j)_{j=1}^{\infty}$ is Cauchy in $X$. This is because for every $l\in \Bbb{N}$, if $i,j\geq l$ then (since $N_i,N_j \geq N_l$)
\begin{align}
d(\beta_i, \beta_j) &:= d(a_{N_i, N_i}, a_{N_j, N_j}) \\
&\leq d(a_{N_i, N_i}, a_{N_i, N_j}) + d(a_{N_i, N_j}, a_{N_j, N_j}) \\
&\leq \zeta_i + \zeta_l \tag{by i and ii} \\
&\leq 2\zeta_l,
\end{align}
where the last line is because we chose the $\zeta$ sequence to be decreasing. As $l\to \infty$, the RHS tends to $0$, which proves the sequence is Cauchy in $X$.
Finally, let $B:= [(\beta_j)_{j=1}^{\infty}]$ be the equivalence class; I leave it to you to show $A_n \to B$.
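If you want a hint for that last step, here is one possible estimate, assuming the metric on $X^*$ is given by $\delta(A, B) = \lim_k d(a_k, b_k)$ for representatives $(a_k) \in A$ and $(b_k) \in B$ (or a $\limsup$ variant):

```latex
% Fix j. For n \geq N_j and k \geq N_j we have n, N_k, k \geq N_j, so (ii)
% bounds the first term and (i) (applied to row N_k) bounds the second:
\begin{aligned}
d(a_{n,k}, \beta_k) = d(a_{n,k}, a_{N_k, N_k})
  &\leq d(a_{n,k}, a_{N_k, k}) + d(a_{N_k, k}, a_{N_k, N_k}) \\
  &< \zeta_j + \zeta_{N_k} \leq \zeta_j + \zeta_k.
\end{aligned}
% Letting k \to \infty gives \delta(A_n, B) \leq \zeta_j whenever n \geq N_j;
% since \zeta_j \to 0, this shows A_n \to B.
```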
Note that the idea of the proof is pretty simple. We have a Cauchy sequence $(A_n)$. We then choose representatives $(a_{n,k})_{k=1}^{\infty}$. So if we write this out as a square array of elements of $X$:
\begin{align}
\begin{matrix}
A_1: & a_{1,1} & a_{1,2} & a_{1,3} & \cdots \\
A_2: & a_{2,1} & a_{2,2} & a_{2,3} & \cdots \\
A_3: & a_{3,1} & a_{3,2} & a_{3,3} & \cdots \\
\vdots & \vdots & \vdots & \vdots & \ddots
\end{matrix}
\end{align}
($n$ is going down, $k$ is going to the right). Very informally, (i) says that if you go far enough down, then all the elements in that row will be close to each other. (ii) says that if you go far enough toward the "bottom right", then all the elements in the same column will be close to each other.
So, the idea is to take the diagonal elements $\beta_j := a_{N_j,N_j}$, and show that this has the desired properties. To really understand the proof, I would highly recommend you to write your own arguments for why $(\beta_j)$ is Cauchy and why $A_n \to B$; use the square array above as your guiding principle to see which elements are close to which ones.
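To see the diagonal trick in action, here is a small numerical sketch in Python (my own toy illustration, not part of the proof): take $X = \Bbb Q$, let row $n$ consist of the decimal truncations of $\sqrt 2 + 2^{-n}$, so each row is a Cauchy sequence of rationals and the classes $A_n$ form a Cauchy sequence in $X^*$; the diagonal entries then represent $\sqrt 2$.

```python
from fractions import Fraction
from math import isqrt

def a(n: int, k: int) -> Fraction:
    """Entry a_{n,k} of the square array: the k-digit decimal truncation
    of sqrt(2), shifted by 2^(-n). Row n represents the "ideal point"
    sqrt(2) + 2^(-n), and these ideal points form a Cauchy sequence."""
    scale = 10 ** k
    trunc_sqrt2 = Fraction(isqrt(2 * scale * scale), scale)  # floor(sqrt(2)*10^k)/10^k
    return trunc_sqrt2 + Fraction(1, 2 ** n)

# The diagonal sequence beta_j = a_{j,j}. (In this toy example every row is
# uniformly tight, so the plain diagonal works; no N_j bookkeeping is needed.)
beta = [a(j, j) for j in range(1, 31)]

# Successive diagonal entries get close: (beta_j) is Cauchy in Q...
gaps = [abs(beta[j + 1] - beta[j]) for j in range(len(beta) - 1)]

# ...and it represents sqrt(2): the squares of the entries approach 2.
final_square = float(beta[-1]) ** 2
```

The point is purely visual: each row hugs its own "limit", the row limits themselves converge, and the diagonal slides down to the common target, exactly as in the array picture above.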
After pondering Zhen Lin's comment, it seems to me that this is only a theoretical problem.
If we only know that $\mathcal C$ has "things" described via a universal property, then we do indeed get a choice problem. Since the objects of $\mathcal C$ in general form a proper class, we would need a version of the axiom of choice for classes. My knowledge of set theory is limited; I have never seen such a general choice axiom and cannot judge which problems it could cause.
The general pattern of a universal property is this: We are given a diagram $\Delta$ in $\mathcal C$ consisting of a family of objects $(X_i)_{i \in I}$ and a family of morphisms between these $X_i$. Then we consider diagram extensions $\Delta^*$ obtained from $\Delta$ by adding a single object $X(\Delta^*)$ and morphisms between $X(\Delta^*)$ and the $X_i$ such that some characteristic condition $\mathfrak P$ is satisfied. We consider two types of diagram extensions: Sink extensions in which all added morphisms have codomain $X(\Delta^*)$, i.e. have the form $f_i^{\Delta^*} : X_i \to X(\Delta^*)$, and source extensions in which all added morphisms have domain $X(\Delta^*)$, i.e. have the form $f_i^{\Delta^*} : X(\Delta^*) \to X_i$.
Let us focus on sink extensions; everything can of course be formulated "dually" for source extensions.
A morphism from a sink extension $\Delta_1^*$ to a sink extension $\Delta_2^*$ is a morphism $\phi: X(\Delta_1^*) \to X(\Delta_2^*)$ such that $\phi \circ f_i^{\Delta_1^*} = f_i^{\Delta_2^*}$ for all $i \in I$.
A sink extension $\Delta_u^*$ is called a universal sink extension of characteristic $\mathfrak P$ if for each sink extension $\Delta^*$ of characteristic $\mathfrak P$ there exists a unique morphism $\phi : X(\Delta_u^*) \to X(\Delta^*)$ of sink extensions.
Coequalizers, cokernels, coproducts and colimits are examples of universal sink extensions; equalizers, kernels, products and limits are examples of universal source extensions. So perhaps we should call a source extension a cosink extension.
How do we know that a diagram $\Delta$ in a category $\mathcal C$ has a universal sink or source extension?
Certainly we have to give a proof, and this requires an explicit construction assigning to $\Delta$ an object $X_u(\Delta)$ and morphisms between $X_u(\Delta)$ and the $X_i$. Such a constructive existence proof usually will not involve any choices; it works like a (deterministic) algorithm.
In other words, we get a function that takes as input a diagram $\Delta$ and outputs a specific universal extension $\Delta^*_u$.
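As a toy illustration (entirely my own, with hypothetical names): the product of two finite sets can be built by just such a deterministic recipe, and the mediating morphism into it is computed, not chosen.

```python
def product(X: set, Y: set):
    """Deterministic construction of the product of two finite sets:
    the object is the set of ordered pairs, and the added morphisms are
    the two projections. No choice is made at any step."""
    P = {(x, y) for x in X for y in Y}
    pr1 = lambda p: p[0]  # projection to X
    pr2 = lambda p: p[1]  # projection to Y
    return P, pr1, pr2

def mediating(f, g):
    """The unique morphism h : A -> X x Y with pr1 . h = f and pr2 . h = g.
    The universal property forces the formula h(a) = (f(a), g(a))."""
    return lambda a: (f(a), g(a))

X, Y = {0, 1}, {"a", "b"}
P, pr1, pr2 = product(X, Y)

# A "competitor" extension (A, f, g) factors uniquely through P:
A = {10, 20}
f = lambda v: v // 20                  # f : A -> X
g = lambda v: "a" if v == 10 else "b"  # g : A -> Y
h = mediating(f, g)
commutes = all(pr1(h(v)) == f(v) and pr2(h(v)) == g(v) for v in A)
```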
Therefore, if we have some category $\mathcal C^*$ of diagrams in $\mathcal C$ with universal sink or source extensions of characteristic $\mathfrak P$, we get a functor
$$\mathfrak P^* : \mathcal C^* \to \mathcal C^s$$
where $\mathcal C^s$ denotes the category of sink or source diagrams in $\mathcal C$.
The point of defining a Cauchy sequence is that the property is (a) preserved under isometric maps (a sequence remains Cauchy if you extend or restrict the ambient metric space), and is thus (b) "intrinsic" to the sequence.
Note: By an “isometric map”, I mean a map $f$ from a metric space $(X, d)$ to a metric space $(Y, \rho)$ such that $\rho(f(x_1), f(x_2)) = d(x_1, x_2)$ for all $x_1, x_2 \in X$. An isometric map describes how $(X, d)$ exists as a metric space inside of $(Y, \rho)$.
Here’s a fact you can check, and I hope it answers your question: a sequence $(x_n)$ in a metric space $(X, d)$ is Cauchy if and only if there exist a metric space $(Y, \rho)$ and an isometric map $\iota : X \to Y$ such that $(\iota(x_n))$ converges in $Y$.
The forward direction is just the completion of a metric space. The backward direction follows because if $(\iota(x_n))$ is convergent, then it’s Cauchy, and so $(x_n)$ is also Cauchy.
In other words, a sequence is Cauchy iff there’s some extension of the metric space that makes the sequence convergent. A Cauchy sequence can be made convergent, but a non-Cauchy sequence can never be made convergent by extending the metric space.
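To make this concrete (a sketch of my own, using the inclusion $\Bbb Q \hookrightarrow \Bbb R$ as the isometric map $\iota$): the decimal truncations of $\sqrt 2$ are Cauchy in $\Bbb Q$ but have no limit there, yet the very same sequence converges once the ambient space is extended to $\Bbb R$.

```python
from fractions import Fraction
from math import isqrt

def x(k: int) -> Fraction:
    """The k-digit decimal truncation of sqrt(2): an exact rational number."""
    scale = 10 ** k
    return Fraction(isqrt(2 * scale * scale), scale)

# Cauchy in Q: distant terms differ by at most 10^(-min(k, l)).
gap = abs(x(20) - x(12))

# No limit in Q: every term's square falls short of 2 (no rational squares
# to 2), but the defect shrinks to zero.
defect = 2 - x(20) ** 2

# Extend the space via the inclusion iota : Q -> R (an isometric map);
# in R the same sequence converges, namely to the real number sqrt(2).
limit_in_R = float(x(20))
```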