Here's the start of the answer. Let $\displaystyle \left\lbrace ( x^{(j)}_n )_{n = 1}^{\infty} \right\rbrace_{j = 1}^{\infty}$ be a Cauchy sequence of $L^1 (\mathbb{N})$ (usually called $\ell^1$) sequences, so that
$$\forall \epsilon > 0, \exists \, J :\, j, k \geq J \text{ implies } \Vert(x^{(j)}_n) - (x^{(k)}_n)\Vert_1 = \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert < \epsilon.$$
Claim: We know what the limit sequence must be, termwise.
Proof: Fixing a specific index $m$, we clearly have that $\vert x^{(j)}_m - x^{(k)}_m \vert \leq \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert$, so that we conclude:
$$\forall \epsilon > 0, \exists \, J :\, j, k \geq J \text{ implies } \vert x^{(j)}_m - x^{(k)}_m \vert \leq \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert < \epsilon.$$
Hence, $(x_m^{(j)})_{j = 1}^{\infty}$ is a Cauchy sequence of complex numbers, hence converges. We define the limiting value as
$$x^{(\infty)}_m : = \lim_{j \to \infty} x_m^{(j)}.$$
Of course, we can repeat this for each $m \in \mathbb{N}$. Therefore, our goal is to show that the termwise/"pointwise" limit $(x_n^{(\infty)})_{n = 1}^{\infty}$ is an $L^1(\mathbb{N})$ sequence and, moreover, that $\lbrace (x^{(j)}) \rbrace_{j = 1}^{\infty}$ converges to it in norm. $\square$
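As a quick numeric sanity check of the termwise construction, here is a minimal sketch using a toy family of $\ell^1$ sequences (the family $x^{(j)}_n = (1 + 1/j)\,2^{-n}$ is an illustrative assumption, not taken from the proof): each fixed coordinate $m$ converges as $j \to \infty$.

```python
# Toy Cauchy family in l^1 (illustrative assumption):
#   x^(j)_n = (1 + 1/j) * 2^(-n),  with termwise limit x^(inf)_n = 2^(-n).
def x(j, n):
    return (1 + 1/j) * 2**(-n)

# For a fixed coordinate m, the scalar sequence (x^(j)_m)_j converges:
m = 3
values = [x(j, m) for j in (1, 10, 100, 10000)]
limit = 2**(-m)  # candidate value x^(inf)_m

# The gap |x^(j)_m - x^(inf)_m| = 2^(-m)/j shrinks like 1/j.
assert abs(values[-1] - limit) < 1e-3
```

This mirrors the argument exactly: the coordinatewise gap is dominated by the $\ell^1$ distance, so coordinates of a Cauchy family are themselves Cauchy scalars.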
The remainder is a fairly standard "split the error into digestible pieces each less than $\displaystyle \frac{\epsilon}{2}$ or $\displaystyle \frac{\epsilon}{3}$ or whatever the fraction is" argument, similar to the proof that the uniform limit of continuous functions is continuous. (EDIT: This paragraph is too simplistic; see below.) Do you require further assistance, or can you take it from here?
P.S.: This method does not really generalize that well outside sequence spaces. The general proof of completeness of $L^1(\mathbb{R})$ requires some use of the Dominated Convergence Theorem, and the fact that completeness is equivalent to "absolute convergence of series implies convergence of series."
EDIT: The middle of the proof. Our goal here is to show that the "limit object" is in the space of $\ell^1$ sequences.
Lemma 1: $M: = \sup_j \Vert (x_n^{(j)}) \Vert_1 < \infty$. (i.e., "Cauchy sequences are bounded").
Proof: Fix $\epsilon = 1$. Then by Cauchy-ness, $$\exists J: j, k \geq J \, \text{ implies } \, \Vert (x_n^{(j)}) - (x_n^{(k)}) \Vert_1 < 1.$$
Fixing $j = J$, we have that for all $k \geq J$,
$$ \begin{align} \Vert (x^{(k)}_n) \Vert_1 & \leq \Vert (x_n^{(k)}) - (x_n^{(J)}) \Vert_1 + \Vert (x_n^{(J)}) \Vert_1 \\
& \leq 1 + \Vert (x_n^{(J)}) \Vert_1 \end{align}$$
Thus, it is clear that the larger of $\displaystyle \max_{1 \leq k < J} \Vert (x_n^{(k)}) \Vert_1$ and $1 + \Vert (x_n^{(J)}) \Vert_1 $ is an upper bound on all $ \Vert (x_n^{(k)}) \Vert_1.$ Hence, the supremum is finite. $\square$
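Lemma 1 can be illustrated numerically with the same kind of toy family (again an illustrative assumption, not from the proof): for $x^{(j)}_n = (1 + 1/j)\,2^{-n}$ the norms are $\Vert (x_n^{(j)}) \Vert_1 = 1 + 1/j$, so the supremum is finite and equals $2$, attained at $j = 1$.

```python
# Toy Cauchy family (illustrative): x^(j)_n = (1 + 1/j) * 2^(-n).
# ||x^(j)||_1 = (1 + 1/j) * sum_n 2^(-n) = 1 + 1/j, so sup_j ||x^(j)||_1 = 2.
def norm1(j, terms=60):
    # Truncated l^1 norm; 60 terms of a geometric series is plenty accurate.
    return sum((1 + 1/j) * 2**(-n) for n in range(1, terms + 1))

norms = [norm1(j) for j in range(1, 200)]
M = max(norms)
assert M <= 2.0 + 1e-9              # the supremum is finite (here, 2)
assert all(v <= M for v in norms)   # M bounds every norm in the family
```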
Our next trick is to show that all the partial sums are uniformly bounded.
Claim: For all $N \in \mathbb{N}$,
$$S_N := \sum_{n = 1}^N \vert x_n^{(\infty)} \vert \leq M + 1.$$
Proof: Fix $N \in \mathbb{N}$. By the Triangle Inequality, for any choice of $j$,
$$ \sum_{n = 1}^N \vert x_n^{(\infty)}\vert \leq \sum_{n = 1}^N \vert x_n^{(\infty)} - x_n^{(j)} \vert + \sum_{n = 1}^N \vert x_n^{(j)}\vert.$$
We need to choose $j$ so that both sums are bounded.
The first sum: Note that for all $n \in \lbrace 1, 2, 3, \dotsc, N \rbrace$, we know that $\displaystyle x^{(\infty)}_n = \lim_{j \to \infty} x_n^{(j)}$. Hence, fixing $\displaystyle \epsilon = \frac{1}{N}$,
$$ \exists J_n : j \geq J_n \text{ implies that } \vert x_n^{(\infty)} - x_n^{(j)} \vert < \frac{1}{N}.$$
Taking $\displaystyle J^* := \max_{1 \leq n \leq N} J_n$, we have that
$$
j \geq J^* \text{ implies } \sum_{n = 1}^N \vert x_n^{(\infty)} - x_n^{(j)} \vert < \sum_{n = 1}^N \frac{1}{N} = 1.
$$
We will, for convenience, take $j = J^*$; then the first sum is bounded by $1$.
The second sum: Clearly
$$\sum_{n = 1}^N \vert x_n^{(J^*)} \vert \leq \sum_{n = 1}^{\infty} \vert x_n^{(J^*)} \vert = \Vert (x_n^{(J^*)}) \Vert_1.$$ Yet this sum, by Lemma 1, is bounded by $M$.
Combining the above results, we get that $S_N \leq 1 + M$, as required. $\square$
Claim: $(x_n^{(\infty)})$ is in $L^1(\mathbb{N})$.
Proof Sketch: Clearly $S_N$ is increasing in $N$, yet bounded above independently of $N$. As soon as you see that $S_N$ converges, then by definition of infinite sums of real numbers, $\sum_{n = 1}^{\infty} \vert x_n^{(\infty)} \vert$ exists and equals $\displaystyle \lim_{N \to \infty} S_N$. So $(x_n^{(\infty)})$ will be in $L^1(\mathbb{N})$.
This just boils down to showing that $\displaystyle \lim_{N \to \infty} S_N = \sup_N S_N ( < \infty)$, which is a basic exercise in limits of real numbers. I will not go into details here unless requested. $\square$
Therefore, the desired "limit object" is in the space!
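The monotone-and-bounded argument above can be sketched numerically. For the toy family $x^{(j)}_n = (1 + 1/j)\,2^{-n}$ (an illustrative assumption), the termwise limit is $x^{(\infty)}_n = 2^{-n}$, so the partial sums $S_N$ are increasing, bounded by $M + 1 = 3$, and converge to $1$.

```python
# Partial sums S_N = sum_{n=1}^{N} |x^(inf)_n| for the toy limit 2^(-n).
# They are increasing and bounded above, hence convergent (here to 1).
S = []
total = 0.0
for n in range(1, 50):
    total += 2**(-n)
    S.append(total)

assert all(S[i] <= S[i + 1] for i in range(len(S) - 1))  # increasing in N
assert all(s <= 3 for s in S)                            # bounded by M + 1
assert abs(S[-1] - 1.0) < 1e-9                           # sup = limit = 1
```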
Now we must check that $\lbrace (x_n^{(j)}) \rbrace_{j = 1}^{\infty}$ actually converges to it. To do so, we need another lemma, showing that as $j$ gets large, the "tails" of the sums are uniformly small.
Lemma 2: Fix $\epsilon > 0$, and fix $J = J(\epsilon)$ such that
$$j, k \geq J(\epsilon) \text{ implies } \Vert(x^{(j)}_n) - (x^{(k)}_n)\Vert_1 = \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert < \frac{\epsilon}{2}.$$
Then there exists $N = N(\epsilon, J(\epsilon))$ such that for all $j \geq J(\epsilon)$,
\begin{equation}\sum_{n = N + 1}^{\infty} \vert x^{(j)}_n \vert < \epsilon. \tag{*}\end{equation}
Proof: We argue by contradiction. Suppose that the above conclusion fails. Then for all $N$, there exists $k \geq J(\epsilon)$ such that $\sum_{n = N + 1}^{\infty} \vert x^{(k)}_n \vert \geq \epsilon$.
Now, for $j = J(\epsilon)$, the sequence $(x^{(J(\epsilon))}_n)$ is in $L^1(\mathbb{N})$, so there exists $N_1$ such that $$\sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n \vert < \frac{\epsilon}{2}.$$
Yet by the failure of the above condition, letting $N = N_1$, there exists $k \geq J(\epsilon)$ such that
$$\sum_{n = N_1 + 1}^{\infty} \vert x^{(k)}_n \vert \geq \epsilon.$$
Therefore, we see that
$$\begin{align}
\sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert &\geq \sum_{n = N_1 + 1}^{\infty} \left( \vert x^{(k)}_n \vert - \vert x^{(J(\epsilon))}_n \vert \right) \\
& = \sum_{n = N_1 + 1}^{\infty} \vert x^{(k)}_n \vert - \sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n \vert\\
& > \epsilon - \frac{\epsilon}{2} = \frac{\epsilon}{2}.
\end{align}$$
Yet
$$\sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert \leq \sum_{n = 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert \, = \Vert (x^{(J(\epsilon))}_n) - (x^{(k)}_n) \Vert_1,$$
and by definition of $J(\epsilon)$, since $j = J(\epsilon)$ and $k$ are at least $J(\epsilon)$, $$\Vert (x^{(J(\epsilon))}_n) - (x^{(k)}_n) \Vert_1 < \frac{\epsilon}{2}.$$
So in short,
$$\frac{\epsilon}{2} < \sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert < \frac{\epsilon}{2}.$$
Contradiction. $\square$
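Lemma 2 can also be checked on the toy family $x^{(j)}_n = (1 + 1/j)\,2^{-n}$ (an illustrative assumption; for this particular family the tails happen to be uniformly small over all $j$, not just $j \geq J$):

```python
# Tail sums  sum_{n > N} |x^(j)_n|  for the toy family are uniformly small:
# they equal (1 + 1/j) * 2^(-N) <= 2 * 2^(-N), independent of j.
def tail(j, N, terms=80):
    return sum((1 + 1/j) * 2**(-n) for n in range(N + 1, N + terms))

eps = 0.01
N = 9  # 2 * 2^(-9) is about 0.004 < eps, uniformly over all j
assert all(tail(j, N) < eps for j in range(1, 500))
```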
[Recommendation: Now you are in a position to do an $\displaystyle \frac{\epsilon}{2}$-style proof. Try doing it yourself before finishing the answer.]
Claim: $(x_n^{(j)})$ converges to $(x_n^{(\infty)})$.
Proof: Fix $\epsilon > 0$. By Lemma 2, with $\displaystyle \frac{\epsilon}{4}$ in the place of $\epsilon$, there exists $N_1$ such that for all $j \geq J(\epsilon /4)$,
$$\sum_{n = N_1 + 1}^{\infty} \vert x^{(j)}_n \vert < \frac{\epsilon}{4}.$$
Since $(x_n^{(\infty)})$ is in $L^1$, there exists $N_2$ such that $$\sum_{n = N_2 + 1}^{\infty} \vert x^{(\infty)}_n \vert < \frac{\epsilon}{4}.$$
Therefore, for all $j \geq J(\epsilon/4)$, and for $N = \max \lbrace N_1, N_2 \rbrace$
$$\begin{align}
\sum_{n = N + 1}^{\infty} \vert x^{(j)}_n - x^{(\infty)}_n \vert &\leq \sum_{n = N + 1}^{\infty} \left( \vert x^{(j)}_n \vert + \vert x^{(\infty)}_n \vert \right) \\
& = \sum_{n = N + 1}^{\infty} \vert x^{(j)}_n \vert + \sum_{n = N + 1}^{\infty} \vert x^{(\infty)}_n \vert\\
& \leq \sum_{n = N_1 + 1}^{\infty} \vert x^{(j)}_n \vert + \sum_{n = N_2 + 1}^{\infty} \vert x^{(\infty)}_n \vert\\
& < \frac{\epsilon}{4} + \frac{\epsilon}{4} = \frac{\epsilon}{2}
\end{align}$$
Moreover, since $x_n^{(j)} \to x_n^{(\infty)}$ for each fixed $n$ as $j \to \infty$, by similar logic to the "The first sum" section above, there exists $J_2$ such that $j \geq J_2$ implies that
$$\sum_{n = 1}^N \vert x^{(j)}_n - x^{(\infty)}_n \vert < \frac{\epsilon}{2}.$$
Therefore, for $j \geq J^* = \max \lbrace J(\epsilon/4), J_2 \rbrace$,
$$\begin{align}
\Vert (x^{(j)}) - (x^{(\infty)}) \Vert_1 & = \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(\infty)}_n \vert \\
& = \sum_{n = 1}^{N} \vert x^{(j)}_n - x^{(\infty)}_n \vert + \sum_{n = N + 1}^{\infty} \vert x^{(j)}_n - x^{(\infty)}_n \vert \\
& < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon.
\end{align}$$
This works for all $\epsilon > 0$, so we have convergence. $\square$
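To close the loop numerically: for the toy family $x^{(j)}_n = (1 + 1/j)\,2^{-n}$ (an illustrative assumption used throughout these sketches), the $\ell^1$ distance to the termwise limit is exactly $1/j$, which can be driven below any $\epsilon$:

```python
# ||x^(j) - x^(inf)||_1 = sum_n |(1 + 1/j) - 1| * 2^(-n) = 1/j  ->  0.
def dist(j, terms=60):
    return sum(abs((1 + 1/j) * 2**(-n) - 2**(-n)) for n in range(1, terms + 1))

# For each epsilon, any j beyond 1/epsilon witnesses norm convergence.
for eps in (0.1, 0.01, 0.001):
    J = int(1 / eps) + 1
    assert dist(J) < eps
```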
Best Answer
You are definitely on the right track and nearing a complete proof! I'm going to write $\mathcal{A}_p$ for what you're denoting by $\sum_p A_n$ (since I find the placement of the $p$ rather unsettling in your notation ;) ). If you're comfortable with the idea of a direct product of vector spaces, you can think of $\mathcal{A}_p$ (before assigning it a topology) as a vector subspace of $\prod_{n=1}^\infty A_n$: specifically, the subspace of all sequences (think "infinite tuples", if you prefer) $x = (a_k)$ such that $a_k \in A_k$ for all $k \in \mathbb{N}$ and $\|x\|_p := \left( \sum_{n=1}^\infty \|a_n\|_n^p \right)^{1/p} < \infty$. Here $\|\cdot\|_n$ denotes a norm defining the Banach topology on $A_n$ for each $n$. One checks (as you should, if you haven't already) that $\|\cdot\|_p$ defines a norm on $\mathcal{A}_p$, and then, as you've observed, it is enough to show that $\mathcal{A}_p$ is sequentially complete.
Your argument for sequential completeness is sound, although the notation might be cleaner if you name the entries directly. Indeed, you need to consider Cauchy sequences $(x_n)$ whose entries are themselves in $\mathcal{A}_p$. Hence you're correct that each $x_n$ should take the form $x_n = (x_{n,k})_{k=1}^\infty$ for some choice of $x_{n,k} \in A_k$ ($k \in \mathbb{N}$). That $(x_n)$ is Cauchy in $\mathcal{A}_p$ means that for all $\epsilon > 0$ there exists $N_\epsilon \in \mathbb{N}$ such that if $m,n \in \mathbb{N}$ with $m,n \geq N_\epsilon$, then $\|x_m - x_n\|_p < \epsilon$. Expanding the definition as you have shows that
$$ \sum_{k=1}^\infty \|x_{m,k} - x_{n,k}\|_k^p = \|x_m - x_n\|_p^p < \epsilon^p, $$
whence, as the leftmost series contains only non-negative summands, it must be that $\|x_{m,k}-x_{n,k}\|_k^p < \epsilon^p$ for all $k$. It follows that for all $k \in \mathbb{N}$, for all $\epsilon > 0$ there exists $N_\epsilon$ such that $\|x_{m,k} - x_{n,k}\|_k < \epsilon$ whenever $m,n \geq N_\epsilon$, whence, by definition, $(x_{n,k})_{n=1}^\infty$ is a Cauchy sequence in $A_k$.
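The domination step can be sketched with concrete numbers (the values of $p$, $\epsilon$, and the coordinate distances below are illustrative assumptions, not from the question): if the series of $p$-th powers is below $\epsilon^p$, each individual term is too.

```python
# If  sum_k ||x_mk - x_nk||_k^p < eps^p  (a series of non-negative terms),
# then every single term satisfies ||x_mk - x_nk||_k^p < eps^p, i.e.
# ||x_mk - x_nk||_k < eps. Illustrative numbers only.
p = 2
eps = 0.1
d = [0.05, 0.02, 0.01, 0.005]          # hypothetical coordinate distances
assert sum(t**p for t in d) < eps**p   # the global Cauchy estimate
assert all(t < eps for t in d)         # hence each coordinate is Cauchy
```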
Ok, so far I've just written out what you've already shown, but with a few points of clarification and some cleaner language. Going forward, I'll leave you with hints/a roadmap of how to reach the desired conclusion. So, using what we have done so far, how can we conclude from the above that $(x_n)$ has a limit $L$ in $\mathcal{A}_p$? Well, $L$ will have to be of the form $(L_k)_{k=1}^\infty$ with $L_k \in A_k$ for all $k$, and we have a good candidate for what each $L_k$ should be based on the above arguments (what is it?). Once you know that, you need to answer two more questions before you know $\mathcal{A}_p$ is complete:

1. Is $L$ actually an element of $\mathcal{A}_p$, i.e., is $\|L\|_p$ finite?

2. Does $(x_n)$ actually converge to $L$ with respect to $\|\cdot\|_p$?
What you need to show to answer both of these questions is a direct computation using the norm $\|\cdot\|_p$ and familiar arguments from analysis. I can be of further assistance if needed. Good luck!
Edit in Response to Your Comment: You've now shown that $L = \lim_{n\to\infty}x_n$ should be of the form $L = (L_k)$ where for each $k \in \mathbb N$ the $k$-th entry $L_k$ is the limit of the Cauchy sequence $(x_{n,k})_{n=1}^\infty$ in $A_k$. One way of proceeding, which essentially knocks out both of the steps (1. and 2. above) from my original response, is the following.
First, we may exploit the relationship between $L$ and the $x_n$: While we don't know right off the bat that $\|L\|_p$ is finite, we should expect that (1) $\|L\|_p$ should be intimately related to $\|x_n\|_p$ for large $n$ and (2) $\|x_n\|_p$ is finite for all $n$. So we can try to force $x_n$ to appear in the expression $\|L\|_p$ (or, for convenience, $\|L\|_p^p$) in the usual way: adding zero. Since $\|\cdot\|_p$ is a norm, it satisfies the triangle inequality, so $\|L\|_p \leq \|L-x_n\|_p + \|x_n\|_p$. Thus, since $\|x_n\|_p$ is finite for every value of $n \in \mathbb{N}$, the norm $\|L\|_p$ is finite if we can show that $\|L-x_n\|_p$ is finite for a sufficiently large choice of $n$.
Second, we may reduce to proving $\|L-x_n\|_p \to 0$ as $n \to \infty$: Note that $\|L - x_n\|_p$ being finite for a sufficiently large choice of $n$ follows immediately if $\|L-x_n\|_p \to 0$ as $n \to \infty$. Hence the latter assertion actually implies both 1. and 2. (really, it confirms that $L \in \mathcal{A}_p$ and then, knowing that, also verifies it is the limit of $(x_n)$).
Third, and this is the trick, to show that this limit is $0$, we need to bear in mind what might be best summarized as follows: the Cauchy-ness of $(x_{n,k})_{n=1}^\infty$ in $A_k$ is independent of the Cauchy-ness of $(x_{n,l})_{n=1}^\infty$ in $A_l$ for $k \neq l$. In other hand-wavy words, the $\epsilon$ and $N_\epsilon$ occurring in the definition of a Cauchy sequence (and the limit of a sequence) can be chosen separately for each $k$. Thus, for instance, for all $k \in \mathbb{N}$, for all $\epsilon_k > 0$, there exists an $N_{k,\epsilon_k} \in \mathbb{N}$ such that for all $m,n \geq N_{k,\epsilon_k}$ we have $\|x_{m,k}-x_{n,k}\|_k < \epsilon_k$. In fact, we may assume for all $k$, for all $\epsilon_k > 0$, and for all $m,n \geq N_{k,\epsilon_k}$ that both $\|x_{m,k}-x_{n,k}\|_k < \epsilon_k$ and (since $x_{n,k} \to L_k$) $\|L_k-x_{n,k}\|_k < \epsilon_k$.
Finally, employ the triangle inequality for $\|\cdot\|_k$ for each $k$, separately: Why all the added notation in the third bullet? Well, note that for any choice of $n$ and for any choice of $m_k \in \mathbb{N}$ (for each $k$) that
$$ \|L-x_n\|_p^p = \sum_{k=1}^\infty \|L_k-x_{n,k}\|_k^p \leq \sum_{k=1}^\infty \left(\|L_k - x_{m_k,k}\|_k+\|x_{m_k,k}-x_{n,k}\|_k\right)^p. $$
Again this follows by the triangle inequality (but now for each $\|\cdot\|_k$). So the idea is that if $n$ and $m_k$ can be chosen such that $n,m_k \geq N_{k,\epsilon_k}$ for all $k \in \mathbb{N}$, then this shows that
$$ \|L-x_n\|_p^p \leq \sum_{k=1}^\infty (2\epsilon_k)^p. $$
Hence, for a given $\epsilon > 0$, if we select $\epsilon_k$ for each $k \in \mathbb{N}$ in such a way that $\sum_{k=1}^\infty (2\epsilon_k)^p = (\epsilon/2)^p$, then we have shown that $\|L-x_n\|_p \leq \epsilon/2 < \epsilon$ for all $n \geq \sup_{k \in \mathbb{N}}N_{k,\epsilon_k}$, completing the proof.
So I wound up writing more of the proof than I intended, but I think it will help show you how you can reason through an argument like this. There are still a couple small but important points that I've left out that need to be addressed. Specifically,
For each $\epsilon > 0$, what is a good choice of $(\epsilon_k)_{k=1}^\infty$ satisfying $\sum_{k=1}^\infty (2\epsilon_k)^p = (\epsilon/2)^p$? There is a natural way to do this using geometric series. (Hint: Start with $\epsilon = 1$.)
Given these $\epsilon_k$, why is it possible to choose $N_{k,\epsilon_k}$ for each $k$ in such a way that $\sup_{k \in \mathbb{N}}N_{k,\epsilon_k}$ is finite? (Hint: Look carefully at and elaborate on the argument showing that $(x_n)$ Cauchy in $\mathcal{A}_p$ implies $(x_{n,k})_{n=1}^\infty$ Cauchy in $A_k$ for each $k$.)
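The geometric-series hint can be checked numerically. Here is a minimal sketch of the idea of splitting a single error budget across countably many coordinates; the budget value $T$ and the halving factor $2^{-k}$ are illustrative choices, not prescribed by the answer:

```python
# To spread a total error budget T over countably many coordinates, take
# per-coordinate budgets proportional to 2^(-k); since sum_k 2^(-k) = 1,
# the pieces sum (up to truncation) exactly to T.
T = 0.5                                   # e.g. a target of eps/2 with eps = 1
budgets = [T * 2**(-k) for k in range(1, 60)]
assert abs(sum(budgets) - T) < 1e-12      # geometric pieces recover the budget
```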
Hope this clarifies things!