$\newcommand{\bra}[1]{\langle{#1}\rvert}
\newcommand{\ket}[1]{\lvert{#1}\rangle}
\newcommand{\braket}[2]{\langle{#1}|{#2}\rangle}
\newcommand{\N}{\mathbb{N}}
\DeclareMathOperator{\tr}{tr}$For the sake of brevity I'll write the basis as $\{\ket{i}\}_{i\in\N}$.
If $p=\sum_{i\in\N}a_i\ket{i}\bra{i}$ and $s=\sum_{i\in\N}b_i\ket{i}\bra{i}$
then their product is (you have to use two different indices!)
\begin{equation}
ps=\sum_{i\in\N}a_i\ket{i}\bra{i}\sum_{j\in\N}b_j\ket{j}\bra{j}=
\sum_{i\in\N}\sum_{j\in\N}a_ib_j\ket{i}\braket{i}{j}\bra{j}=
\sum_{i\in\N}\sum_{j\in\N}a_ib_j\delta_{ij}\ket{i}\bra{j}=
\sum_{i\in\N}a_ib_i\ket{i}\bra{i}
\end{equation}
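As a quick numerical sanity check of the step above (hypothetical eigenvalues in a 3-dimensional space, using NumPy), the product of two operators diagonal in the same basis is again diagonal, with entrywise products on the diagonal:

```python
import numpy as np

# Hypothetical example: p = sum_i a_i |i><i| and s = sum_i b_i |i><i|
# represented as diagonal matrices in the common eigenbasis.
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.7, 0.2, 0.1])
p = np.diag(a)
s = np.diag(b)

# ps is again diagonal, with entries a_i * b_i, as derived above.
assert np.allclose(p @ s, np.diag(a * b))
```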
so its trace is
\begin{equation}
\tr(ps)=\sum_{k\in\N}\bra{k}ps\ket{k}=
\sum_{k\in\N}\sum_{i\in\N}a_ib_i\braket{k}{i}\braket{i}{k}=
\sum_{k\in\N}\sum_{i\in\N}a_ib_i\delta_{ki}\delta_{ik}=
\sum_{k\in\N}a_kb_k
\end{equation}
Then, since the $a_i$ and $b_i$ are eigenvalues of density operators, you have $a_i\ge 0$ and $0\le b_i\le 1$, hence $a_ib_i\le a_i$ for all $i\in\N$; therefore
\begin{equation}
\tr(ps)=\sum_{k\in\N}a_kb_k\le\sum_{k\in\N}a_k=\tr(p)=1.
\end{equation}
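The trace bound can be checked numerically as well. A minimal sketch, assuming random nonnegative eigenvalues normalized to trace $1$ (hypothetical values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random eigenvalues of two density operators in a common eigenbasis:
# nonnegative and summing to 1, so each lies in [0, 1].
a = rng.random(5); a /= a.sum()   # tr(p) = 1
b = rng.random(5); b /= b.sum()   # tr(s) = 1

p, s = np.diag(a), np.diag(b)

# tr(ps) = sum_k a_k b_k, and it is bounded by tr(p) = 1.
assert np.isclose(np.trace(p @ s), np.sum(a * b))
assert np.trace(p @ s) <= 1.0
```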
The numbers in notations like $|n\rangle$ are the analogues of indices in matrix notation. That is, $|0\rangle=e_0$, $|1\rangle=e_1$, etc., where $e_n$ is the vector which has a $1$ in the $n$th position and $0$ in the other entries. Unfortunately, this notation is unspecific about the dimension of the base space. For qubits in quantum computers, the dimension is $2$, so we only have $|0\rangle=e_0=(1,0)$ and $|1\rangle=e_1=(0,1)$. It is also common to have a countable infinity of basis vectors, so we get $|n\rangle$ for each $n\in\Bbb N$. In quantum mechanics one also deals with this notation for larger dimensional spaces; for example we may have $|x\rangle$ for each $x\in\Bbb R^3$ (the position basis), which is a vector space of uncountable dimension $|\Bbb R^3|=2^{\aleph_0}$.
In any case, these vectors are usually enumerating a basis of some kind, and the details beyond that depend on the context.
The notation $\langle 0|0\rangle$ is written in linear algebra notation as $e_0^Te_0$, which is a $1\times 1$ matrix whose value can be identified with the dot product $e_0\cdot e_0$. Provided that the vector is normalized, this will always be $1$. So a general answer is $\langle m|n\rangle=0$ if $m\ne n$, and $\langle n|n\rangle=1$, which expresses that the vectors $(|n\rangle)_{n\in\Bbb N}$ are an orthonormal basis for the space.
For some general rules, then, we have $|n\rangle=e_n$ and $\langle n|=e_n^T$ (or $e_n^\dagger$ in complex vector spaces), where we understand the first as a $d\times 1$ matrix so that the second is $1\times d$, where $d$ is the dimension of the space. Then the inner product is $\langle m|n\rangle=e_m^Te_n=e_m\cdot e_n$, and the outer product is $|n\rangle\langle m|=e_ne_m^T$, which is a $d\times d$ matrix with a single $1$ at the index $(n,m)$. Note that these notations are also used for arbitrary vectors; for example we might write $|\psi\rangle=v$ for some vector $v$, and then $\langle\psi|=v^T$, $\langle\psi|\psi\rangle=\|v\|^2$, and $|\psi\rangle\langle\psi|$ is the projection matrix in the direction of $v$.
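These rules translate directly into matrix computations. A small NumPy sketch (hypothetical dimension $d=4$) illustrating the inner product, the outer product, and the projection $|\psi\rangle\langle\psi|$:

```python
import numpy as np

d = 4
e = np.eye(d)  # e[n] is the basis vector |n>

# Inner product: <m|n> = delta_{mn}.
assert e[1] @ e[1] == 1 and e[1] @ e[2] == 0

# Outer product: |n><m| = e_n e_m^T has a single 1 at index (n, m).
outer = np.outer(e[1], e[3])
assert outer[1, 3] == 1 and outer.sum() == 1

# For a general |psi> = v, |psi><psi| projects onto the direction of v.
v = np.array([3.0, 4.0, 0.0, 0.0]) / 5.0   # normalized vector
P = np.outer(v, v)
assert np.allclose(P @ P, P)               # projections are idempotent
assert np.allclose(P @ v, v)               # v itself is fixed by P
```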
If $\{|\epsilon_i\rangle\}$ is an orthonormal basis for the Hilbert space $\mathcal{H}$, then for arbitrary $|v\rangle \in \mathcal{H}$, we can write:
$$|v\rangle = \sum_i v_i |\epsilon_i\rangle \text{, where } v_i = \langle \epsilon_i| v\rangle.$$ The fact that $v_i = \langle \epsilon_i| v\rangle$ follows from orthonormality of the basis: $$ \langle\epsilon_i |v\rangle = \sum_j v_j \langle \epsilon_i|\epsilon_j\rangle = \sum_j v_j \delta_{ij} = v_i$$
Now, from this, we can prove the following useful fact: $$ 1 = \sum_i |\epsilon_i\rangle\langle\epsilon_i|.$$
The proof is as follows. Since $\{|\epsilon_i\rangle\}$ is an orthonormal basis, for arbitrary $|v\rangle \in \mathcal{H}$, we can write
$$ |v\rangle = \sum_i \langle \epsilon_i|v\rangle |\epsilon_i\rangle = \sum_i |\epsilon_i\rangle\langle \epsilon_i |v\rangle = \left(\sum_i |\epsilon_i\rangle\langle \epsilon_i |\right)|v\rangle,$$ meaning that $\sum_i |\epsilon_i\rangle\langle \epsilon_i|$ sends all $|v\rangle$ to $|v\rangle$.
Thus, in your expression, we are simply inserting $1$ into the inner product to write it as a sum over the basis elements: $$\langle v|w\rangle = \langle v|1|w\rangle = \langle v|\left(\sum_i|\epsilon_i\rangle\langle\epsilon_i| \right)|w\rangle = \sum_i \langle v|\epsilon_i\rangle\langle\epsilon_i|w\rangle = \sum_i v_i^* w_i.$$
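The completeness relation and the resulting formula for the inner product can be verified numerically. A minimal sketch with the standard basis and hypothetical complex vectors:

```python
import numpy as np

d = 3
basis = np.eye(d)  # rows form an orthonormal basis {|eps_i>}

# Resolution of the identity: sum_i |eps_i><eps_i| = 1.
identity = sum(np.outer(basis[i], basis[i]) for i in range(d))
assert np.allclose(identity, np.eye(d))

# Inserting it into an inner product gives <v|w> = sum_i v_i^* w_i.
v = np.array([1 + 1j, 0.5, -2j])
w = np.array([2.0, 1j, 1.0])
lhs = np.vdot(v, w)                        # conjugates its first argument
rhs = sum(np.conj(v[i]) * w[i] for i in range(d))
assert np.isclose(lhs, rhs)
```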