Let's assume everything has finite dimension here.
1) I don't know exactly what property $(f\otimes g)(...) = f(...)g(...)$ refers to in the second definition of a tensor. For example, if $f$ is a $(0,2)$ tensor and $g$ is a $(0,2)$ tensor, the product would be a $(0,4)$ tensor, so $(f\otimes g)(v_1,v_2,v_3,v_4) = f(v_1,v_2)g(v_3,v_4)$. To what kind of tensor in definition 2 would this be isomorphic?
A tensor product of two vector spaces $V$ and $W$ is a pair $(\mathsf{T} ,t)$, where $\mathsf{T} $ is a vector space and $t\colon V \times W \to \mathsf{T} $ is bilinear, such that if $\{{\bf v}_i\}$ and $\{{\bf w}_j\}$ are bases for $V$ and $W$, then $\{t({\bf v}_i,{\bf w}_j)\}$ spans $\mathsf{T} $, and if $b\colon V \times W \to Z$ is any bilinear map (into an arbitrary vector space $Z$), then there is a unique linear map $\overline{b}\colon \mathsf{T} \to Z$ such that $\overline{b}\circ t = b$. This means that bilinear maps $b$ factor through $\mathsf{T}$, with all the needed information packaged in a single linear map $\overline{b}$. One then proves that all tensor products of $V$ and $W$ are isomorphic, and so we put the usual notations $\mathsf{T} \equiv V \otimes W$, $t \equiv \otimes$, and write ${\bf v} \otimes {\bf w}$ for $t({\bf v},{\bf w})$.
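Concretely, in coordinates every bilinear map $b\colon \Bbb R^n \times \Bbb R^m \to \Bbb R$ has the form $b({\bf v},{\bf w}) = {\bf v}^{\mathsf T} M {\bf w}$ for some matrix $M$, and the induced linear map $\overline{b}$ on $\Bbb R^n\otimes\Bbb R^m \cong \Bbb R^{nm}$ just pairs against $M$. Here is a minimal numerical sketch of the universal property (hypothetical $M$, using NumPy):

```python
import numpy as np

n, m = 2, 3
rng = np.random.default_rng(4)
M = rng.standard_normal((n, m))  # b(v, w) = v^T M w, a bilinear map
v = rng.standard_normal(n)
w = rng.standard_normal(m)

# t(v, w) = v ⊗ w, realized as the outer product flattened to R^{nm}
t_vw = np.outer(v, w).ravel()

# The induced linear map on the tensor product: pair entrywise with M
b_bar = M.ravel()

# Universal property: b_bar(t(v, w)) = b(v, w)
assert np.isclose(b_bar @ t_vw, v @ M @ w)
```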
An explicit construction is to take the free vector space with basis $V \times W$, and take its quotient by the subspace spanned by the elements of the form \begin{align} &({\bf v}_1+{\bf v}_2,{\bf w}) - ({\bf v}_1,{\bf w})-({\bf v}_2,{\bf w}), \\ & ({\bf v},{\bf w}_1+{\bf w}_2)-({\bf v},{\bf w}_1)-({\bf v},{\bf w}_2), \\ & (\lambda{\bf v},{\bf w}) - \lambda({\bf v},{\bf w}), \\ & ({\bf v},\lambda{\bf w}) - \lambda({\bf v},{\bf w}).\end{align}
We then denote the class of $({\bf v},{\bf w})$ by ${\bf v} \otimes {\bf w}$.
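In coordinates, ${\bf v}\otimes{\bf w}$ behaves exactly like the outer product of coordinate vectors, and the outer product satisfies precisely the relations we quotiented by. A quick numerical sketch (hypothetical vectors, NumPy):

```python
import numpy as np

# Coordinate model of v ⊗ w: the outer product of coordinate vectors
v1, v2 = np.array([1.0, 2.0]), np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
lam = 2.5

# (v1 + v2) ⊗ w = v1 ⊗ w + v2 ⊗ w  (first quotient relation)
assert np.allclose(np.outer(v1 + v2, w), np.outer(v1, w) + np.outer(v2, w))

# (λ v) ⊗ w = λ (v ⊗ w) = v ⊗ (λ w)  (scalar relations)
assert np.allclose(np.outer(lam * v1, w), lam * np.outer(v1, w))
assert np.allclose(np.outer(v1, lam * w), lam * np.outer(v1, w))
```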
One generalizes all of this by considering more spaces, writing "multilinear maps" instead of "bilinear maps", and so on. The space $${\frak T}^{(r,s)}(V) = \{ f\colon (V^\ast)^r \times V^s \to \Bbb R \mid f \text{ is multilinear} \}$$is isomorphic to $V^{\otimes r}\otimes (V^\ast)^{\otimes s}$, and that isomorphism does not depend on a choice of basis (so it is better than your average run-of-the-mill isomorphism). Well, to be more honest, we use a basis to define the isomorphism, but then we check that it would be the same if we started with another basis. We say that $T \in {\frak T}^{(r,s)}(V)$ is an $r$-times contravariant and $s$-times covariant tensor. We of course have an operation $$\otimes \colon {\frak T}^{(r,s)}(V) \times {\frak T}^{(r',s')}(V) \to {\frak T}^{(r+r',s+s')}(V).$$
Now that we hopefully understand a little better what a tensor product is, we can simply note that ${\frak T}^{(0,4)}(V) \cong (V^\ast)^{\otimes 4}$, and if $\{{\bf v}_i\}$ is a basis for $V$ and $\{{\bf v}^i\}$ is the dual basis, then $$f \otimes g = \sum_{i,j,k,\ell} (f\otimes g)_{ijk\ell}\, {\bf v}^i \otimes {\bf v}^j\otimes {\bf v}^k \otimes {\bf v}^\ell,$$where $(f\otimes g)_{ijk\ell} = (f\otimes g)({\bf v}_i,{\bf v}_j,{\bf v}_k,{\bf v}_\ell) = f({\bf v}_i,{\bf v}_j)\,g({\bf v}_k,{\bf v}_\ell)$ and ${\bf v}^i \otimes {\bf v}^j\otimes {\bf v}^k \otimes {\bf v}^\ell \in {\frak T}^{(0,4)}(V)$. It corresponds to that same expression seen as a linear combination of the classes of $({\bf v}^i,{\bf v}^j,{\bf v}^k,{\bf v}^\ell)$ in the quotient construction above - that expression is an element of $(V^\ast)^{\otimes 4}$. Maybe I shouldn't have been lazy and should have used another notation for the classes until now - I'll gladly explain it all again if you have trouble following.
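To make question 1) concrete: representing the $(0,2)$ tensors $f$ and $g$ by their component matrices, the components of $f\otimes g$ are just the products $f_{ij}g_{k\ell}$, and evaluating the $(0,4)$ tensor on four vectors recovers $f(v_1,v_2)g(v_3,v_4)$. A small sketch (hypothetical component matrices, NumPy):

```python
import numpy as np

n = 2
rng = np.random.default_rng(0)
f = rng.standard_normal((n, n))  # components f_ij of a (0,2) tensor
g = rng.standard_normal((n, n))  # components g_kl of a (0,2) tensor

# Components of the (0,4) tensor f ⊗ g: (f⊗g)_ijkl = f_ij * g_kl
fg = np.einsum('ij,kl->ijkl', f, g)

# Evaluating on four vectors agrees with f(v1,v2) * g(v3,v4)
v1, v2, v3, v4 = rng.standard_normal((4, n))
lhs = np.einsum('ijkl,i,j,k,l->', fg, v1, v2, v3, v4)
rhs = (v1 @ f @ v2) * (v3 @ g @ v4)
assert np.isclose(lhs, rhs)
```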
2) I am confused about what a tensor component is. As I understand it, tensor components are the scalars that appear in a linear combination of basis tensors. In definition 1, I see books defining the tensor components as $T^{a_1,...,a_k}_{b_1,...,b_k} = T(a_1,...,a_k,b_1,...,b_k)$, where $\{a_i\}$ and $\{b_i\}$ are bases of the vector space and the covector space. For definition 2, I see tensor components written as:
\begin{align}
\sum_i\sum_j A_{ij}\, a_i\otimes b_j.
\end{align}
How do these components relate?
Components do depend on a choice of basis. The choice of notation used in your textbook was bad: we only keep indices on $T$, not on the vectors. I mean, one would write $$T^{i_1...i_r}_{\qquad j_1...j_s} \stackrel{\rm def.}{=} T({\bf v}^{i_1},...,{\bf v}^{i_r},{\bf v}_{j_1},...,{\bf v}_{j_s})$$instead. And with this notation, we'd have $$T = \sum_{i_1,...,i_r,j_1,...,j_s} T^{i_1...i_r}_{\qquad j_1...j_s} {\bf v}_{i_1}\otimes \cdots \otimes {\bf v}_{i_r}\otimes {\bf v}^{j_1}\otimes \cdots \otimes {\bf v}^{j_s}.$$With Einstein's summation convention, we'd simply write $$T = T^{i_1...i_r}_{\qquad j_1...j_s} {\bf v}_{i_1}\otimes \cdots \otimes {\bf v}_{i_r}\otimes {\bf v}^{j_1}\otimes \cdots \otimes {\bf v}^{j_s},$$
with all the summations implied (and that's why index balance is good - if the same index appears twice, once up and once down, sum over it).
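As a sanity check in the $(1,1)$ case: the components $T^i_{\hspace{1ex} j} = T({\bf v}^i, {\bf v}_j)$ really do recover $T$ as a linear combination of the basis tensors ${\bf v}_i\otimes{\bf v}^j$. A quick numerical sketch in the standard basis of $\Bbb R^n$ (hypothetical components, NumPy):

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)
T = rng.standard_normal((n, n))  # components T^i_j in the standard basis

# In the standard basis, v_i ⊗ v^j has component matrix E_ij (1 in slot (i,j)).
# Reassembling sum_ij T^i_j (v_i ⊗ v^j) must give back T itself.
reassembled = sum(
    T[i, j] * np.outer(np.eye(n)[i], np.eye(n)[j])
    for i in range(n) for j in range(n)
)
assert np.allclose(reassembled, T)
```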
When you ask how these things relate, the reasonable thing to do is to consider another basis $\{{{\bf v}_i}'\}$ with its corresponding dual basis $\{{{\bf v}^i}'\}$, write $${T^{i_1...i_r}_{\qquad j_1...j_s}}' = T({{\bf v}^{i_1}}',...,{{\bf v}^{i_r}}',{{\bf v}_{j_1}}',...,{{\bf v}_{j_s}}'),$$and see how we can express this in terms of the "old" components $T^{i_1...i_r}_{\qquad j_1...j_s}$.
If you're still alive after all that index juggling, you'll certainly pardon me for illustrating the relation only in the $(1,1)$ case. Write ${{\bf v}_j}' = \sum_i \alpha_{ij} {\bf v}_i$. It is an easy linear algebra exercise to check that ${{\bf v}^j}' = \sum_i \beta_{ij} {\bf v}^i$, where $(\beta_{ij})$ is determined by $\sum_i \beta_{ij}\alpha_{ik} = \delta_{jk}$, i.e., $\beta^{\mathsf T}$ is the inverse matrix of $(\alpha_{ij})$. Then $${T^i_{\hspace{1ex} j}}'=T({{\bf v}^i}',{{\bf v}_j}') = \sum_{k, \ell }\beta_{ki}\alpha_{\ell j}T({\bf v}^k,{\bf v}_\ell) = \sum_{k,\ell} \beta_{ki}\alpha_{\ell j}T^k_{\hspace{1ex}\ell}.$$
If we had more entries, then more $\alpha$'s and $\beta$'s would pop out. You can see that in any physics book, for instance; I like *A Short Course in General Relativity*, by Foster & Nightingale. By the way, it is customary to denote the entries of the inverse matrix by writing the indices upstairs. In this notation, and using Einstein's convention, we'd have simply $${T^i_{\hspace{1ex} j}}' = \alpha^{ki}\alpha_{\ell j}T^k_{\hspace{1ex} \ell}.$$
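Numerically, the transformation law above says the matrix of $(1,1)$ components transforms by the similarity $T' = \alpha^{-1} T \alpha$ (using $\beta^{\mathsf T} = \alpha^{-1}$ as above). A minimal sketch, with a hypothetical change-of-basis matrix $\alpha$ (a random matrix, which is invertible with probability 1):

```python
import numpy as np

n = 3
rng = np.random.default_rng(2)
T = rng.standard_normal((n, n))      # old components T^k_l
alpha = rng.standard_normal((n, n))  # new basis: v'_j = sum_i alpha_ij v_i
beta = np.linalg.inv(alpha).T        # dual basis coefficients, beta^T = alpha^{-1}

# T'^i_j = sum_{k,l} beta_ki * alpha_lj * T^k_l
T_new = np.einsum('ki,lj,kl->ij', beta, alpha, T)

# ... which is exactly the similarity transform alpha^{-1} T alpha
assert np.allclose(T_new, np.linalg.inv(alpha) @ T @ alpha)
```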
3) I was trying to work out a basic example of an endomorphism $\mathbb{R}^2 \longrightarrow \mathbb{R}^2$ using the two definitions, but I couldn't end up with the same set of components...
When we write components in a basis, they're real numbers, and our discussion does not quite apply if the codomain of the bilinear map isn't $\Bbb R$. To fit an endomorphism $A\colon V \to V$ into this framework, view it as the $(1,1)$ tensor $(\xi,{\bf v}) \mapsto \xi(A{\bf v})$; its components in a basis are then exactly the matrix entries of $A$.
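For instance, here is a sketch of question 3) in $\Bbb R^2$ with the standard basis, checking that the $(1,1)$-tensor components $T^i_{\hspace{1ex} j} = e^i(Ae_j)$ are the matrix entries of the endomorphism (hypothetical matrix $A$, NumPy):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])  # an endomorphism of R^2 as a matrix
e = np.eye(2)               # standard basis; dual basis = rows of the identity

# T(xi, v) = xi(A v); components T^i_j = e^i(A e_j)
components = np.array([[e[i] @ (A @ e[j]) for j in range(2)] for i in range(2)])
assert np.allclose(components, A)
```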
Best Answer
I think there is some confusion about the distinction between second order tensors and linear maps. On the one hand, your definition of $A\otimes B$ seems to imply that $A=A_{ij}e_i\otimes e_j$, which would mean that $A\in V\otimes V$, $V$ being the vector space in which the vectors live. On the other hand, in your definition of $A\boxtimes B$ you seem to treat $A$ as a linear mapping, in which case $A$ would instead be an element of $V\otimes V^*\approx Hom(V,V)$. Of course, if $V$ is finite dimensional, then there is an isomorphism between $V$ and $V^*$ anyway (though not a canonical one), but it can make things conceptually clearer to distinguish vectors from dual vectors.
With that said, let us consider $A$ and $B$ to be linear mappings $V\to V$, in which case $A\boxtimes B$ is an element of $Hom(V\otimes V, V\otimes V)$ and we have a sequence of isomorphisms:
\begin{eqnarray*} Hom(V\otimes V, V\otimes V) &\approx & (V\otimes V)\otimes (V\otimes V)^*\\ &\approx & (V\otimes V)\otimes (V^*\otimes V^*)\\ &\approx & (V\otimes V^*)\otimes (V\otimes V^*)\\ &\approx & Hom(V,V)\otimes Hom(V,V) \end{eqnarray*} and it is straightforward to see that the image of $A\boxtimes B$ under this sequence of isomorphisms is precisely $A\otimes B\in Hom(V,V)\otimes Hom(V,V)$.
To see this more explicitly, let $e_i$ denote a basis of $V$, with $e_i^*$ denoting the dual basis. Then the element of $V\otimes V^*$ corresponding to $A$ is given by $A_{ij} e_i\otimes e_j^*$. Note that here and in the rest of this answer, we will omit the explicit summation symbol over repeated indices.
On the other hand, we have
$$(A\boxtimes B) (e_k\otimes e_l)=Ae_k\otimes Be_l=A_{ik}e_i\otimes B_{jl}e_j=A_{ik}B_{jl} e_i\otimes e_j$$
Therefore we can trace through the image of $A\boxtimes B$ under the sequence of isomorphisms above as follows:
\begin{eqnarray*} A\boxtimes B&\mapsto &A_{ik}B_{jl}(e_i\otimes e_j)\otimes(e_k\otimes e_l)^*\\ &\mapsto &A_{ik}B_{jl}(e_i\otimes e_j)\otimes(e_k^*\otimes e_l^*)\\ &\mapsto & A_{ik} (e_i\otimes e_k^*)\otimes B_{jl} (e_j\otimes e_l^*)\\ &\mapsto & A\otimes B \end{eqnarray*}
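In components, the trace-through above boils down to the identity $(A\boxtimes B)_{(ij)(kl)} = A_{ik}B_{jl}$, which, after grouping the paired indices, is exactly the Kronecker product of the two matrices. A short numerical sketch (hypothetical matrices, NumPy):

```python
import numpy as np

n = 2
rng = np.random.default_rng(3)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# (A ⊠ B) as a map on V⊗V: components A_ik B_jl, grouped as ((i,j),(k,l))
AboxB = np.einsum('ik,jl->ijkl', A, B).reshape(n * n, n * n)

# With this grouping, A ⊠ B is the Kronecker product of A and B
assert np.allclose(AboxB, np.kron(A, B))

# Acting on e_k ⊗ e_l gives A e_k ⊗ B e_l, as in the computation above
k, l = 0, 1
ekl = np.kron(np.eye(n)[k], np.eye(n)[l])
assert np.allclose(AboxB @ ekl, np.kron(A @ np.eye(n)[k], B @ np.eye(n)[l]))
```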