Let's assume everything has finite dimension here.
1) I don't know exactly which property $(f\otimes g)(...) = f(...)g(...)$ refers to in the second definition of a tensor. For example, if $f$ is a $(0,2)$ tensor and $g$ is a $(0,2)$ tensor, the product would be a $(0,4)$ tensor, so $(f\otimes g)(v_1,v_2,v_3,v_4) = f(v_1,v_2)g(v_3,v_4)$. To what kind of tensor in definition 2 would this be isomorphic?
A tensor product of two vector spaces $V$ and $W$ is a pair $(\mathsf{T},t)$, where $\mathsf{T}$ is a vector space and $t\colon V \times W \to \mathsf{T}$ is bilinear, such that if $\{{\bf v}_i\}$ and $\{{\bf w}_j\}$ are bases for $V$ and $W$, then $\{t({\bf v}_i,{\bf w}_j)\}$ spans $\mathsf{T}$, and whenever $b\colon V \times W \to Z$ is a bilinear map (into an arbitrary vector space $Z$), there is a unique linear map $\overline{b}\colon \mathsf{T} \to Z$ such that $\overline{b}\circ t = b$. This means that bilinear maps $b$ factor through $\mathsf{T}$, and all the information we need is carried by a single linear map $\overline{b}$. One then proves that all tensor products of $V$ and $W$ are isomorphic, and so we adopt the usual notation $\mathsf{T} \equiv V \otimes W$, $t \equiv \otimes$, and write ${\bf v} \otimes {\bf w}$ for $t({\bf v},{\bf w})$.
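If it helps to see the spanning condition with actual numbers, here is a minimal Python sketch, assuming we model $V = \Bbb R^2$, $W = \Bbb R^3$ and take $\mathsf{T} = \Bbb R^6$ with $t$ given by the Kronecker product (one concrete realization among many; all names in the snippet are illustrative):

```python
import numpy as np

# Model V = R^2, W = R^3, T = R^6, with t(v, w) = np.kron(v, w),
# i.e. t(v, w)[3*i + j] = v[i] * w[j].  This t is bilinear.
e = np.eye(2)   # basis {v_i} of V
f = np.eye(3)   # basis {w_j} of W

# The images t(v_i, w_j) of basis pairs span T (here they are even a basis):
images = np.array([np.kron(e[i], f[j]) for i in range(2) for j in range(3)])
print(images)                              # the 6x6 identity matrix
print(np.linalg.matrix_rank(images) == 6)  # True: they span R^6
```

Here $t({\bf v}_i, {\bf w}_j)$ lands exactly on the standard basis of $\Bbb R^6$, which is the spanning condition in the definition.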
An explicit construction is to take the free vector space with basis $V \times W$ and quotient it by the subspace spanned by the elements of the form \begin{align} &({\bf v}_1+{\bf v}_2,{\bf w}) - ({\bf v}_1,{\bf w})-({\bf v}_2,{\bf w}), \\ & ({\bf v},{\bf w}_1+{\bf w}_2)-({\bf v},{\bf w}_1)-({\bf v},{\bf w}_2), \\ & (\lambda{\bf v},{\bf w}) - \lambda({\bf v},{\bf w}), \quad ({\bf v},\lambda{\bf w}) - \lambda({\bf v},{\bf w}).\end{align}
We then denote the class of $({\bf v},{\bf w})$ by ${\bf v} \otimes {\bf w}$.
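As a sanity check (not a proof), one can verify numerically that the Kronecker-product model from the sketch above really does kill all of those spanning elements; everything here assumes the same $\Bbb R^2 \otimes \Bbb R^3$ setup:

```python
import numpy as np

rng = np.random.default_rng(0)
v1, v2, v = rng.standard_normal((3, 2))   # vectors in V = R^2
w1, w2, w = rng.standard_normal((3, 3))   # vectors in W = R^3
lam = 2.5

# Each element below is declared zero in the quotient; in the Kronecker
# model they literally are zero:
relations = [
    np.kron(v1 + v2, w) - np.kron(v1, w) - np.kron(v2, w),
    np.kron(v, w1 + w2) - np.kron(v, w1) - np.kron(v, w2),
    np.kron(lam * v, w) - lam * np.kron(v, w),
    np.kron(v, lam * w) - lam * np.kron(v, w),
]
print([np.allclose(r, 0) for r in relations])  # [True, True, True, True]
```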
One generalizes all of this by considering more spaces, writing "multilinear maps" instead of "bilinear maps", and so on. The space $${\frak T}^{(r,s)}(V) = \{ f\colon (V^\ast)^r \times V^s \to \Bbb R \mid f \text{ is multilinear} \}$$is isomorphic to $V^{\otimes r}\otimes (V^\ast)^{\otimes s}$, and that isomorphism does not depend on a choice of basis (so it is better than your average run-of-the-mill isomorphism). Being more honest: we use a basis to define the isomorphism, but then we check that it would be the same if we had started with another basis. We say that $T \in {\frak T}^{(r,s)}(V)$ is an $r$-times contravariant and $s$-times covariant tensor. We of course have an operation $$\otimes \colon {\frak T}^{(r,s)}(V) \times {\frak T}^{(r',s')}(V) \to {\frak T}^{(r+r',s+s')}(V).$$
Now that we hopefully understand a little better what a tensor product is, we can simply note that ${\frak T}^{(0,4)}(V) \cong (V^\ast)^{\otimes 4}$, and if $\{{\bf v}_i\}$ is a basis for $V$ and $\{{\bf v}^i\}$ is the dual basis, then $$f \otimes g = \sum_{i,j,k,\ell} (f\otimes g)_{ijk\ell}\, {\bf v}^i \otimes {\bf v}^j\otimes {\bf v}^k \otimes {\bf v}^\ell,$$where $(f\otimes g)_{ijk\ell} = f({\bf v}_i,{\bf v}_j)\,g({\bf v}_k,{\bf v}_\ell)$ and ${\bf v}^i \otimes {\bf v}^j\otimes {\bf v}^k \otimes {\bf v}^\ell \in {\frak T}^{(0,4)}(V)$. Under the isomorphism above, this corresponds to the same expression seen as a linear combination of the elements ${\bf v}^i \otimes {\bf v}^j \otimes {\bf v}^k \otimes {\bf v}^\ell$ of $(V^\ast)^{\otimes 4}$, i.e., of the classes of $({\bf v}^i,{\bf v}^j,{\bf v}^k,{\bf v}^\ell)$ in the quotient. Maybe I shouldn't have been lazy, and should have used a different notation for the classes until now - I'll gladly explain it all again if you have trouble following.
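To make question 1 fully concrete, here is a small numerical sketch for $(0,2)$ tensors on $\Bbb R^3$, representing each one by its component matrix in the standard basis (an assumption of the snippet, as is the choice of np.einsum to do the index bookkeeping):

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.standard_normal((3, 3))   # components f_{ij} = f(v_i, v_j)
g = rng.standard_normal((3, 3))   # components g_{kl} = g(v_k, v_l)

# Components of the (0,4) tensor f (x) g:  (f (x) g)_{ijkl} = f_{ij} g_{kl}
fg = np.einsum('ij,kl->ijkl', f, g)

# Check (f (x) g)(u1, u2, u3, u4) = f(u1, u2) g(u3, u4) on random vectors:
u = rng.standard_normal((4, 3))
lhs = np.einsum('ijkl,i,j,k,l->', fg, u[0], u[1], u[2], u[3])
rhs = (u[0] @ f @ u[1]) * (u[2] @ g @ u[3])
print(np.isclose(lhs, rhs))  # True
```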
2) I am confused about what a tensor component is. As I understand it, tensor components are the scalars that form a linear combination of basis tensors. For definition 1, I see books defining the tensor components as $T^{a_1,...,a_k}_{b_1,...,b_k} = T(a_1,...,a_k,b_1,...,b_k)$, where $\{a_i\}$ and $\{b_i\}$ are bases of the vector space and of the covector space. For definition 2, I see tensor components written as:
\begin{align}
\sum_{i}\sum_{j} A_{ij}\, a_i\otimes b_j.
\end{align}
How do these components relate?
Components do depend on a choice of basis. The choice of notation used in your textbook is bad: we only keep indices on $T$, not on the vectors. One would instead write $$T^{i_1...i_r}_{\qquad j_1...j_s} \stackrel{\rm def.}{=} T({\bf v}^{i_1},...,{\bf v}^{i_r},{\bf v}_{j_1},...,{\bf v}_{j_s}).$$With this notation, we'd have $$T = \sum_{i_1,...,i_r,j_1,...,j_s} T^{i_1...i_r}_{\qquad j_1...j_s} {\bf v}_{i_1}\otimes \cdots \otimes {\bf v}_{i_r}\otimes {\bf v}^{j_1}\otimes \cdots \otimes {\bf v}^{j_s}.$$With Einstein's summation convention, we'd only write $$T = T^{i_1...i_r}_{\qquad j_1...j_s} {\bf v}_{i_1}\otimes \cdots \otimes {\bf v}_{i_r}\otimes {\bf v}^{j_1}\otimes \cdots \otimes {\bf v}^{j_s},$$
with all the summations implied (and that's why index balance is good - if the same index appears twice, once up and once down, sum over it).
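Incidentally, np.einsum is basically Einstein's convention made executable; a quick sketch with a $(1,2)$ tensor on $\Bbb R^3$ (standard basis assumed, so the components are just a $3\times 3\times 3$ array):

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 3, 3))   # components T^i_{jk} in a fixed basis

# Evaluating T on a covector a and vectors u, w:
#   T(a, u, w) = T^i_{jk} a_i u^j w^k   (all three sums implied)
a, u, w = rng.standard_normal((3, 3))
print(np.einsum('ijk,i,j,k->', T, a, u, w))

# Index balance at work: the repeated index i (once up, once down) in
# S_k = T^i_{ik} means "sum over i" - a contraction:
S = np.einsum('iik->k', T)
print(S)
```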
When you ask how these things relate, the reasonable thing to do is to consider another basis $\{{{\bf v}_i}'\}$ with corresponding dual basis $\{{{\bf v}^i}'\}$, write $${T^{i_1...i_r}_{\qquad j_1...j_s}}' = T({{\bf v}^{i_1}}',...,{{\bf v}^{i_r}}',{{\bf v}_{j_1}}',...,{{\bf v}_{j_s}}'),$$and see how we can express this in terms of the "old" components $T^{i_1...i_r}_{\qquad j_1...j_s}$.
If you're still alive after all that index juggling, you'll certainly pardon me for illustrating the relation only in the $(1,1)$ case. Write ${{\bf v}_j}' = \sum_i \alpha_{ij} {\bf v}_i$. It is an easy linear algebra exercise to check that ${{\bf v}^j}' = \sum_i \beta_{ij} {\bf v}^i$, where $(\beta_{ij})$ satisfies $\sum_i \beta_{ij}\alpha_{ik} = \delta_{jk}$ - that is, $(\beta_{ij})$ is the transpose of the inverse of $(\alpha_{ij})$. Then $${T^i_{\hspace{1ex} j}}'=T({{\bf v}^i}',{{\bf v}_j}') = \sum_{k, \ell }\beta_{ki}\alpha_{\ell j}T({\bf v}^k,{\bf v}_\ell) = \sum_{k,\ell} \beta_{ki}\alpha_{\ell j}T^k_{\hspace{1ex}\ell}.$$
If we had more entries, then more $\alpha$'s and $\beta$'s would pop out. You can see that in any physics book - I like A Short Course in General Relativity, by Foster & Nightingale. By the way, it is customary to denote the entries of the inverse by writing indices upstairs, $\alpha^{ki} \equiv \beta_{ki}$. In this notation, and using Einstein's convention, we'd simply have $${T^i_{\hspace{1ex} j}}' = \alpha^{ki}\alpha_{\ell j}T^k_{\hspace{1ex} \ell}.$$
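Here is a quick numerical confirmation of that transformation law (assumptions of the sketch: dimension $3$, random old components $T^k{}_\ell$, a random invertible $\alpha$, and $\beta$ taken as the inverse transpose as above); it also cross-checks against the familiar matrix similarity $\alpha^{-1} T \alpha$ for a $(1,1)$ tensor:

```python
import numpy as np

rng = np.random.default_rng(3)
T = rng.standard_normal((3, 3))       # old components T^k_l
alpha = rng.standard_normal((3, 3))   # new basis: v'_j = sum_i alpha_ij v_i
beta = np.linalg.inv(alpha).T         # so that sum_i beta_ij alpha_ik = delta_jk

# Transformation law: T'^i_j = sum_{k,l} beta_ki alpha_lj T^k_l
T_new = np.einsum('ki,lj,kl->ij', beta, alpha, T)

# For a (1,1) tensor this is the usual similarity transform:
print(np.allclose(T_new, np.linalg.inv(alpha) @ T @ alpha))  # True
```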
3) I was trying to work out a basic example of an endomorphism $\mathbb{R}^2 \longrightarrow \mathbb{R}^2$ by using the two definitions but couldn't end up with the same set of components...
When we write components in a basis, they're real numbers, so our discussion does not quite apply if the codomain of the multilinear map isn't $\Bbb R$.
Note that for $F \in L(V_1, \dots, V_k; W)$, we have $F_{i_1,\ldots,i_k} = F(\mathbf{e}_{i_1}^{(1)},\ldots,\mathbf{e}_{i_k}^{(k)}) \in W$. Let $\mathbf{w}_1, \dots, \mathbf{w}_m$ be a basis for $W$; then $F_{i_1,\dots, i_k} = F_{i_1,\dots, i_k}^l\mathbf{w}_l$ (sum over $l$ implied), where $F_{i_1,\dots, i_k}^l \in \mathbb{R}$ for $l = 1, \dots, m$. So now
$$F = F_{i_1,\ldots,i_k}^l\mathbf{e}_{(1)}^{i_1}\otimes\cdots\otimes\mathbf{e}_{(k)}^{i_k}\otimes\mathbf{w}_l.$$
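Tying this back to question 3: for an endomorphism $A$ of $\Bbb R^2$, both definitions produce the ordinary matrix entries, once you identify $A$ with the $(1,1)$ tensor $\tilde A(\varphi, {\bf v}) = \varphi(A{\bf v})$. A minimal check in the standard basis (where the dual basis covector $\mathbf{e}^i$ acts as the dot product with $\mathbf{e}_i$):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # an endomorphism of R^2, standard basis
e = np.eye(2)

# Definition 1: components of the (1,1) tensor A~(phi, v) = phi(A v),
# i.e. A~^i_j = e^i(A e_j):
comp1 = np.array([[e[i] @ (A @ e[j]) for j in range(2)] for i in range(2)])

# Definition 2 (the formula above): expand each F_j = A(e_j) in the
# basis {w_l} = {e_i}; the coefficients F^i_j fill the columns:
comp2 = np.column_stack([A @ e[j] for j in range(2)])

print(np.allclose(comp1, A), np.allclose(comp2, A))  # True True
```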
Best Answer
$\otimes$, also called tensing, is something you get bundled with the tensor product that you don't have in an ordinary vector space. How the tensor product vector space and the tensing operation work together is the real "meat" behind the tensor product. Constructions are not "the real meaning", because there are infinitely many of them that will do the job - they're better understood as, first, proofs that the tensor product exists, and second, encodings of the tensor product in the medium of sets, similar to how, on a computer, ASCII is an encoding of text in binary numbers. The same applies to constructions of most other mathematical objects using sets.
Hence, what $v \otimes w$ "is" will depend on which construction you choose. In the first case, it is not circular: we define $v \otimes w$ to be the cell in $Q$ containing the ordered pair $(v, w)$. In general, $v \otimes w$ is simply whatever the chosen construction assigns to the pair $(v, w)$.
The "real meaning" behind the tensor product, and that nifty little tensing operation in comes with, is that it provides a space which lets you work with bilinear maps (generically, $n$-linear maps) as though they were unilinear maps. Now, I suppose you (or some others) might be thinking, "but isn't $V \times W$ a vector space? So isn't a bilinear map $f: V \times W \rightarrow Z$, a linear map from an ordered pair $(v, w)$, viewed as a single vector in $V \times W$?" Yes, it is, but remember that a bilinear map must be linear in each argument individually, and this gives them more structure that is not captured by a simple linear map out of $V \times W$.
Hence the tensor product. We can think of this as enriching the domain so that, in this new domain, which we call $V \otimes W$, being unilinear now carries all the structural weight of being bilinear on the $V \times W$ domain.
In particular, the tensor product has the property that every bilinear map $f: V \times W \rightarrow Z$ can be understood uniquely as a unilinear map $f_\otimes : V \otimes W \rightarrow Z$, where
$$f_\otimes(v \otimes w) := f(v, w).$$
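A concrete sketch of this factorization, under the same kind of modeling assumption as before: take $V = \Bbb R^2$, $W = \Bbb R^3$, realize $V \otimes W$ as $\Bbb R^6$ with $v \otimes w =$ np.kron(v, w), and let $f(v, w) = v^\top B w$ be a bilinear map into $\Bbb R$.

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((2, 3))   # f(v, w) = v^T B w, bilinear on R^2 x R^3
v = rng.standard_normal(2)
w = rng.standard_normal(3)

# The induced UNILINEAR map f_ox on R^2 (x) R^3 ~ R^6 is "dot with vec(B)":
f_ox = B.reshape(-1)

print(np.isclose(f_ox @ np.kron(v, w),   # f_ox(v (x) w)
                 v @ B @ w))             # f(v, w)  -> True
```

All the data of the bilinear map $f$ gets flattened into the single coefficient vector of $f_\otimes$ - the "single linear map carries all the information" slogan from earlier.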
Moreover, every vector space that has this property is isomorphic to the tensor product. The construction, then, simply shows that this is not a vacuous statement, i.e. that we are actually talking about a real mathematical object here. In this regard, it's kind of like the various constructions of the real numbers: the real numbers are "really" the single object known as "the Dedekind-complete ordered field" - what those constructions do is prove that such a thing actually exists.
In this setting, the meaning of $v \otimes w$ is that it's a "package" that wraps $v$ and $w$ together into a single vector for processing by a linear map, in such a fashion that said linear maps acquire all the extra structure bilinear maps have, which simply taking an ordered pair would not be able to do.