I am guessing that your question is: What is $\bigwedge^k (V \oplus W)$? (This is the wedge product, not the tensor product.)
And also the same question for $S^k = Sym^k$, which is the symmetric product.
Extended hint:
The thing to observe is that there is a natural basis for $V \oplus W$. Namely, you take a basis for $V$ and union it with a basis for $W$.
Let's call our natural basis for $V$ $\{v_1, \ldots, v_n\}$, and our basis for $W$ $\{w_1, \ldots, w_m\}$.
Now, given a basis for a vector space $X$, there is a natural basis for $\bigwedge^k X$ and for $Sym^k X$:
Suppose that $\{x_1, \ldots, x_r\}$ is a basis for $X$.
1) Then a basis for $\bigwedge^k X$ is given by all of the $x_{i_1} \wedge \ldots \wedge x_{i_k}$ with $i_1 < \ldots < i_k$, $i_j \in \{1, 2, \ldots, r\}$.
2) A natural basis for $Sym^k X$ is given similarly, only now you are allowed to repeat vectors (so $<$ is replaced by $\leq$). Another description: it is the set of monomials of degree $k$ in the variables $x_1, \ldots, x_r$.
Finally: given that $\{x_1, \ldots, x_r\} = \{v_1, \ldots, v_n, w_1, \ldots, w_m\}$ is a basis for $V \oplus W = X$, you can now play a combinatorial game to decompose $\bigwedge^k X$ into a direct sum built out of smaller exterior powers of $V$ and $W$. Similarly for the symmetric product. Do you see how to proceed? Please feel free to ask if you have questions.
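If you want to sanity-check the counting game, here is a short Python sketch (the dimensions $n = 3$, $m = 2$ and degree $k = 2$ are just illustrative choices) that enumerates the two bases described above and verifies the resulting dimension counts:

```python
from itertools import combinations, combinations_with_replacement
from math import comb

n, m, k = 3, 2, 2  # dim V = 3, dim W = 2, degree k = 2 (illustrative choices)
basis = [f'v{i}' for i in range(1, n + 1)] + [f'w{j}' for j in range(1, m + 1)]

# Basis of wedge^k(V + W): strictly increasing k-tuples of basis vectors
wedge_basis = list(combinations(basis, k))
# Splitting each tuple by how many vectors come from V versus W gives
# dim wedge^k(V + W) = sum_i C(n, i) * C(m, k - i), which is C(n + m, k)
# by Vandermonde's identity.
assert len(wedge_basis) == sum(comb(n, i) * comb(m, k - i) for i in range(k + 1))
assert len(wedge_basis) == comb(n + m, k)

# Basis of Sym^k: weakly increasing k-tuples (repeats now allowed)
sym_basis = list(combinations_with_replacement(basis, k))
assert len(sym_basis) == comb(n + m + k - 1, k)
```

The split of `wedge_basis` by how many factors come from $V$ is exactly the combinatorial game hinted at above.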
Let's assume everything has finite dimension here.
1) I don't know exactly what property $(f\otimes g)(\ldots) = f(\ldots)g(\ldots)$ refers to in the second definition of a tensor. For example, if $f$ is a $(0,2)$ tensor and $g$ is a $(0,2)$ tensor, the product is a $(0,4)$ tensor, so $(f\otimes g)(v_1,v_2,v_3,v_4) = f(v_1,v_2)g(v_3,v_4)$. To what kind of tensor in definition 2 would this be isomorphic?
A tensor product of two vector spaces $V$ and $W$ is a pair $(\mathsf{T},t)$, where $\mathsf{T}$ is a vector space and $t\colon V \times W \to \mathsf{T}$ is bilinear, such that if $\{{\bf v}_i\}$ and $\{{\bf w}_j\}$ are bases for $V$ and $W$, then $\{t({\bf v}_i,{\bf w}_j)\}$ spans $\mathsf{T}$, and such that given any bilinear map $b\colon V \times W \to Z$ (into an arbitrary vector space $Z$), there is a unique linear map $\overline{b}\colon \mathsf{T} \to Z$ with $\overline{b}\circ t = b$. This means that bilinear maps $b$ factor through $\mathsf{T}$, and all the information we need is packed into a single linear map $\overline{b}$. One then proves that all tensor products of $V$ and $W$ are isomorphic, and so we adopt the usual notation $\mathsf{T} \equiv V \otimes W$, $t \equiv \otimes$, and write ${\bf v} \otimes {\bf w}$ for $t({\bf v},{\bf w})$.
An explicit construction is to take the free vector space with basis $V \times W$, and take its quotient by the subspace spanned by the elements of the form \begin{align} &({\bf v}_1+{\bf v}_2,{\bf w}) - ({\bf v}_1,{\bf w})-({\bf v}_2,{\bf w}), \\ & ({\bf v},{\bf w}_1+{\bf w}_2)-({\bf v},{\bf w}_1)-({\bf v},{\bf w}_2), \\ & (\lambda{\bf v},{\bf w}) - ({\bf v},\lambda{\bf w}).\end{align}
We then denote the class of $({\bf v},{\bf w})$ by ${\bf v} \otimes {\bf w}$.
One generalizes all of this by considering more spaces, writing "multilinear maps" instead of "bilinear maps", and so on. The space $${\frak T}^{(r,s)}(V) = \{ f\colon (V^\ast)^r \times V^s \to \Bbb R \mid f \text{ is multilinear} \}$$is isomorphic to $V^{\otimes r}\otimes (V^\ast)^{\otimes s}$, and that isomorphism does not depend on a choice of basis (so it is better than your average run-of-the-mill isomorphism). Being more honest: we use a basis to define the isomorphism, but then we check that it would be the same if we had started with another basis. We say that $T \in {\frak T}^{(r,s)}(V)$ is an $r$-times contravariant and $s$-times covariant tensor. We of course have an operation $$\otimes \colon {\frak T}^{(r,s)}(V) \times {\frak T}^{(r',s')}(V) \to {\frak T}^{(r+r',s+s')}(V).$$
Now that we hopefully understand a little better what a tensor product is, we can simply note that ${\frak T}^{(0,4)}(V) \cong (V^\ast)^{\otimes 4}$; if $\{{\bf v}_i\}$ is a basis for $V$ and $\{{\bf v}^i\}$ is the dual basis, then $$f \otimes g = \sum_{i,j,k,\ell} (f \otimes g)_{ijk\ell}\, {\bf v}^i \otimes {\bf v}^j\otimes {\bf v}^k \otimes {\bf v}^\ell,$$where $(f \otimes g)_{ijk\ell} = (f\otimes g)({\bf v}_i,{\bf v}_j,{\bf v}_k,{\bf v}_\ell) = f({\bf v}_i,{\bf v}_j)\,g({\bf v}_k,{\bf v}_\ell)$ and ${\bf v}^i \otimes {\bf v}^j\otimes {\bf v}^k \otimes {\bf v}^\ell \in {\frak T}^{(0,4)}(V)$. Under the isomorphism, $f \otimes g$ corresponds to that same expression seen as a linear combination of the ${\bf v}^i \otimes {\bf v}^j \otimes {\bf v}^k \otimes {\bf v}^\ell$, the classes of $({\bf v}^i,{\bf v}^j,{\bf v}^k,{\bf v}^\ell)$ in the quotient construction above; that expression is an element of $(V^\ast)^{\otimes 4}$. Maybe I shouldn't have been lazy and should have used another notation for the classes until now - I'll gladly explain it all again if you have trouble following.
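In components this is just an outer product. A small NumPy sketch (a hypothetical $3$-dimensional $V$ with random $(0,2)$-tensors) illustrating $(f\otimes g)_{ijk\ell} = f_{ij}\, g_{k\ell}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
f = rng.standard_normal((n, n))  # f[i, j] = f(v_i, v_j), components of a (0,2)-tensor
g = rng.standard_normal((n, n))  # g[k, l] = g(v_k, v_l)

# Components of the (0,4)-tensor f (x) g: (f(x)g)_{ijkl} = f_{ij} * g_{kl}
fg = np.einsum('ij,kl->ijkl', f, g)

assert fg.shape == (n, n, n, n)
assert np.isclose(fg[0, 1, 2, 0], f[0, 1] * g[2, 0])
```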
2) I am confused about what a tensor component is. As I understand it, tensor components are the scalars appearing in a linear combination of basis tensors. In definition 1, I see books defining the tensor components as $T^{a_1,\ldots,a_k}_{b_1,\ldots,b_k} = T(a_1,\ldots,a_k,b_1,\ldots,b_k)$, where $\{a_i\}$ and $\{b_i\}$ are bases of the vector space and of the covector space. For definition 2, I see tensor components written as:
\begin{align}
\sum_i \sum_j A_{ij}\, a_i\otimes b_j.
\end{align}
How do these components relate?
Components do depend on a choice of basis. The notation used in your textbook is bad: one keeps indices only on $T$, not on the vectors. That is, one would write $$T^{i_1...i_r}_{\qquad j_1...j_s} \stackrel{\rm def.}{=} T({\bf v}^{i_1},...,{\bf v}^{i_r},{\bf v}_{j_1},...,{\bf v}_{j_s})$$instead. And with this notation, we'd have $$T = \sum_{i_1,...,i_r,j_1,...,j_s} T^{i_1...i_r}_{\qquad j_1...j_s} {\bf v}_{i_1}\otimes \cdots \otimes {\bf v}_{i_r}\otimes {\bf v}^{j_1}\otimes \cdots \otimes {\bf v}^{j_s}.$$With Einstein's summation convention, we'd write only $$T = T^{i_1...i_r}_{\qquad j_1...j_s} {\bf v}_{i_1}\otimes \cdots \otimes {\bf v}_{i_r}\otimes {\bf v}^{j_1}\otimes \cdots \otimes {\bf v}^{j_s},$$with all the summations implied (and that's why index balance is good: if the same index appears twice, once upstairs and once downstairs, sum over it).
When you ask how these things relate, the reasonable thing to do is to consider another basis $\{{{\bf v}_i}'\}$ with corresponding dual basis $\{{{\bf v}^i}'\}$, write $${T^{i_1...i_r}_{\qquad j_1...j_s}}' = T({{\bf v}^{i_1}}',...,{{\bf v}^{i_r}}',{{\bf v}_{j_1}}',...,{{\bf v}_{j_s}}'),$$and see how to express this in terms of the "old" components $T^{i_1...i_r}_{\qquad j_1...j_s}$.
If you're still alive after all that index juggling, you'll certainly pardon me for illustrating the relation only in the $(1,1)$ case. Write ${{\bf v}_j}' = \sum_i \alpha_{ij} {\bf v}_i$. It is an easy linear algebra exercise to check that ${{\bf v}^j}' = \sum_i \beta_{ij} {\bf v}^i$, where $(\beta_{ij})$ is the transpose of the inverse of $(\alpha_{ij})$, so that $\sum_i \beta_{ik}\alpha_{ij} = \delta_{kj}$. Then $${T^i_{\hspace{1ex} j}}'=T({{\bf v}^i}',{{\bf v}_j}') = \sum_{k, \ell }\beta_{ki}\alpha_{\ell j}T({\bf v}^k,{\bf v}_\ell) = \sum_{k,\ell} \beta_{ki}\alpha_{\ell j}T^k_{\hspace{1ex}\ell}.$$In matrix form this is the familiar similarity transform $T' = \alpha^{-1} T \alpha$.
If we had more entries, then more $\alpha$'s and $\beta$'s would pop out. You can see that in any physics book, for instance. I like A Short Course in General Relativity, by Foster & Nightingale. By the way, it is customary to denote the entries $\beta_{ki}$ by writing the indices upstairs, $\alpha^{ki} := \beta_{ki}$. In this notation, and using Einstein's convention, we'd simply have $${T^i_{\hspace{1ex} j}}' = \alpha^{ki}\alpha_{\ell j}T^k_{\hspace{1ex} \ell}.$$
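A quick numerical sanity check of the $(1,1)$ transformation law, as a hedged NumPy sketch (random $3\times 3$ data; $\beta$ is chosen as the inverse transpose of $\alpha$ precisely so that the primed dual basis pairs correctly with the primed basis):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
T = rng.standard_normal((n, n))      # old components: T[k, l] = T^k_l
alpha = rng.standard_normal((n, n))  # change of basis: v'_j = sum_i alpha[i, j] v_i
beta = np.linalg.inv(alpha).T        # dual coefficients: v'^j = sum_i beta[i, j] v^i

# Transformation law: T'^i_j = sum_{k,l} beta_{ki} alpha_{lj} T^k_l
T_new = np.einsum('ki,lj,kl->ij', beta, alpha, T)

# In matrix form this is the similarity transform alpha^{-1} T alpha
assert np.allclose(T_new, np.linalg.inv(alpha) @ T @ alpha)
```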
3) I was trying to work out a basic example, an endomorphism $\mathbb{R}^2 \longrightarrow \mathbb{R}^2$, using the two definitions, but I couldn't end up with the same set of components...
When we write components in a basis, they're real numbers, and our discussion does not quite apply if the codomain of the bilinear map isn't $\Bbb R$. (An endomorphism $T\colon \Bbb R^2 \to \Bbb R^2$ can, however, be regarded as the $(1,1)$-tensor $(\varphi, {\bf v}) \mapsto \varphi(T({\bf v}))$, whose components in definition 1 are exactly the matrix entries of $T$.)
Best Answer
What you write as $e^{\mu}(v_1)\otimes e^\nu(v_2)$ should really be $(e^{\mu}\otimes e^\nu)(v_1, v_2)=e^{\mu}(v_1)\, e^\nu(v_2),$ where the last expression is a product of two real values $e^{\mu}(v_1)$ and $e^\nu(v_2)$ and is therefore commutative, i.e. $e^{\mu}(v_1)\, e^\nu(v_2) = e^\nu(v_2) \, e^{\mu}(v_1).$
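The point is easy to check numerically. Here is a hedged NumPy sketch in $\Bbb R^3$ (the standard basis and the indices $\mu = 0$, $\nu = 2$ are just illustrative choices):

```python
import numpy as np

n = 3
e = np.eye(n)  # row e[mu] holds the components of the dual basis covector e^mu
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([4.0, 5.0, 6.0])
mu, nu = 0, 2

tensor = np.outer(e[mu], e[nu])  # components of e^mu (x) e^nu
val = v1 @ tensor @ v2           # (e^mu (x) e^nu)(v1, v2)

# The value is a product of two real numbers, hence commutative...
assert np.isclose(val, (e[mu] @ v1) * (e[nu] @ v2))
assert np.isclose(val, (e[nu] @ v2) * (e[mu] @ v1))
# ...but the tensors themselves do not commute: e^mu (x) e^nu != e^nu (x) e^mu
assert not np.allclose(tensor, np.outer(e[nu], e[mu]))
```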