As a first comment, you are giving two distinct roles to $f$, which doesn't help (it is an endomorphism in your first definition, and a linear functional in the second one).
Second comment: I might be wrong, but I don't think there is an obvious connection between the two definitions. There are many ways of characterizing the trace, and the connection between them isn't obvious either. For example the trace of $f\in \operatorname{End}(V)$ is
- the sum of the diagonal entries of $f$ in some basis
- the sum of the diagonal entries of $f$ in any other basis
- the sum of the eigenvalues of $f$ (in the algebraic completion of $\mathbb F$)
- the only linear map $\varphi:\operatorname{End}(V)\to\mathbb F$ such that $\varphi(I)=n$ and $\varphi(f\circ g)=\varphi(g\circ f)$ for all $f,g\in\operatorname{End}(V)$.
It is not hard to prove that all these definitions define the same linear functional, but I wouldn't say it's obvious. And some of the implications require at least some theory, like for example to go to/from the eigenvalue definition you need something like the Jordan Form, Schur Diagonalization, or the Spectral Theorem.
In light of the above, I wouldn't expect an obvious abstract connection between your two definitions. Not saying that there isn't one, though.
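For concreteness, the equivalence of these characterizations is easy to sanity-check numerically. A minimal sketch, assuming `numpy` is available (the random matrices are illustrative, not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))   # matrix of f in some basis
P = rng.standard_normal((n, n))   # a random change of basis (invertible a.s.)

t_diag = np.trace(A)                          # sum of diagonal entries
t_basis = np.trace(np.linalg.inv(P) @ A @ P)  # same sum in another basis
t_eig = np.sum(np.linalg.eigvals(A)).real     # sum of eigenvalues (over C)

assert np.isclose(t_diag, t_basis)
assert np.isclose(t_diag, t_eig)

# the functional characterization: phi(I) = n and phi(AB) = phi(BA)
B = rng.standard_normal((n, n))
assert np.isclose(np.trace(np.eye(n)), n)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```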
The most direct way, for me, to see the connection between the two definitions is the following. It requires fixing a basis, as otherwise there is no way to write down the elements of $\bigwedge^nV^*$.
So fix a basis $e_1,\ldots,e_n$ of $V$, and consider the dual basis $e_1^*,\ldots,e_n^*$ of $V^*$. Then $\bigwedge^nV^*=\mathbb F\,e_1^*\land\cdots\land e_n^*$. It is easy to see that $\operatorname{End}(V)$ is the span of the rank-one operators $\{f_{kj}\}$, where $f_{kj}(v)=e_j^*(v)\,e_k$. Now
\begin{align}
\sum_{s=1}^n(e_1^*(v_1)\land\cdots\land e_s^*(f_{kj}(v_s))\land\cdots\land e_n^*(v_n))
&=\sum_{s=1}^n\,e^*_s(e_k)\,(e_1^*(v_1)\land\cdots\land e_j^*(v_s)\land\cdots\land e_n^*(v_n))\\[0.3cm]
&=\delta_{kj}\,(e_1^*\land\cdots\land e_n^*)(v_1,\ldots,v_n).
\end{align}
In the last step, only the $s=k$ term survives (since $e_s^*(e_k)=\delta_{sk}$), and it carries $e_j^*$ in the $k$-th slot: this reproduces $e_1^*\land\cdots\land e_n^*$ when $j=k$, and contains a repeated factor (hence vanishes) when $j\neq k$. So $\operatorname{tr}(f_{kj})=\delta_{kj}$. By linearity, for $f=\sum_{k,j}\beta_{kj}\,f_{kj}$, we get $\operatorname{tr}(f)=\sum_{k}\beta_{kk}$.
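In coordinates, $(e_1^*\land\cdots\land e_n^*)(v_1,\ldots,v_n)$ is the determinant of the matrix with columns $v_1,\ldots,v_n$, so the defining property $\sum_s\omega(v_1,\ldots,f(v_s),\ldots,v_n)=\operatorname{tr}(f)\,\omega(v_1,\ldots,v_n)$ can be sanity-checked numerically. A sketch, assuming `numpy`:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
F = rng.standard_normal((n, n))   # matrix of f
V = rng.standard_normal((n, n))   # columns v_1, ..., v_n

# left-hand side: apply f in one slot at a time and sum the determinants
lhs = 0.0
for s in range(n):
    W = V.copy()
    W[:, s] = F @ V[:, s]
    lhs += np.linalg.det(W)

# the top exterior power characterization: lhs = tr(f) * det(v_1, ..., v_n)
assert np.isclose(lhs, np.trace(F) * np.linalg.det(V))
```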
With the tensor approach, the $f_{kj}$ from above are $e_j^*\otimes e_k$, so
$$
\operatorname{tr}(e_j^*\otimes e_k)=e_j^*(e_k)=\delta_{kj}.
$$
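In coordinates, $e_j^*\otimes e_k$ is the matrix $e_k e_j^T$ (all zeros except a $1$ in position $(k,j)$), so this identity can be checked directly (assuming `numpy`):

```python
import numpy as np

n = 3
I = np.eye(n)
for j in range(n):
    for k in range(n):
        M = np.outer(I[:, k], I[:, j])   # matrix of e_j^* ⊗ e_k : v ↦ e_j^*(v) e_k
        assert np.trace(M) == (1.0 if j == k else 0.0)
```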
> Assume that $B$ is orthogonal, i.e. $B^tB = I_n$. Show that $\beta$ is positive definite if and only if $b_{ii} \geq 0$ for $i = 1,\ldots,n$.
This is false. For example, take
$$B:= \left[\begin{matrix}\frac{1}{3} & - \frac{2}{3} & - \frac{2}{3}\\- \frac{2}{3} & \frac{1}{3} & - \frac{2}{3}\\- \frac{2}{3} & - \frac{2}{3} & \frac{1}{3}\end{matrix}\right]= I - \frac{2}{3}\mathbf {11}^T,$$
which is orthogonal and has $b_{ii}=\frac13\geq 0$. Yet for $v:= v_1+v_2+v_3$ we get $\langle v, v\rangle = \mathbf 1^T B \mathbf 1 = -3 < 0$, so $\beta$ is not positive definite.
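The counterexample is quick to verify numerically (assuming `numpy`):

```python
import numpy as np

B = np.eye(3) - (2.0 / 3.0) * np.ones((3, 3))  # B = I - (2/3) 11^T

assert np.allclose(B.T @ B, np.eye(3))          # B is orthogonal
assert np.all(np.diag(B) >= 0)                  # all diagonal entries are 1/3 >= 0

one = np.ones(3)
assert np.isclose(one @ B @ one, -3.0)          # yet <v, v> = -3 < 0
```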
Best Answer
The issue in the case $2=0$ is that several different symmetric forms have the same quadratic form. In $K^n$ for example every symmetric form has the representation $\phi(x,y) = \langle Ax,y\rangle$ with a symmetric matrix $A$. Now, $\langle Ax,x\rangle = \sum_i\sum_jx_ix_j\langle Ae_i,e_j\rangle = \sum_ix_i^2\langle Ae_i,e_i\rangle$, since the off-diagonal terms pair up as $x_ix_j(\langle Ae_i,e_j\rangle+\langle Ae_j,e_i\rangle) = 2x_ix_j\langle Ae_i,e_j\rangle = 0$. Hence, all symmetric matrices with the same diagonal generate the same quadratic form.
In the case $2=0$ the conditions aren't equivalent. As a counterexample, let $n=2$, and $q(x) = f(x)g(x)$, where $f(x) = x_2$ and $g(x) = x_1$. Thus, $q(x) = x_1x_2$. But as you saw above, we must have $q(x) = a_1x_1^2+a_2x_2^2$. Plugging in $x=e_k$ yields $a_k=0$ for $k=1,2$ and hence $q=0$, a contradiction.
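Since $\mathbb F_2^2$ has only four points, the failure can be confirmed exhaustively: no diagonal form $a_1x_1^2+a_2x_2^2$ agrees with $x_1x_2$ everywhere. A small check in plain Python:

```python
from itertools import product

# Over F_2, q(x) = x1*x2 differs somewhere from every diagonal
# form a1*x1^2 + a2*x2^2 with a1, a2 in {0, 1}.
def q(x1, x2):
    return (x1 * x2) % 2

for a1, a2 in product((0, 1), repeat=2):
    diag = lambda x1, x2: (a1 * x1 * x1 + a2 * x2 * x2) % 2
    assert any(q(x1, x2) != diag(x1, x2)
               for x1, x2 in product((0, 1), repeat=2))
```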
In the case $2=0$, what you need to add to the condition in (2) is a certain symmetry condition, namely that $\sum_i f_i(x)g_i(y) = \sum_i f_i(y)g_i(x)$ (which is easily seen to be satisfied if $q$ is indeed a quadratic form). Then $q$ is indeed a quadratic form. To see this, we assume WLOG that $V = K^n$. Then $f_i(x) = \sum_jf_{ij}x_j$ and $g_i(x) = \sum_jg_{ij}x_j$. The symmetry condition means that $\sum_if_{ij}g_{ik} = \sum_if_{ik}g_{ij}$ for $j\neq k$. Set $a_{jk}:=\sum_if_{ij}g_{ik}$. Then $A = (a_{jk})$ is symmetric and \begin{align} q(x) &= \sum_if_i(x)g_i(x) = \sum_i\sum_j\sum_k f_{ij}g_{ik}x_jx_k = \sum_{j,k}a_{jk}x_kx_j = \langle Ax,x\rangle. \end{align} Hence, $q$ is a quadratic form.
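To illustrate the construction, here is a sketch over $\mathbb F_2$ using the special choice $g_i = f_i$, for which the symmetry condition holds automatically; the coefficient matrices $F, G$ (entries $f_{ij}, g_{ij}$) are random illustrations, and `numpy` is assumed:

```python
from itertools import product
import numpy as np

rng = np.random.default_rng(2)
m, n = 4, 3
F = rng.integers(0, 2, size=(m, n))   # f_i(x) = sum_j F[i,j] x_j over F_2
G = F.copy()                          # g_i = f_i makes sum_i f_ij g_ik symmetric in j, k

A = (F.T @ G) % 2                     # a_jk = sum_i f_ij g_ik
assert np.array_equal(A, A.T)         # A is symmetric

# check q(x) = sum_i f_i(x) g_i(x) = <Ax, x> for every x in F_2^n
for x in product((0, 1), repeat=n):
    x = np.array(x)
    assert int(np.sum((F @ x) * (G @ x)) % 2) == int((x @ A @ x) % 2)
```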
Old answer: Your idea for the proof from (1) to (2) to start with a basis of $V$ is fine. But then it's not clear how you go on. If $v = \sum x_iv_i$ and $w = \sum y_iv_i$, then $\phi(v,w) = \sum_i\sum_j\phi(v_i,v_j)x_iy_j = \langle Ax,y\rangle$, where $A$ is the symmetric matrix with entries $\phi(v_i,v_j)$. Now, $$ Ax = A\sum_i\langle x,e_i\rangle e_i = \sum_i \langle x,e_i\rangle z_i, $$ where $z_i := Ae_i$. Hence, $$ \phi(v,w) = \langle Ax,y\rangle = \sum_i \langle x,e_i\rangle\langle z_i,y\rangle. $$ Now, set $f_i(v) = \langle x,e_i\rangle$ and $g_i(w):=\langle z_i,y\rangle$. Then $\phi(v,w) = \sum_if_i(v)g_i(w)$, and, in particular, $\phi(v,v) = \sum_if_i(v)g_i(v)$.
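The decomposition $\phi(v,w)=\sum_if_i(v)g_i(w)$ can be checked numerically for a random symmetric matrix; a sketch over $\mathbb R^n$, assuming `numpy`:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                 # symmetric matrix, phi(v, w) = <Ax, y>

x = rng.standard_normal(n)        # coordinates of v
y = rng.standard_normal(n)        # coordinates of w

# f_i(v) = <x, e_i> = x_i  and  g_i(w) = <z_i, y> with z_i = A e_i (column i of A)
decomp = sum(x[i] * (A[:, i] @ y) for i in range(n))
assert np.isclose(decomp, x @ A @ y)   # equals phi(v, w) = <Ax, y>
```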