One way to think about what a tensor does is to ask how many arguments it must accept before it returns a scalar. What you have written above is a function that is linear in each of its $k$ arguments: feed it $k$ vectors and you get back a number. Often it is convenient to distinguish between row vectors and column vectors, for example when no natural inner product is present. In that case, instead of simply speaking of $k$-tensors, we can speak of $(i,j)$-tensors, where the tensor accepts $i$ row vectors and $j$ column vectors as arguments.
A matrix can be regarded as a $2$-tensor, or more specifically a $(1,1)$-tensor. If $M$ is the matrix and $v,w$ are vectors, then the matrix accepts two arguments: you write $v^TMw$, and this evaluates to a number.
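As a quick sanity check, the $(1,1)$-tensor picture of a matrix is easy to verify numerically; the matrix and vectors below are arbitrary illustrative choices:

```python
import numpy as np

# A matrix M viewed as a (1,1)-tensor: it eats a row vector and a
# column vector and returns a scalar via v^T M w.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([1.0, -1.0])    # plays the role of the row vector v^T
w = np.array([2.0, 5.0])     # the column vector

scalar = v @ M @ w           # v^T M w
print(scalar)                # prints -14.0

# Bilinearity: linear in each argument separately.
a, b = 3.0, -2.0
assert np.isclose((a * v) @ M @ w, a * (v @ M @ w))
assert np.isclose(v @ M @ (b * w), b * (v @ M @ w))
```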
Similarly, a vector $v$ is a $(1,0)$-tensor, since multiplying it on the left by a row vector yields a number.
If you do have an inner product, the process of using it to turn vectors into covectors, or vice versa, is called lowering or raising indices.
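Concretely, once an inner product is fixed by a symmetric positive-definite Gram matrix (the particular $g$ below is just a hypothetical example), lowering an index is matrix multiplication and raising it is multiplication by the inverse:

```python
import numpy as np

# Lowering the index of a vector v means forming the covector
# v_flat = g v, i.e. the functional  w -> <v, w>.
g = np.array([[2.0, 1.0],
              [1.0, 3.0]])      # Gram matrix of the inner product
v = np.array([1.0, 2.0])

v_flat = g @ v                  # the covector ("v with index lowered")
w = np.array([4.0, -1.0])

# The covector applied to w agrees with the inner product <v, w> = v^T g w.
assert np.isclose(v_flat @ w, v @ g @ w)

# Raising the index with g^{-1} recovers v.
v_back = np.linalg.solve(g, v_flat)
assert np.allclose(v_back, v)
```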
First note that if $k=1$, the alternation map $\pi(f) = \sum_{\sigma \in S_k} \text{sgn}(\sigma)(\sigma f)$ is the identity map and $\ker \pi = 0$.
So, suppose from here that $k>1$.
Definition. A tensor $f \in \mathcal{L}^k(V)$ is redundant if $f = \phi_1 \otimes \cdots \otimes \phi_k$ for $\phi_1,\dots,\phi_k \in \mathcal{L}^1(V)$ and $\phi_i = \phi_{i+1}$ for some $1 \leq i < k$.
Observe that if $f$ is redundant with $\phi_i = \phi_{i+1}$, and $e_i$ is the elementary permutation that exchanges $i$ and $i+1$, then $e_if = f$; since $\pi(e_i f) = \text{sgn}(e_i) \pi(f) = -\pi(f)$, this forces $\pi(f) = -\pi(f)$, and hence $\pi(f)=0$.
Definition. Let $\mathcal{I}^k(V)$ be the subspace of $\mathcal{L}^k(V)$ spanned by all redundant $k$-tensors.
Hence, $\ker \pi$ contains $\mathcal{I}^k(V)$.
We will prove that they are indeed equal.
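As a sketch of these definitions in code, the alternation $\pi$ can be implemented for tensors stored as NumPy arrays, and one can watch it kill a redundant tensor (the helper names below are my own):

```python
import numpy as np
from itertools import permutations

def sgn(p):
    # Sign of a permutation given as a tuple of 0-based values,
    # computed from the parity of the number of inversions.
    inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
    return -1 if inv % 2 else 1

def alt(F):
    # pi(F) = sum over sigma of sgn(sigma) * (sigma F); permuting the
    # arguments of F corresponds to permuting the axes of the array.
    k = F.ndim
    return sum(sgn(p) * np.transpose(F, axes=p) for p in permutations(range(k)))

# A redundant 3-tensor: phi1 (x) phi (x) phi has two equal adjacent factors.
rng = np.random.default_rng(0)
phi1, phi = rng.standard_normal(3), rng.standard_normal(3)
F = np.einsum('i,j,k->ijk', phi1, phi, phi)

print(np.allclose(alt(F), 0))   # prints True: pi kills redundant tensors
```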
Lemma. Given $f \in \mathcal{L}^k(V)$, for any $\sigma \in S_k$ there exists $g_\sigma \in \mathcal{I}^k(V)$ such that
$$
\sigma f = \text{sgn}(\sigma)f+g_\sigma.
$$
Proof: Since $\sigma$ can be written as a product of elementary permutations, we will prove the statement by induction on the number of factors.
- Suppose $\sigma=e_i$ for some $i$; since both sides are linear in $f$, we may assume without loss of generality that $f = \phi_1 \otimes \cdots \otimes \phi_k$ for some $\phi_1,\dots,\phi_k \in \mathcal{L}^1(V)$.
Then $$
\sigma f-\text{sgn}(\sigma)f = f_1 \otimes (\phi_i \otimes \phi_{i+1} + \phi_{i+1} \otimes \phi_i) \otimes f_2,
$$
where $f_1 = \phi_1 \otimes \cdots \otimes \phi_{i-1}$ and $f_2 = \phi_{i+2} \otimes \cdots \otimes \phi_k$.
Notice that the tensor $\phi_i \otimes \phi_{i+1} + \phi_{i+1} \otimes \phi_i$ belongs to $\mathcal{I}^2(V)$ since it can be written as $$
\tfrac12 [(\phi_i + \phi_{i+1}) \otimes (\phi_i + \phi_{i+1}) - \phi_i \otimes \phi_i - \phi_{i+1} \otimes \phi_{i+1}].
$$
It follows that $\sigma f-\text{sgn}(\sigma)f \in \mathcal{I}^k(V)$.
- For the inductive step, write $\sigma$ as $e \circ \tau$, where $e$ is an elementary permutation and $\tau$ is a product of fewer elementary permutations.
Then, by the base case and the inductive hypothesis, $$
g_1 := e(\tau f) - \text{sgn}(e)(\tau f) \in \mathcal{I}^k(V),
$$
$$
g_2 := \tau f - \text{sgn}(\tau)f \in \mathcal{I}^k(V).
$$
Hence \begin{align}
\sigma f = e(\tau f) &= \text{sgn}(e)(\tau f) + g_1 \\
&= \text{sgn}(e)(\text{sgn}(\tau)f + g_2) + g_1 \\
&= \text{sgn}(\sigma)f + (\text{sgn}(e)g_2+g_1).
\end{align}
Thus we may take $g_\sigma = \text{sgn}(e)g_2+g_1 \in \mathcal{I}^k(V)$, completing the induction.
Proposition. For any $f \in \mathcal{L}^k(V)$ there exists $g \in \mathcal{I}^k(V)$ such that
$$
\pi(f) = k!f+g.
$$
Proof. Indeed:
\begin{align}
\pi(f) &= \sum_{\sigma \in S_k} \text{sgn}(\sigma)(\sigma f) \\
&= \sum_{\sigma \in S_k} \text{sgn}(\sigma)\big(\text{sgn}(\sigma)f + g_\sigma\big) \\
&= \sum_{\sigma \in S_k} (f+\text{sgn}(\sigma)g_\sigma) = k!f + \sum_{\sigma \in S_k} \text{sgn}(\sigma)g_\sigma,
\end{align}
where each $g_\sigma \in \mathcal{I}^k(V)$ is given by the lemma, so $g = \sum_{\sigma \in S_k} \text{sgn}(\sigma)g_\sigma \in \mathcal{I}^k(V)$.
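One checkable consequence of the proposition: since $\pi(f) = k!f + g$ with $g \in \mathcal{I}^k(V) \subseteq \ker\pi$, applying $\pi$ once more kills $g$, so $\pi(\pi(f)) = k!\,\pi(f)$ for every $f$. A sketch of that check, with the same array implementation of $\pi$ as before:

```python
import math
import numpy as np
from itertools import permutations

def sgn(p):
    # Parity of the number of inversions of a 0-based permutation tuple.
    inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
    return -1 if inv % 2 else 1

def alt(F):
    # The alternation pi, without the 1/k! normalization, as in the text.
    k = F.ndim
    return sum(sgn(p) * np.transpose(F, axes=p) for p in permutations(range(k)))

rng = np.random.default_rng(3)
F = rng.standard_normal((3, 3, 3))      # an arbitrary 3-tensor, k = 3

# pi(pi(f)) = k! pi(f):
assert np.allclose(alt(alt(F)), math.factorial(3) * alt(F))
```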
Corollary. The kernel of $\pi$ is $\mathcal{I}^k(V)$.
Proof: We already know that $\mathcal{I}^k(V) \subseteq \ker \pi$. Conversely, if $f \in \ker \pi$, the proposition provides $g \in \mathcal{I}^k(V)$ with $0 = \pi(f) = k!f+g$, so
$$
f = -\frac1{k!}g \in \mathcal{I}^k(V).
$$
Reference:
Victor Guillemin and Peter J. Haine, Differential Forms. Section 1.5 (page 17).
Best Answer
$(f^\sigma)^\tau(v_1,\ldots,v_k)$ is $f^\sigma(v_{\tau(1)},\ldots,v_{\tau(k)})$, not $f^\tau(v_{\sigma(1)},\ldots,v_{\sigma(k)})$.
If we put $g=f^\sigma$ and $u_i=v_{\tau(i)}$, we should have
$$
\begin{aligned}
(f^\sigma)^\tau(v_1,\ldots,v_k) &= g^\tau(v_1,\ldots,v_k)\\
&= g(v_{\tau(1)},\ldots,v_{\tau(k)})\\
&= \color{red}{f^\sigma(u_1,\ldots,u_k)}\\
&= f(u_{\sigma(1)},\ldots,u_{\sigma(k)})\\
&= f(v_{\tau(\sigma(1))},\ldots,v_{\tau(\sigma(k))})\\
&= f^{\tau\circ\sigma}(v_1,\ldots,v_k).
\end{aligned}
$$
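The composition rule $(f^\sigma)^\tau = f^{\tau\circ\sigma}$ can also be checked numerically. One subtlety when a tensor is stored as an array: realizing $f^\sigma$ as an axis permutation uses the *inverse* of $\sigma$. The helper below is a sketch with 0-based permutations:

```python
import numpy as np

def act(F, sigma):
    # f^sigma(v_1,...,v_k) = f(v_{sigma(1)},...,v_{sigma(k)}).
    # On the array this is a transpose by the inverse permutation
    # (np.argsort of a permutation tuple is its inverse).
    return np.transpose(F, axes=np.argsort(sigma))

rng = np.random.default_rng(1)
F = rng.standard_normal((2, 2, 2))               # a random 3-tensor
sigma = (1, 2, 0)                                # 0-based permutations
tau = (1, 0, 2)

tau_after_sigma = tuple(tau[s] for s in sigma)   # (tau o sigma)(i) = tau(sigma(i))
sigma_after_tau = tuple(sigma[t] for t in tau)   # the wrong order, for contrast

# (f^sigma)^tau = f^{tau o sigma}, as derived above...
assert np.allclose(act(act(F, sigma), tau), act(F, tau_after_sigma))
# ...and the order genuinely matters for a generic tensor:
assert not np.allclose(act(act(F, sigma), tau), act(F, sigma_after_tau))
```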