I would like to give some details in order to make clear that one can give a proof with hardly any computations at all. (I have never looked at the Bourbaki presentation, but I would guess they make the same point; since they want all proofs to depend only on previous material, they might make a few more computations.)
To begin with, we are considering supercommutative algebras (over a commutative base ring $R$) which are strictly commutative, i.e., $\mathbb Z/2$-graded algebras with $xy=(-1)^{mn}yx$, where $m$ and $n$ are the degrees of $x$ and $y$, and with $x^2=0$ if $x$ has odd degree. Note that a base extension of such an algebra is of the same type (the most computational part of this verification concerns strictness, and uses the fact that on odd elements $x^2$ is a quadratic form with trivial associated bilinear form). The exterior algebra is then the free strictly supercommutative algebra (sca, for short) on the module $V$. Furthermore, the (graded) tensor product of two sca's is an sca, and from that it follows formally that $\Lambda^\ast(U\oplus V)=\Lambda^\ast U\otimes\Lambda^\ast V$.
Now, the diagonal map $U\to U\oplus U$ induces a coproduct on $\Lambda^\ast U$ which by functoriality is cocommutative, making the exterior algebra into a superbialgebra. I also want to remark that if $U$ is f.g. projective then so is $\Lambda^\ast U$. Indeed, by presenting $U$ as a direct factor in a free f.g. module one reduces to the case when $U$ is free, and then, by the fact that direct sums are taken to tensor products, to the case when $U=R$; but in that case it is clear that $\Lambda^\ast U=R\oplus R\delta$, where $\delta$ is an odd element of square $0$.
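The rank count implicit in this reduction can be checked mechanically: since $\Lambda^\ast$ turns direct sums into tensor products and $\Lambda^\ast R=R\oplus R\delta$, the Poincaré polynomial of $\Lambda^\ast(R^n)$ is $(1+t)^n$, so $\Lambda^k(R^n)$ has rank $\binom{n}{k}$. A minimal sketch in Python (the function name is mine):

```python
import math

def poincare_rank_free(n):
    # Poincare polynomial of Lambda^*(R^n): start from Lambda^*(R) = R + R*delta,
    # whose Poincare polynomial is 1 + t, and use that
    # Lambda^*(U + V) = Lambda^*(U) (x) Lambda^*(V), i.e. the polynomials multiply.
    poly = [1]
    for _ in range(n):
        # multiply the current polynomial by (1 + t)
        poly = [(poly[i] if i < len(poly) else 0) + (poly[i - 1] if i >= 1 else 0)
                for i in range(len(poly) + 1)]
    return poly

# Coefficient of t^k is the rank of Lambda^k(R^n): the binomial coefficient C(n, k).
assert poincare_rank_free(5) == [math.comb(5, k) for k in range(6)]
```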
If now $U$ is still f.g. projective, then $(\Lambda^\ast U^\ast)^\ast$ is also a supercommutative and supercocommutative superbialgebra. We need to know that it is strictly supercommutative. This can be done by a calculation, but we want to get by with as much handwaving as possible. We can again reduce to the case when $U$ is free. After that we can reduce to the case when $U=R$ (using that the tensor product of sca's is an sca), and there it is clear (and one can even in that case avoid computations). Another way, once we are in the free case, is to use the base change property to reduce to the case when $R=\mathbb Z$; then we have seen that the exterior algebra is torsion free, and a torsion free supercommutative algebra is an sca.
We also need to know that if $U$ is f.g. projective, then the structure map $U\to\Lambda^\ast U$ has a canonical splitting. This is most easily seen by introducing a $\mathbb Z$-grading extending the $\mathbb Z/2$-grading and showing that the degree $1$ part of it is exactly $U$ (maybe it would have been better to work with $\mathbb Z$-gradings from the start).
This splitting gives us a map $U\to(\Lambda^\ast U^\ast)^\ast$ into the odd part, and as the target is an sca we get a map $\Lambda^\ast U\to(\Lambda^\ast U^\ast)^\ast$ of sca's. This map is an isomorphism. Indeed, we first reduce to the case when $U$ is free (this is a little bit tricky if one wants to present $U$ as a direct factor of a free module, but one may also use base change compatibility to reduce to the case when $R$ is local, in which case $U$ is always free). Then as before one reduces to $U=R$, where it is clear, as both sides are concentrated in degrees $0$ and $1$ and the map is an isomorphism in degree $0$ obviously and in degree $1$ by construction.
Note that everything works exactly the same if one works with ordinary commutativity (so that the exterior algebra is replaced by the symmetric one), up to the verification in the case $U=R$. In that case the dual algebra is the divided power algebra, and we only have an isomorphism in characteristic $0$.
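For a concrete sense of why characteristic matters here: in the divided power algebra on one even generator, $\gamma_a\gamma_b=\binom{a+b}{a}\gamma_{a+b}$, so $\gamma_1^k=k!\,\gamma_k$, and the natural map from the symmetric algebra hits $\gamma_k$ only after inverting $k!$. A small check (illustrative code, names mine):

```python
import math

# Divided power algebra on one even generator over Z:
# gamma_a * gamma_b = C(a+b, a) * gamma_{a+b}.
def dp_mult(a_coeff, a_deg, b_coeff, b_deg):
    return (a_coeff * b_coeff * math.comb(a_deg + b_deg, a_deg), a_deg + b_deg)

# Compute gamma_1^k, the image of x^k under Sym(R) -> Gamma(R), x -> gamma_1.
k = 5
coeff, deg = 1, 0
for _ in range(k):
    coeff, deg = dp_mult(coeff, deg, 1, 1)

# gamma_1^k = k! * gamma_k, so the map is an isomorphism only when k! is invertible.
assert (coeff, deg) == (math.factorial(k), k)
```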
Afterthought: The fact that things work for the exterior algebra but introduce divided powers in the symmetric case has an intriguing twist. One can define divided power superalgebras where all higher divided powers of odd elements are defined to be zero. With that definition the exterior algebra has a canonical divided power structure (for a projective module), which is characterised by being natural and commuting with base change (or by being compatible with tensor products; just as for ordinary divided powers, there is a natural divided power structure on the tensor product of two divided power algebras). Hence, somehow the fact that the exterior algebra is self-dual is connected with the fact that the exterior algebra is also the free divided power superalgebra on odd elements. However, I do not know of any a priori reason why the dual algebra, in either the symmetric or the exterior case, should have a divided power structure.
As a curious aside, the divided power structure on the exterior algebra is related to the Riemann-Roch formula for abelian varieties: if $L$ is a line bundle on a $g$-dimensional abelian variety, the RR formula says that
$$
\chi(L) = \frac{L^g}{g!}
$$
and this gives (as usual for Atiyah-Singer index formulas) also the integrality statement that $L^g$ is divisible by $g!$. However, that follows directly from the fact that the cohomology of an abelian variety is the exterior algebra on $H^1$ and the fact that the exterior algebra has a divided power structure.
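One can verify the underlying divisibility phenomenon directly: for odd classes $x_i, y_i$, the even element $c=\sum_{i=1}^g x_i y_i$ satisfies $c^g = g!\,x_1y_1\cdots x_gy_g$, which is exactly the statement that $c^g$ is divisible by $g!$ in the exterior algebra over $\mathbb Z$. A small Grassmann-algebra calculation in Python (the encoding and function name are my own):

```python
import math

def wedge(a, b):
    # a, b: elements of the exterior algebra on e_0, ..., e_{2g-1} over Z,
    # encoded as dicts: frozenset of generator indices -> integer coefficient.
    out = {}
    for sa, ca in a.items():
        for sb, cb in b.items():
            if sa & sb:
                continue                     # repeated generator: term is zero
            # Koszul sign for sorting the concatenated monomial e_sa e_sb:
            # one -1 per pair i in sa, j in sb with i > j.
            sign = (-1) ** sum(1 for i in sa for j in sb if i > j)
            key = sa | sb
            out[key] = out.get(key, 0) + sign * ca * cb
    return out

g = 4
# c = sum_i x_i y_i with x_i = e_{2i}, y_i = e_{2i+1} (a degree-2, hence even, class)
c = {frozenset({2 * i, 2 * i + 1}): 1 for i in range(g)}
p = {frozenset(): 1}
for _ in range(g):
    p = wedge(p, c)

# c^g = g! * x_1 y_1 ... x_g y_g: the coefficient of the top class is exactly g!
assert p[frozenset(range(2 * g))] == math.factorial(g)
```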
Let's just view the function $\dot{X}(t) = \frac{d}{dt} X_t$ as a bounded function taking values in the vector space $V$. The notation $dX_{u_1}$ means $\dot{X}(u_1)\, du_1$. Then $dX_{u_1} \otimes \cdots \otimes dX_{u_k} = \dot{X}(u_1) \otimes \cdots \otimes \dot{X}(u_k)\, du_1 \cdots du_k$ should be regarded as a bounded function (or measure) on $[0,t]^k$ taking values in the vector space $V^{\otimes k}$.
Because the operation of projecting to the symmetric part is a linear operation from $V^{\otimes k}$ to itself, you can take it inside of the integral. Let's call this symmetric part $dX_{u_1} \cdots dX_{u_k}$. I'll use the symbol $u \cdot v$ to denote the symmetric product of $u, v \in V$ -- that is, $u \cdot u \cdot u = u \otimes u \otimes u$, and the general product is defined by the polarization identity. For example, $u \cdot v = (u \otimes v + v \otimes u)/2$. The resulting multiplication is commutative.
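The symmetrization operator in question can be written out concretely: average a tensor over all permutations of its indices. A toy implementation over a fixed basis (the dict representation is mine):

```python
import math
from fractions import Fraction
from itertools import permutations

def sym(tensor):
    # tensor: dict mapping an index tuple (i_1, ..., i_k) -> coefficient,
    # representing a sum of coeff * e_{i_1} (x) ... (x) e_{i_k}.
    # Project onto the symmetric part by averaging over index permutations.
    out = {}
    for idx, c in tensor.items():
        share = Fraction(c, math.factorial(len(idx)))
        for p in permutations(idx):
            out[p] = out.get(p, Fraction(0)) + share
    return out

# u . v = (u (x) v + v (x) u) / 2 for basis vectors u = e_0, v = e_1:
assert sym({(0, 1): 1}) == {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 2)}
# u . u . u = u (x) u (x) u:
assert sym({(0, 0, 0): 1}) == {(0, 0, 0): Fraction(1)}
```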
Therefore your integral is equal to
$\int_{0 \leq u_1 \leq \ldots \leq u_k \leq t} dX_{u_1} \cdots dX_{u_k}$
Because the symmetric product is commutative, for any permutation $\sigma$ of $\{ 1 \ldots k\}$ you get the same value by integrating any of the permuted regions
$ \int_{0 \leq u_1 \leq \ldots \leq u_k \leq t} dX_{u_1} \cdots dX_{u_k} = \int_{0 \leq u_{\sigma(1)} \leq \ldots \leq u_{\sigma(k)} \leq t} dX_{u_1} \cdots dX_{u_k}$
Observe also that the region $0 \leq u_1 \leq \ldots \leq u_k \leq t$ is a fundamental domain for the action of $S_k$ on the cube $[0,t]^k$ -- that is, you can get the whole cube by permuting the variables $u_i$, and none of these regions overlap (except in measure zero). Therefore, since there are $k!$ such permutations, we sum over permutations to conclude
$ k! \int_{0 \leq u_1 \leq \ldots \leq u_k \leq t} dX_{u_1} \cdots dX_{u_k} = \int_{[0,t]^k} dX_{u_1} \cdots dX_{u_k}$
But, by Fubini and the multilinearity of the symmetric product, the integral on the right is just $(X_t - X_0)^k$.
I should remark that this same proof in an even simpler setting also produces the $\frac{1}{k!}$ that appears when you prove the Taylor expansion through iterative use of the fundamental theorem of calculus.
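To see the same $1/k!$ emerge from iterated integration alone: the volume of the ordered region $\{0 \le u_1 \le \cdots \le u_k \le t\}$, computed by the fundamental theorem of calculus one variable at a time, is $t^k/k!$. A quick exact check (the helper name is mine):

```python
import math
from fractions import Fraction

def integrate_poly(coeffs):
    # \int_0^t p(u) du for a polynomial p given by its coefficient list [c0, c1, ...]
    return [Fraction(0)] + [Fraction(c, i + 1) for i, c in enumerate(coeffs)]

# Volume of {0 <= u_1 <= ... <= u_k <= t} by iterated integration of the constant 1.
k = 6
p = [Fraction(1)]
for _ in range(k):
    p = integrate_poly(p)

# The result is the single monomial t^k / k!.
assert p == [Fraction(0)] * k + [Fraction(1, math.factorial(k))]
```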
Best Answer
$n(n-1)\cdots(n-(k-1))$ is the number of injective functions from a set of size $k$ to a set of size $n$. We can count these using inclusion-exclusion: first include all such functions, of which there are $n^k$. Then, for each transposition $(ij)$ in $S_k$, exclude all the functions such that $f(i) = f(j)$, of which there are $n^{k-1}$. And so forth. This alternating sum cancels out all the non-injective functions, so the only ones left are the injective ones.
There's a much easier way to prove an equivalent identity, which is
$$\frac{1}{k!} \sum_{\pi \in S_k} n^{\text{cyc}(\pi)} = {n+k-1 \choose k}.$$
This identity is equivalent because the sign of a permutation is determined by the parity of its number of cycles, and it corresponds to replacing "antisymmetric" by "symmetric" everywhere in your question. But this identity has an obvious proof by Burnside's lemma: the LHS and RHS both count the number of orbits of functions $[k] \to [n]$ under permutation of the domain. (This is a special case of a result I call the baby Polya theorem.)
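Both the signed and unsigned identities are easy to confirm by brute force for small $n$ and $k$: with $\operatorname{sgn}(\pi)=(-1)^{k-\text{cyc}(\pi)}$, the signed sum over $S_k$ gives the falling factorial $n(n-1)\cdots(n-k+1)$, while the unsigned sum gives $k!\binom{n+k-1}{k}$. A sketch (function names mine):

```python
import math
from itertools import permutations

def cyc(perm):
    # Number of cycles of a permutation of {0, ..., k-1} given as a tuple
    # (perm[i] is the image of i).
    seen, count = set(), 0
    for i in range(len(perm)):
        if i not in seen:
            count += 1
            j = i
            while j not in seen:
                seen.add(j)
                j = perm[j]
    return count

n, k = 7, 4
signed = sum((-1) ** (k - cyc(p)) * n ** cyc(p) for p in permutations(range(k)))
unsigned = sum(n ** cyc(p) for p in permutations(range(k)))

# Signed sum: the number of injective functions [k] -> [n].
assert signed == math.prod(n - i for i in range(k))        # n(n-1)...(n-k+1)
# Unsigned sum: k! times the number of orbits (multisets), by Burnside.
assert unsigned == math.factorial(k) * math.comb(n + k - 1, k)
```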
Both identities are in turn a special case of the exponential formula, which one can state as a generating function identity for the cycle index polynomials of the symmetric groups. I explain some of this here. The relevant specializations are
$$\frac{1}{(1 - t)^n} = \exp \left( nt + \frac{nt^2}{2} + \frac{nt^3}{3} + \cdots \right)$$
and
$$(1 + t)^n = \exp \left( nt - \frac{nt^2}{2} + \frac{nt^3}{3} - \cdots \right).$$
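These specializations can be checked coefficient by coefficient: exponentiating the series on the right formally must reproduce the coefficients $\binom{n+j-1}{j}$ and $\binom{n}{j}$ respectively. A rough verification with exact rational arithmetic (the helper name is mine):

```python
import math
from fractions import Fraction

def exp_series(f, order):
    # Formal exp of a power series f (coefficient list, with f[0] == 0),
    # truncated at the given order: sum of f^j / j! for j = 0..order.
    out = [Fraction(0)] * (order + 1)
    out[0] = Fraction(1)
    term = out[:]                           # f^0 / 0!
    for j in range(1, order + 1):
        new = [Fraction(0)] * (order + 1)
        for a in range(order + 1):
            for b in range(order + 1 - a):
                new[a + b] += term[a] * f[b]
        term = [c / j for c in new]         # f^j / j!
        for i in range(order + 1):
            out[i] += term[i]
    return out

n, order = 3, 6
plus = [Fraction(0)] + [Fraction(n, i) for i in range(1, order + 1)]
minus = [Fraction(0)] + [Fraction((-1) ** (i + 1) * n, i) for i in range(1, order + 1)]

# 1/(1-t)^n has coefficients C(n+j-1, j); (1+t)^n has coefficients C(n, j).
assert exp_series(plus, order) == [Fraction(math.comb(n + j - 1, j)) for j in range(order + 1)]
assert exp_series(minus, order) == [Fraction(math.comb(n, j)) for j in range(order + 1)]
```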