In a $2n$-dimensional space, a block matrix of the form
$$
\begin{pmatrix} 0 & A^\ast \\ A & 0 \end{pmatrix}
$$
has $n$ positive and $n$ negative eigenvalues; they are plus or minus the singular values of $A$ (meaning the eigenvalues of $|A|=(A^\ast A)^{1/2}$).
(This fact is in Bhatia's matrix analysis book.)
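This is not part of the argument below, but the fact is easy to illustrate numerically; a quick sanity check (a sketch using numpy, with a random complex $A$ of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Block matrix [[0, A*], [A, 0]] -- Hermitian, hence real eigenvalues.
M = np.block([[np.zeros((n, n)), A.conj().T],
              [A, np.zeros((n, n))]])

eigs = np.sort(np.linalg.eigvalsh(M))
s = np.linalg.svd(A, compute_uv=False)       # singular values of A
expected = np.sort(np.concatenate([-s, s]))  # plus/minus the singular values

assert np.allclose(eigs, expected)
```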
I would like to give some details in order to make clear that one can give a proof with hardly any computations at all. (I have never looked at the Bourbaki presentation, but I would guess they make the same point; though, since they want all proofs to depend only on previous material, they might make a few more computations.)
To begin with, we are considering supercommutative algebras (over a commutative base ring $R$) which are strictly commutative (sca's for short), i.e., $\mathbb Z/2$-graded algebras with $xy=(-1)^{mn}yx$, where $m$ and $n$ are the degrees of $x$ and $y$, and with $x^2=0$ if $x$ has odd degree. Note that any base extension of such an algebra is of the same type (the most computational part of the verification concerns strictness, which uses that on odd elements $x^2$ is a quadratic form with trivial associated bilinear form). The exterior algebra is then the free strictly supercommutative algebra on the module $V$. Furthermore, the (graded) tensor product of two sca's is an sca, and from that it follows formally that $\Lambda^\ast(U\oplus V)=\Lambda^\ast U\otimes\Lambda^\ast V$.
Now, the diagonal map $U\to U\oplus U$ induces a coproduct on $\Lambda^\ast U$, which by functoriality is cocommutative, making the exterior algebra into a superbialgebra. I also want to remark that if $U$ is f.g. projective then so is $\Lambda^\ast U$. Indeed, by presenting $U$ as a direct factor in a free f.g. module one reduces to the case when $U$ is free, and then, by the fact that direct sums are taken to tensor products, to the case when $U=R$; but in that case it is clear that $\Lambda^\ast U=R\oplus R\delta$, where $\delta$ is an odd element of square $0$.
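To spell out the rank-one case that these reductions keep landing on, everything is completely explicit:

```latex
\Lambda^\ast R \;=\; R \oplus R\delta, \qquad \delta^2 = 0,
\qquad
\Delta(\delta) \;=\; \delta\otimes 1 + 1\otimes\delta .
```

The formula for the coproduct comes from the diagonal, $\delta\mapsto(\delta,\delta)$, read through the identification $\Lambda^\ast(R\oplus R)=\Lambda^\ast R\otimes\Lambda^\ast R$.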
If now $U$ is still f.g. projective, then $(\Lambda^\ast U^\ast)^\ast$ is also a supercommutative and supercocommutative superbialgebra. We need to know that it is strictly supercommutative. This can be done by a calculation, but we want to get by with as much handwaving as possible. We can again reduce to the case when $U$ is free. After that we can reduce to the case when $U=R$ (using that the tensor product of sca's is an sca), and there it is clear (and one can even in that case avoid computations). Another way, once we are in the free case, is to use the base change property to reduce to the case when $R=\mathbb Z$; then we have seen that the exterior algebra is torsion free, and a torsion free supercommutative algebra is an sca.
We also need to know that if $U$ is f.g. projective, then the structure map $U\to\Lambda^\ast U$ has a canonical splitting. This is most easily seen by introducing a $\mathbb Z$-grading extending the $\mathbb Z/2$-grading and showing that the degree $1$ part is exactly $U$ (maybe it would be better to work with $\mathbb Z$-gradings from the start).
This splitting gives us a map $U\to(\Lambda^\ast U^\ast)^\ast$ into the odd part, and as the target is an sca we get a map $\Lambda^\ast U\to(\Lambda^\ast U^\ast)^\ast$ of sca's. This map is an isomorphism. Indeed, we first reduce to the case when $U$ is free (this is a little bit tricky if one wants to present $U$ as a direct factor of a free module, but one may also use base change compatibility to reduce to the case when $R$ is local, in which case $U$ is always free). Then as before one reduces to $U=R$, where it is clear, as both sides are concentrated in degrees $0$ and $1$, and the map is an iso in degree $0$ trivially and in degree $1$ by construction.
Note that everything works exactly the same if one works with ordinary commutativity (so that the exterior algebra is replaced by the symmetric one) up till the verification in the case when $U=R$. In that case the dual algebra is the divided power algebra and we only have an isomorphism in characteristic $0$.
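The appearance of divided powers in the symmetric case can be verified with a small self-contained computation (plain Python; the dictionary representation of tensors is my own device, not anything from the text). The product on the dual of $R[x]$ is dual to the coproduct $\Delta(x^n)=\sum_k \binom{n}{k}\,x^k\otimes x^{n-k}$, which forces $\gamma_a\gamma_b=\binom{a+b}{a}\gamma_{a+b}$ on the dual basis, i.e. the divided power algebra:

```python
from math import comb

def tensor_mult(p, q):
    """Multiply two elements of R[x] (x) R[x], stored as {(i, j): coeff}."""
    out = {}
    for (i1, j1), c1 in p.items():
        for (i2, j2), c2 in q.items():
            key = (i1 + i2, j1 + j2)
            out[key] = out.get(key, 0) + c1 * c2
    return out

def coproduct_of_power(n):
    """Delta(x^n) = (x (x) 1 + 1 (x) x)^n, since Delta is an algebra map."""
    delta_x = {(1, 0): 1, (0, 1): 1}
    result = {(0, 0): 1}
    for _ in range(n):
        result = tensor_mult(result, delta_x)
    return result

# (gamma_a * gamma_b)(x^n) is the coefficient of x^a (x) x^b in Delta(x^n):
# nonzero only for n = a + b, where it equals C(a+b, a).  So the dual of
# the polynomial algebra is the divided power algebra.
for a in range(5):
    for b in range(5):
        d = coproduct_of_power(a + b)
        assert d[(a, b)] == comb(a + b, a)
```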
Afterthought: The fact that everything works for the exterior algebra while the symmetric case introduces divided powers has an intriguing twist. One can define divided power superalgebras in which all higher divided powers of odd elements are defined to be zero. With that definition the exterior algebra (of a projective module) has a canonical divided power structure, characterised by being natural and commuting with base change (or by being compatible with tensor products; just as for ordinary divided powers, there is a natural divided power structure on the tensor product of two divided power algebras). Hence, somehow the fact that the exterior algebra is self-dual is connected with the fact that the exterior algebra is also the free divided power superalgebra on odd elements. However, I do not know of any a priori reason why the dual algebra, in both the symmetric and the exterior case, should have a divided power structure.
As a curious aside, the divided power structure on the exterior algebra is related to the Riemann–Roch formula for abelian varieties: if $L$ is a line bundle on an abelian variety of dimension $g$, the RR formula says that
$$
\chi(L) = \frac{L^g}{g!}
$$
and this gives (as usual for Atiyah–Singer index formulas) also the integrality statement that $L^g$ is divisible by $g!$. However, that follows directly from the fact that the cohomology of an abelian variety is the exterior algebra on $H^1$ and the fact that the exterior algebra has a divided power structure.
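The divisibility can also be seen concretely in a toy model. The following sketch (the tiny exterior-algebra implementation and all names are mine) computes $c^g$ for $c=\sum_i x_i\wedge y_i$, a degree-$2$ class in the exterior algebra on $2g$ generators, and checks that its only nonzero coefficient is exactly $g!$ times the top class:

```python
from math import factorial

def wedge_monomials(a, b):
    """Wedge of basis monomials given as sorted index tuples.

    Returns (sign, sorted tuple), or None if an index repeats."""
    if set(a) & set(b):
        return None
    arr = list(a + b)
    sign = 1
    for i in range(len(arr)):          # sign = parity of the sorting permutation
        for j in range(i + 1, len(arr)):
            if arr[i] > arr[j]:
                sign = -sign
    return sign, tuple(sorted(arr))

def wedge(p, q):
    """Wedge of elements stored as {index-tuple: coeff}."""
    out = {}
    for a, ca in p.items():
        for b, cb in q.items():
            r = wedge_monomials(a, b)
            if r is None:
                continue
            s, key = r
            out[key] = out.get(key, 0) + s * ca * cb
    return out

g = 4
# c = sum_i x_i ^ y_i, with x_i = e_{2i}, y_i = e_{2i+1}; H^1 has rank 2g.
c = {(2 * i, 2 * i + 1): 1 for i in range(g)}
power = {(): 1}
for _ in range(g):
    power = wedge(power, c)

top = tuple(range(2 * g))
assert power == {top: factorial(g)}    # c^g = g! * (top class), so g! | c^g
```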
Best Answer
Here's another counter-example, taken from Pressley and Segal's Loop Groups (p. 128).
Consider the space of continuous functions on the circle and define
$$ f(\theta) = \sum_{k \gt 1} \frac{\sin k \theta}{k \log k} $$
The positive-frequency part of this function is
$$ f_+(\theta) = \frac{1}{2 i} \sum_{k \gt 1} \frac{e^{i k \theta}}{k \log k} $$
which is unbounded near $\theta = 0$.
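The unboundedness is easy to see numerically: at $\theta = 0$ every term of the series for $f_+$ has the same phase and modulus $1/(k\log k)$, and $\sum 1/(k\log k)$ diverges (like $\log\log N$), while $f$ itself stays continuous. A sketch (numpy; the checkpoints are my own arbitrary choices):

```python
import numpy as np

# Partial sums of sum_{k >= 2} 1/(k log k): this is |2 f_+(0)| truncated
# at N terms, and it grows without bound (like log log N).
ks = np.arange(2, 10**6)
terms = 1.0 / (ks * np.log(ks))
partial = np.cumsum(terms)

s4 = partial[10**4 - 2]   # partial sum up to k = 10^4
s5 = partial[10**5 - 2]   # ... up to k = 10^5
s6 = partial[-1]          # ... up to k = 10^6 - 1

assert s4 < s5 < s6                      # still growing
assert s5 - s4 > 0.1 and s6 - s5 > 0.1   # each extra decade keeps adding mass
```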
Let's move on to the other part of your question: when can we ensure that the splitting exists? What you are asking for is that $V$ be the direct sum of two subspaces, $V_-$ and $V_+$. Of course, if you start with two abstract vector spaces, $V_-$ and $V_+$, both of which admit symmetric, positive definite bilinear forms, say $b_-$ and $b_+$ respectively, then you can construct an example by taking $V = V_- \oplus V_+$ and taking $-b_- + b_+$. This shows that you can get this situation to work with quite awful spaces, but the point is that all the awfulness of $V$ divides nicely into awfulness of $V_-$ plus awfulness of $V_+$.
Presumably, though, you are more interested in the case where you start with $V$ and the quadratic form. Maybe this quadratic form can be fairly arbitrary (perhaps varying in some space of quadratic forms). In this situation, you would want conditions on $V$ that guarantee that the splitting occurs without too much fuss.
Let's examine the question from the other end: suppose that $V = V_- \oplus V_+$. Then by changing the sign of the form on $V_-$, we obtain a positive-definite symmetric bilinear form on $V$. This usually goes by the name of an inner product, as we're over $\mathbb{R}$. So the problem reduces to finding complements of subspaces in inner product spaces. To guarantee this, you want completeness. Then your bilinear form is related to the original inner product by the operator $2P_+ - I$, where $P_+$ is the orthogonal projection onto $V_+$.
So what you want is to be working with a Hilbert space and the space of self-adjoint square-roots of the identity.
As I said, this isn't an "if and only if". But it is a simple condition that quite often holds. It can be further relaxed since it's enough that the inner product induced by the bilinear form and the original inner product be merely equivalent rather than equal, but I'll leave those details as an exercise.
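As a finite-dimensional illustration of this correspondence (all names and dimensions here are my own choices, not anything from the question): starting from an orthogonal projection $P_+$, the operator $J = 2P_+ - I$ is a self-adjoint square root of the identity, and $b(u,v) = \langle Ju, v\rangle$ is positive on $V_+$ and negative on $V_- = V_+^\perp$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 2    # dim V = 6, dim V_+ = 2: a finite-dimensional stand-in

# Orthogonal projection P onto a random p-dimensional subspace V_+.
X = rng.normal(size=(n, p))
Q, _ = np.linalg.qr(X)           # orthonormal basis of V_+
P = Q @ Q.T

J = 2 * P - np.eye(n)            # candidate square root of the identity
assert np.allclose(J, J.T)       # self-adjoint
assert np.allclose(J @ J, np.eye(n))

# b(u, v) = <J u, v> is the indefinite form: positive definite on V_+,
# negative definite on the orthogonal complement V_-.
v_plus = P @ rng.normal(size=n)
v_minus = (np.eye(n) - P) @ rng.normal(size=n)
assert v_plus @ J @ v_plus > 0
assert v_minus @ J @ v_minus < 0
```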
Edit: From your other question related to this it seems as though you are particularly interested in the case where one of the factors is finite dimensional. In that case, the splitting always holds.