We are asking: what is the set of all matrices $A$ satisfying the conditions $\Gamma(A) \cap Q = \{0\}$ and $\Gamma(A) \cap Q' = \{0\}$?
Since $V = P \oplus Q$ and $A\colon P \to Q$ is a linear map, any matrix $A_{(n-k)\times k}$ satisfies the first condition.
Let $p_1, \ldots, p_k, q_1, \ldots , q_{n-k}$ be the basis of $V$ obtained by combining bases of $P$ and $Q$.
Let $M$ be the following matrix corresponding to the matrix $A$: $M=\left(\begin{matrix} 0_{k \times k} & 0_{k \times (n-k)} \\ A_{(n-k)\times k} & 0_{(n-k)\times (n-k)} \end{matrix} \right)$. In fact, $M\colon V \to V$ is the linear map that extends $A$ from the domain $P$ to all of $V$. This extension lets us work with square matrices in what follows.
A vector $x$ belongs to the set $\Gamma(A) \cap Q'$ if and only if $Mx + Ix = Bc$ and $Ix = Cd$, where the columns of the matrix $B$ (resp. $C$) are the basis vectors of $Q'$ (resp. $P$), and $c$ and $d$ are the vectors of coefficients of the linear combinations.
This is a system of $2n$ linear equations; the entries of the vectors $x, c, d$ are the unknowns.
The system can be formally written as:
$$ \left(\begin{matrix} (M+I)_{n \times n} & -B_{n \times (n-k)} & 0_{n \times k} \\ I_{n\times n} & 0_{n\times (n-k)} & -C_{n \times k} \end{matrix} \right)\left( \begin{matrix} x_{n \times 1} \\ c_{(n-k)\times 1} \\ d_{k \times 1} \end{matrix} \right) = 0$$
This system is required to have only the trivial solution, so the determinant of the $2n \times 2n$ matrix must be nonzero.
Since the matrix $ \left(\begin{matrix} M+I & -B & 0 \\ I & 0 & -C \end{matrix} \right)$ consists of constants (the entries of $I, B, C$, and zeroes) and variables (the entries of the matrix $A$, resp. $M$), and the determinant function is continuous, the set of all sought matrices $A$ is open. (It is the preimage of $\mathbb{R}\setminus\{0\}$ under this determinant function.)
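As a sanity check (a minimal numeric sketch of my own, not part of the original argument), take $n=2$, $k=1$, $P=\operatorname{span}(e_1)$, $Q=\operatorname{span}(e_2)$, and $Q'=\operatorname{span}(e_1+e_2)$, so $A$ is a single scalar $a$. The graph $\{(t, at)\}$ meets $Q'$ nontrivially exactly when $a=1$, and the $4\times 4$ determinant above vanishes exactly there:

```python
from fractions import Fraction

def det(m):
    """Determinant by Laplace expansion along the first row (fine for tiny matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def system_det(a):
    """Determinant of the 2n x 2n block matrix [[M+I, -B, 0], [I, 0, -C]]
    for n=2, k=1, P=span(e1), Q=span(e2), Q'=span(e1+e2), A=[a]."""
    a = Fraction(a)
    # M+I = [[1,0],[a,1]], B = column (1,1), C = column (1,0)
    return det([
        [1, 0, -1,  0],   # (M+I)x - Bc = 0, first row
        [a, 1, -1,  0],   # (M+I)x - Bc = 0, second row
        [1, 0,  0, -1],   # Ix - Cd = 0, first row
        [0, 1,  0,  0],   # Ix - Cd = 0, second row
    ])

print(system_det(2))  # -1, nonzero: the graph of a=2 misses Q'
print(system_det(1))  # 0: the graph of a=1 is Q' itself
```

Here the determinant works out to $1-a$, so the "bad" set $\{a=1\}$ is closed and its complement, the set of admissible $A$, is open, as claimed.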
To decide between the Grassmann and Stiefel manifolds, it helps to understand the difference between them. In the following I will only look at the case of the real vector space $\mathbb{R}^n$. Much information can be found at https://en.wikipedia.org/wiki/Stiefel_manifold and https://en.wikipedia.org/wiki/Grassmannian.
The Grassmannian $Gr(k,n)$ is the (compact) manifold of all $k$-dimensional linear subspaces of $\mathbb{R}^n$; the one-dimensional case $k=1$ is real projective space. The Stiefel manifold $V_k(n)$ is the (compact) manifold of all (orthogonal) $k$-frames (hereafter we leave out "orthogonal", which is always implied). A $k$-frame is a set of $k$ orthonormal vectors in $\mathbb{R}^n$, so $V_k(n)$ can alternatively be described as the manifold of $n\times k$ matrices with orthonormal columns.
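A quick dimension count also separates the two spaces (standard facts, added here for orientation):

```latex
\dim Gr(k,n) = k(n-k), \qquad
\dim V_k(n) = nk - \frac{k(k+1)}{2}, \qquad
\dim V_k(n) - \dim Gr(k,n) = \frac{k(k-1)}{2} = \dim O(k).
```

The difference is the dimension of the orthogonal group $O(k)$, which acts on the frames of a fixed subspace. For $k=1$ both dimensions equal $n-1$, consistent with the sphere and projective space having the same dimension.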
Let us compare the definitions for a few low-dimensional cases:
- $Gr(1,n)$ and $V_1(n)$: $Gr(1,n)$ is the set of all lines through the origin, while $V_1(n)$ is the set of all unit vectors, i.e. the unit sphere. To each point in $Gr(1,n)$ there correspond two (antipodal) unit vectors in $V_1(n)$.
- $Gr(2,n)$ and $V_2(n)$: $Gr(2,n)$ is the set of all 2-planes (through the origin), while $V_2(n)$ is the set of all 2-frames in $\mathbb{R}^n$. To each point in $Gr(2,n)$ (that is, each plane) there corresponds in $V_2(n)$ the set of all orthonormal bases for that plane.
- General case: there is a natural projection $p\colon V_r(n) \to Gr(r,n)$ which sends each frame in $V_r(n)$ to the plane in $Gr(r,n)$ of which it is a basis. In this way, we can define $Gr(r,n)$ as a quotient space of $V_r(n)$.
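To make the quotient concrete, here is a small numeric sketch (my own illustration, not from the answer): two different orthonormal 2-frames in $\mathbb{R}^3$ spanning the same plane map to the same point of $Gr(2,3)$, which we can detect by comparing the orthogonal projection matrices $FF^T$, a frame-independent representation of the subspace.

```python
import math

def proj(frame):
    """Orthogonal projection matrix F F^T onto the span of the frame's vectors.
    `frame` is a list of orthonormal vectors (each a list of floats)."""
    n = len(frame[0])
    return [[sum(v[i] * v[j] for v in frame) for j in range(n)] for i in range(n)]

s = 1 / math.sqrt(2)
# Two distinct points of V_2(3) that span the same xy-plane:
frame1 = [[1, 0, 0], [0, 1, 0]]        # standard basis of the plane
frame2 = [[s, s, 0], [s, -s, 0]]       # rotated orthonormal basis of the same plane

p1, p2 = proj(frame1), proj(frame2)
same = all(abs(p1[i][j] - p2[i][j]) < 1e-12 for i in range(3) for j in range(3))
print(same)  # True: p(frame1) == p(frame2) in Gr(2,3)
```

So the projection $p$ collapses every orbit of $O(2)$ acting on frames of a fixed plane to a single point of the Grassmannian.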
What does all this mean for your optimization problem? If you are only interested in linear subspaces, use $Gr(r,n)$; but if the particular orthogonal basis you use to parametrize the subspace matters, use the Stiefel manifold.
But, algorithmically, it can be easier to represent a Stiefel manifold, so you can use that and build into the step-finding algorithm a way to avoid stepping to a new frame that is a basis for the same subspace. See *Optimization Algorithms on Matrix Manifolds* by Absil, Mahony, and Sepulchre.
It is enough to check that the preimage $A = \pi^{-1}(B)$ in $X_{n,k}$ is dense (since $\pi(\bar A) \subset \overline{\pi(A)}$ for continuous $\pi$). Now reduce to the case $W = \{0\}^k \times \mathbb{R}^{n-k}$. Then $\pi^{-1}(B)$ consists of the $n\times k$ matrices whose leading $k\times k$ minor is $\ne 0$. This set is clearly dense in $X_{n,k}$, since it is dense in the set of all $n\times k$ matrices.
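A hedged numeric illustration of the density claim (my own sketch, with an arbitrarily chosen perturbation size `eps`): a full-rank $3\times 2$ matrix whose leading $2\times 2$ minor vanishes, so whose column span meets $W$, can be pushed into $\pi^{-1}(B)$ by an arbitrarily small perturbation.

```python
def leading_minor_2x2(m):
    """Determinant of the leading 2x2 block of an n x 2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# A rank-2 matrix with leading 2x2 minor zero: its span contains e3,
# so it meets W = {0}^2 x R nontrivially.
M = [[1.0, 0.0], [2.0, 0.0], [0.0, 1.0]]
assert leading_minor_2x2(M) == 0

eps = 1e-9  # arbitrarily small perturbation
M_eps = [[1.0, eps], [2.0, 0.0], [0.0, 1.0]]
print(leading_minor_2x2(M_eps))  # -2e-09, nonzero: M_eps lies in the preimage
```

Since the minor is a polynomial in the entries, its zero set has empty interior, which is exactly the density statement above.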
It is interesting to consider Plücker coordinates: the condition $V\cap W = 0$ is the nonvanishing of a linear expression in the Plücker coordinates of $V$.
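For a concrete instance (my own illustration): take $W = \operatorname{span}(e_3, e_4) \subset \mathbb{R}^4$. For $V = \operatorname{span}(u, v)$ with Plücker coordinates $p_{ij} = u_i v_j - u_j v_i$, the condition $V \cap W = 0$ is exactly $p_{12} \ne 0$, a single linear condition on the $p_{ij}$.

```python
def plucker(u, v):
    """Plucker coordinates p_ij = u_i v_j - u_j v_i of span(u, v) in R^4
    (0-based indices, so p_12 of the text is p[(0, 1)] here)."""
    return {(i, j): u[i] * v[j] - u[j] * v[i]
            for i in range(4) for j in range(i + 1, 4)}

# V transverse to W = span(e3, e4): the coordinate p_12 is nonzero.
p = plucker([1, 0, 5, 0], [0, 1, 0, 7])
print(p[(0, 1)])   # 1, nonzero, so V meets W only in 0

# V containing e3: the same coordinate vanishes.
q = plucker([0, 0, 1, 0], [0, 1, 0, 0])
print(q[(0, 1)])   # 0: V meets W nontrivially
```

The coordinate $p_{12}$ is the leading $2\times 2$ minor from the previous paragraph, so the two descriptions of the dense set agree.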