Zonotope: Image of the Diagonals of a Hypercube Under a Linear Map

convex-geometry, discrete-geometry, frame-theory, linear-algebra, polytopes

Let $A \in \mathbb{R}^{m \times n}$ be a matrix where $m \leq n$, and let $H = [-1,1]^n$ be the unit hypercube. One can form the zonotope $\mathcal{Z}(A) = \{Ax : x \in H\} \subset \mathbb{R}^m$, which is the image of $H$ under $A$. I am looking for a sufficient condition on $A$ that achieves the following:

There is no opposing pair of vertices in $H$ whose images in $\mathcal{Z}(A)$ lie on a line in $\mathbb{R}^m$ that also passes through the images of a different opposing pair of vertices in $H$.

Ignore any opposing pair of vertices in $H$ that maps to $0$ in $\mathcal{Z}(A)$. Basically, I am looking for a condition on $A$ so that the images of the diagonals of the hypercube in $\mathcal{Z}(A)$ "don't collide".
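For concreteness, here is a minimal Python sketch of the check I have in mind (the function name `diagonals_dont_collide` is my own, just for illustration): enumerate one representative per antipodal pair of vertices, discard those mapped to $0$, and test that the remaining images are pairwise linearly independent.

```python
import itertools
import numpy as np

def diagonals_dont_collide(A, tol=1e-12):
    """Return True if the images under A of the hypercube's diagonals
    (antipodal vertex pairs) lie on pairwise distinct lines through 0,
    ignoring pairs that are mapped to 0."""
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    # One representative per antipodal pair: fix the first coordinate to +1.
    reps = (np.array((1,) + s) for s in itertools.product((1, -1), repeat=n - 1))
    images = [A @ x for x in reps]
    images = [v for v in images if np.linalg.norm(v) > tol]  # drop kernel pairs
    # Two nonzero vectors span the same line iff [v | w] has rank < 2.
    return all(
        np.linalg.matrix_rank(np.column_stack((v, w)), tol=tol) == 2
        for v, w in itertools.combinations(images, 2)
    )
```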

Example:

Let $A = \begin{pmatrix}
2 & 2 & 1 \\
1 & 4 & 2
\end{pmatrix}$
, so that $m = 2$, $n = 3$. The images of the opposing pairs of vertices of $H$ are:

$$\pm(1,1,1) \mapsto \pm (5,7)$$
$$\pm(1,-1,1) \mapsto \pm (1,-1)$$
$$\pm(-1,1,1) \mapsto \pm (1,5)$$
$$\pm(-1,-1,1) \mapsto \pm (-3,-3)$$

One checks that the four lines through these pairs of image points in $\mathbb{R}^2$ are pairwise distinct.

Example:

Let $A = \begin{pmatrix}
1 & 1 & 0 \\
0 & 1 & 1
\end{pmatrix}$
. The images of the opposing pairs of vertices of $H$ are:

$$\pm(1,1,1) \mapsto \pm (2,2)$$
$$\pm(1,-1,1) \mapsto (0,0)$$
$$\pm(-1,1,1) \mapsto \pm (0,2)$$
$$\pm(-1,-1,1) \mapsto \pm (-2,0)$$

We see that one opposing pair of vertices of $H$ is mapped to $0$. Ignoring this pair, the three lines through the remaining pairs of image points in $\mathbb{R}^2$ are pairwise distinct, so $A$ still satisfies the condition.

Non-Example:

Let $A = \begin{pmatrix}
1 & 1 & 0 \\
0 & 2 & 1
\end{pmatrix}$
. The images of the opposing pairs of vertices of $H$ are:

$$\pm(1,1,1) \mapsto \pm (2,3)$$
$$\pm(1,-1,1) \mapsto \pm (0,-1)$$
$$\pm(-1,1,1) \mapsto \pm (0,3)$$
$$\pm(-1,-1,1) \mapsto \pm (-2,-1)$$

We see that the line in $\mathbb{R}^2$ through $\pm (0,-1)$ is the same as the line through $\pm (0,3)$ (both are the vertical axis), so two diagonals of $H$ have "collided" and $A$ is a "degenerate" matrix.
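Running the sketch from above on the three matrices reproduces these verdicts:

```python
print(diagonals_dont_collide([[2, 2, 1], [1, 4, 2]]))  # True:  first example
print(diagonals_dont_collide([[1, 1, 0], [0, 1, 1]]))  # True:  kernel pair ignored
print(diagonals_dont_collide([[1, 1, 0], [0, 2, 1]]))  # False: two diagonals collide
```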

If it helps, I am studying the case where $A$ is a matrix whose columns are vectors of an equiangular tight frame, and the result I want to prove is that there is only one way (up to sign) in which a signed sum of all but one of the vectors in the frame yields a scalar multiple of the remaining vector.
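In frame terms, the property can at least be tested numerically. Here is a rough sketch (`parallel_sign_patterns` is an ad-hoc name of mine, and the columns of `A` are assumed to be the frame vectors); the goal is that for every $i$ the returned list has exactly two elements, i.e. one sign pattern up to a global sign.

```python
import itertools
import numpy as np

def parallel_sign_patterns(A, i, tol=1e-12):
    """Sign patterns s for which sum_{j != i} s_j * a_j is a scalar
    multiple of column a_i.  Patterns come in +/- pairs, so "one way
    up to sign" means the returned list has exactly two entries."""
    A = np.asarray(A, dtype=float)
    others = [j for j in range(A.shape[1]) if j != i]
    hits = []
    for s in itertools.product((1, -1), repeat=len(others)):
        v = sum(sj * A[:, j] for sj, j in zip(s, others))
        # v is parallel to a_i iff the matrix [v | a_i] has rank <= 1
        if np.linalg.matrix_rank(np.column_stack((v, A[:, i])), tol=tol) <= 1:
            hits.append(s)
    return hits
```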

Best Answer

Let's denote the set of vertices of $H$ by $\partial^2 H$. Then what you want is that for every $x \in \partial^2 H$ with $Ax \neq 0$ there is no $y \in \partial^2 H$ with $y \neq \pm x$ such that $Ax$ and $Ay$ lie on the same line through the origin; in other words, $Ax$ and $Ay$ should be linearly independent. The lines pass through the origin because the opposing vertex of each $x \in \partial^2 H$ is $-x$, and $A(-x) = -Ax$.

Note that the unit cube in $\mathbb{R}^n$ has $2^n$ vertices; identifying opposing ones cuts this down to $2^{n-1}$ equivalence classes of antipodal vertices. The images of these classes under $A$ must either be pairwise linearly independent or be mapped to $0$, i.e. land in the kernel of $A$.

By the above established equivalence relation we can identify each class with an element $(1, x_2, \dots, x_n) \in \partial^2 H$ (the antipodal identification eliminates one choice in our string of $1$s and $-1$s). So a basic requirement is that for all $x' = (x_2, \dots, x_n)^T$ and $y' = (y_2, \dots, y_n)^T$ with $x_j, y_j \in \{-1, 1\}$ and $x' \neq y'$, setting $x := (1, x'^T)^T$ and $y := (1, y'^T)^T$, we have $$ aAx + bAy = 0 \implies a = b = 0, $$ unless $Ax = 0$ or $Ay = 0$ (the kernel case, which we agreed to ignore).

By considering some special cases you can extract direct requirements on $A$: if you want some vertex $x$ to take "the kernel case", the requirement is $Ax = 0$, and if you want its image to be linearly independent from the others, you get requirements from that. For example, let $x$ be defined as above and let $y' = -x'$. Decompose $A$ as $$ A = \begin{pmatrix} a_{11} & a_{1\cdot}' \\ a_{\cdot1}' & A' \end{pmatrix}, $$ where $a_{11} \in \mathbb{R}$, $a_{1\cdot}' \in \mathbb{R}^{1,n-1}$, etc. Then $$ Ax + Ay = \begin{pmatrix} a_{11} & a_{1\cdot}' \\ a_{\cdot1}' & A' \end{pmatrix} \begin{pmatrix} 1 \\ x'\end{pmatrix} + \begin{pmatrix} a_{11} & a_{1\cdot}' \\ a_{\cdot1}' & A' \end{pmatrix} \begin{pmatrix} 1 \\ -x'\end{pmatrix} = 2\begin{pmatrix} a_{11} \\ a_{\cdot 1}' \end{pmatrix}, $$ which is twice the first column of $A$ and in particular must not vanish if $Ax$ and $Ay$ are to be linearly independent. So for this case we have either $Ax = 0$, $Ay = 0$, or $a_{\cdot 1} \neq 0$ in $\mathbb{R}^{m,1}$, where $a_{\cdot 1}$ denotes the full first column of $A$. You can of course do the same for the other columns.
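As a quick numeric sanity check of the displayed identity (a sketch in Python; the matrix is your first example and the sign pattern is an arbitrary choice):

```python
import numpy as np

A = np.array([[2., 2., 1.],
              [1., 4., 2.]])       # the asker's first example
xp = np.array([1., -1.])           # arbitrary sign pattern x'
x = np.concatenate(([1.], xp))     # x = (1,  x'^T)^T
y = np.concatenate(([1.], -xp))    # y = (1, -x'^T)^T
assert np.allclose(A @ x + A @ y, 2 * A[:, 0])  # twice the first column of A
```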

Are these the kind of conditions you had in mind?
