Given an $n\times n$ matrix whose $(i, j)$-th entry is the smaller of $i$ and $j$, i.e. $\min(i, j)$, e.g.
$$\begin{pmatrix}1 & 1 & 1 & 1\\
1 & 2 & 2 & 2 \\
1 & 2 & 3 & 3\\
1 & 2 & 3 & 4 \end{pmatrix}.$$
The determinant of any such matrix is $1$.
How do I prove this?
I tried induction, but the induction hypothesis would only help me compute the cofactor term $A_{nn}^*$.
Linear Algebra – Determinant of Matrix with $A_{ij} = \min (i, j)$
Related Solutions
We will generalize Calvin Lin's answer a bit. Let $$A_n = \begin{bmatrix} a & b & 0 & 0 & \cdots & 0\\ c & a & b & 0 & \cdots & 0\\ 0 & c & a & b & \cdots & 0\\ \vdots & \vdots & \ddots & \ddots & \ddots & \vdots\\ 0 & 0 & \cdots & c & a & b\\ 0 & 0 & 0 & \cdots & c & a \end{bmatrix}.$$
We then have, by Laplace expansion along the first row and then down the first column, $$\det(A_n) = a \det(A_{n-1}) - bc \det(A_{n-2}).$$
Calling $\det(A_n) = d_n$ we have the following linear homogeneous recurrence relation: $$d_n = a d_{n-1} - bc d_{n-2}.$$
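As a quick sanity check, here is a minimal Python/NumPy sketch (the `tridiag` helper and the sample values $a = 5$, $b = c = 2$ are mine, chosen arbitrarily) comparing the recurrence against a direct determinant computation:

```python
import numpy as np

def tridiag(n, a, b, c):
    """n x n matrix with a on the diagonal, b above it, c below it."""
    return (a * np.eye(n)
            + b * np.diag(np.ones(n - 1), 1)
            + c * np.diag(np.ones(n - 1), -1))

a, b, c = 5.0, 2.0, 2.0  # arbitrary sample values
d0, d1 = 1.0, a          # d_0 = 1, d_1 = a
for n in range(2, 9):
    d0, d1 = d1, a * d1 - b * c * d0  # d_n = a d_{n-1} - bc d_{n-2}
    assert np.isclose(d1, np.linalg.det(tridiag(n, a, b, c)))
```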
The characteristic equation is $$\begin{align} x^2 - ax + bc = 0 & \implies \left(x - \frac{a}2 \right)^2 - \left(\frac{a}2 \right)^2 + bc = 0 \\ & \implies x = \frac{a \pm \sqrt{a^2-4bc}}2. \end{align}$$
(This assumes the square root exists, which is always the case in $\mathbb{C}$.)
Case 1: $a^2 - 4bc \neq 0$
In this case the characteristic polynomial has two distinct roots, so we have (for some constants $k_1$, $k_2$): $$d_n = k_1 \left( \dfrac{a + \sqrt{a^2-4bc}}2\right)^n + k_2 \left( \dfrac{a - \sqrt{a^2-4bc}}2\right)^n.$$
We have $d_1 = a$ and $d_2 = a^2 - bc$; running the recurrence backwards via $d_2 = a d_1 - bc\, d_0$ then gives $d_0 = 1$. Hence, $$k_1 + k_2 = 1.$$ From $d_1 = a$, $$a (k_1 + k_2) + (k_1 - k_2)\sqrt{a^2-4bc} = 2a \implies k_1 - k_2 = \dfrac{a}{\sqrt{a^2-4bc}}.$$
Hence, $$\begin{align} k_1 & = \dfrac{a + \sqrt{a^2-4bc}}{2\sqrt{a^2-4bc}}, & k_2 & = -\dfrac{a-\sqrt{a^2-4bc}}{2\sqrt{a^2-4bc}} \end{align}$$
And finally: $$\color{red}{\det(A_n) = \dfrac1{\sqrt{a^2-4bc}} \left( \left( \dfrac{a + \sqrt{a^2-4bc}}2\right)^{n+1} - \left( \dfrac{a - \sqrt{a^2-4bc}}2\right)^{n+1}\right)}.$$
Plug in $a = 5$ and $b = c = 2$ (so $a^2 - 4bc = 9 \neq 0$) to get $$\det(A_n) = \frac{1}{3} ( 4^{n+1} - 1).$$
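Continuing the Python sketch above (same `tridiag` helper), this specialization is easy to confirm numerically:

```python
# a = 5, b = c = 2: closed form predicts (4^(n+1) - 1) / 3.
for n in range(1, 8):
    assert np.isclose((4 ** (n + 1) - 1) / 3,
                      np.linalg.det(tridiag(n, 5, 2, 2)))
```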
Case 2: $a^2 - 4bc = 0$
If the characteristic polynomial has a double root $x = a/2$, there exist constants $k_1$, $k_2$ such that: $$d_n = (k_1 + k_2 n) \bigl(\frac{a}{2}\bigr)^n.$$
The initial conditions are $d_0 = 1$ and $d_1 = a$, thus: $$\begin{align} k_1 & = 1, & (k_1 + k_2) a & = 2a. \end{align}$$
If $a = 0$, then $4bc = a^2$ implies either $b$ or $c$ is zero, and $d_n = 0$ for $n \ge 1$. Otherwise $$(k_1 + k_2) a = 2a \implies k_1 + k_2 = 2 \implies k_2 = 1.$$ And finally: $$\color{red}{\det(A_n) = (n+1) \bigl(\frac{a}{2}\bigr)^n}.$$
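For instance, taking $a = 2$ and $b = c = 1$ (so that $a^2 = 4bc$), the formula predicts $\det(A_n) = n + 1$; continuing the Python sketch from above:

```python
# Double-root case: a = 2, b = c = 1, so det(A_n) should be n + 1.
for n in range(1, 8):
    assert np.isclose(n + 1, np.linalg.det(tridiag(n, 2, 1, 1)))
```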
You can actually define the cross product of two vectors $\mathbf{a}, \mathbf{b} \in \mathbb{R}^3$ to be the unique vector $\mathbf{a} \times \mathbf{b} \in \mathbb{R}^3$ such that $$ \forall \mathbf{c} \in \mathbb{R}^3, \quad (\mathbf{a} \times \mathbf{b}) \cdot \mathbf{c} = \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c}), $$ where $(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c})$ denotes the $3 \times 3$ matrix whose columns are $\mathbf{a},\mathbf{b},\mathbf{c}$ in that order. In particular, you can recover $\mathbf{a} \times \mathbf{b}$ as $$ \mathbf{a} \times \mathbf{b} = \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{i})\mathbf{i} + \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{j})\mathbf{j} + \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{k})\mathbf{k}, $$ which can be massaged using determinant identities to give you the usual ghastly explicit formula; in the special case that $\mathbf{a}$ and $\mathbf{b}$ lie in the $xy$-plane, you immediately recover your observation above. Moreover, it immediately follows that $\mathbf{a} \times \mathbf{b}$ is perpendicular to $\mathbf{a}$, $\mathbf{b}$, and any linear combination of $\mathbf{a}$ and $\mathbf{b}$, since by basic determinant identities, including the fact that a square matrix with repeated columns has a vanishing determinant, $$ (\mathbf{a} \times \mathbf{b}) \cdot (s\mathbf{a}+t\mathbf{b}) = \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,s\mathbf{a}+t\mathbf{b}) = s\det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{a}) + t\det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{b}) = 0. $$ Anyhow, the point of all this is that if you're comfortable with basic linear algebra, especially with how the determinant behaves under elementary row and column operations, then you can derive the identity $$ \|\mathbf{a} \times \mathbf{b}\| = \|\mathbf{a}\|\,\|\mathbf{b}\|\sin\theta_{\mathbf{a},\mathbf{b}} $$ from the identity $$ \forall \mathbf{c} \in \mathbb{R}^3, \quad (\mathbf{a} \times \mathbf{b}) \cdot \mathbf{c} = \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c}), $$ without too much trouble.
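As a quick illustration, the coordinate-recovery formula is easy to check numerically; a minimal Python/NumPy sketch, with sample vectors chosen arbitrarily:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Recover a x b one coordinate at a time as det(a | b | e_k),
# iterating over the standard basis vectors (rows of the identity).
cross = np.array([np.linalg.det(np.column_stack([a, b, e]))
                  for e in np.eye(3)])
assert np.allclose(cross, np.cross(a, b))
```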
For simplicity, let's assume that $\mathbf{a} \neq \mathbf{0}$ and $\mathbf{b} \neq \mathbf{0}$; otherwise the claim is trivial. Actually, let's show, for any $\mathbf{c} \in \operatorname{Span}\{\mathbf{a},\mathbf{b}\}^\perp$ (i.e., for any $\mathbf{c}$ perpendicular to both $\mathbf{a}$ and $\mathbf{b}$), that $$ \left\lvert(\mathbf{a} \times \mathbf{b}) \cdot \mathbf{c}\right\rvert = \left(\|\mathbf{a}\|\,\|\mathbf{b}\|\sin(\theta_{\mathbf{a},\mathbf{b}})\right)\|\mathbf{c}\|. $$ If $\mathbf{a} \times \mathbf{b} \neq \mathbf{0}$, then we can plug in $\mathbf{c} = \mathbf{a} \times \mathbf{b}$ to get $$ \|\mathbf{a} \times \mathbf{b}\|^2 = \left\lvert(\mathbf{a} \times \mathbf{b}) \cdot (\mathbf{a} \times \mathbf{b})\right\rvert = \left(\|\mathbf{a}\|\,\|\mathbf{b}\|\sin(\theta_{\mathbf{a},\mathbf{b}})\right)\|\mathbf{a}\times\mathbf{b}\|, $$ and hence $$ \|\mathbf{a} \times \mathbf{b}\| = \|\mathbf{a}\|\,\|\mathbf{b}\|\sin\theta_{\mathbf{a},\mathbf{b}}. $$ If $\mathbf{a} \times \mathbf{b} = \mathbf{0}$, then since $\operatorname{Span}\{\mathbf{a},\mathbf{b}\}^\perp$ is at least $1$-dimensional, take any non-zero vector $\mathbf{c} \in \operatorname{Span}\{\mathbf{a},\mathbf{b}\}^\perp$ to get $$ 0 = \left\lvert(\mathbf{a} \times \mathbf{b}) \cdot \mathbf{c}\right\rvert = \left(\|\mathbf{a}\|\,\|\mathbf{b}\|\sin(\theta_{\mathbf{a},\mathbf{b}})\right)\|\mathbf{c}\|, $$ which yields $\sin(\theta_{\mathbf{a},\mathbf{b}}) = 0$ and hence $$ \|\mathbf{a} \times \mathbf{b}\| = 0 = \|\mathbf{a}\|\,\|\mathbf{b}\|\sin\theta_{\mathbf{a},\mathbf{b}}. $$
First, by the defining identity for cross products, $$ (\mathbf{a} \times \mathbf{b}) \cdot \mathbf{c} = \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c}). $$
Next, since determinants are preserved under column additions (e.g., $\det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c}) = \det(\mathbf{a}\,\vert\,\mathbf{b}+s\mathbf{a}\,\vert\,\mathbf{c})$), we have that $$ \det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c}) = \det(\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c}), $$ where $$ \mathbf{b}^\prime := \mathbf{b} - \frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|^2}\mathbf{a} $$ is the orthogonal projection of $\mathbf{b}$ onto $\operatorname{Span}\{\mathbf{a}\}^\perp$, i.e., onto the plane through the origin with normal vector $\mathbf{a}$; geometrically, if you believe that $\det(\mathbf{a}\,\vert\,\mathbf{b}\,\vert\,\mathbf{c})$ is the signed volume of the parallelepiped spanned by $\mathbf{a},\mathbf{b},\mathbf{c}$, then we're essentially saying that the parallelepiped spanned by $\mathbf{a},\mathbf{b},\mathbf{c}$ has the same volume as the parallelepiped spanned by $\mathbf{a},\mathbf{b}^\prime,\mathbf{c}$ by Cavalieri's principle. Observe, in particular, that $\mathbf{a}$, $\mathbf{b}^\prime$, and $\mathbf{c}$ are pairwise orthogonal by construction.
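Continuing the sketch above, the invariance of the determinant under this column operation (and the orthogonality of $\mathbf{b}^\prime$ to $\mathbf{a}$) can be checked numerically; here $\mathbf{c}$ is taken to be $\mathbf{a} \times \mathbf{b}$, one convenient vector orthogonal to both $\mathbf{a}$ and $\mathbf{b}$:

```python
c = np.cross(a, b)                   # some c orthogonal to a and b
b_prime = b - (a @ b) / (a @ a) * a  # projection of b onto span{a}^perp

assert np.isclose(np.linalg.det(np.column_stack([a, b, c])),
                  np.linalg.det(np.column_stack([a, b_prime, c])))
assert np.isclose(a @ b_prime, 0.0)  # a and b' orthogonal by construction
```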
Next, since $\mathbf{a}$, $\mathbf{b}^\prime$, and $\mathbf{c}$ are pairwise orthogonal, $$ \lvert\det(\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c})\rvert = \sqrt{\det(\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c})^2}\\ = \sqrt{\det\left((\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c})^T\right) \det(\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c})}\\ = \sqrt{\det\left((\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c})^T(\mathbf{a}\,\vert\,\mathbf{b}^\prime\,\vert\,\mathbf{c}) \right)}\\ = \begin{vmatrix}\|\mathbf{a}\|^2&0&0\\0&\|\mathbf{b}^\prime\|^2&0\\0&0&\|\mathbf{c}\|^2\end{vmatrix}^{1/2}\\ = \|\mathbf{a}\|\,\|\mathbf{b}^\prime\|\,\|\mathbf{c}\|. $$
At last, since the angle $\theta_{\mathbf{a},\mathbf{b}} \in [0,\pi]$ between the non-zero vectors $\mathbf{a},\mathbf{b}$ is given by the formula $$ \cos \theta_{\mathbf{a},\mathbf{b}} = \frac{\mathbf{a}\cdot\mathbf{b}}{\|\mathbf{a}\|\,\|\mathbf{b}\|}, $$ it follows that $$ \|\mathbf{b}^\prime\|^2 = \left(\mathbf{b} - \frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|^2}\mathbf{a}\right) \cdot \left(\mathbf{b} - \frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|^2}\mathbf{a}\right)\\ =\|\mathbf{b}\|^2 - 2 \mathbf{b} \cdot \frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|^2}\mathbf{a} + \left\|\frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|^2}\mathbf{a}\right\|^2\\ =\|\mathbf{b}\|^2 - \frac{(\mathbf{a} \cdot \mathbf{b})^2}{\|\mathbf{a}\|^2}\\ =\|\mathbf{b}\|^2\left(1 - \left(\frac{\mathbf{a}\cdot\mathbf{b}}{\|\mathbf{a}\|\,\|\mathbf{b}\|}\right)^2\right)\\ =\|\mathbf{b}\|^2(1-\cos^2\theta_{\mathbf{a},\mathbf{b}})\\ =\|\mathbf{b}\|^2\sin^2\theta_{\mathbf{a},\mathbf{b}}, $$ and hence that $$ \left\lvert(\mathbf{a} \times \mathbf{b}) \cdot \mathbf{c}\right\rvert = \|\mathbf{a}\|\,\|\mathbf{b}^\prime\|\,\|\mathbf{c}\| = \|\mathbf{a}\|\,\|\mathbf{b}\|\sin\left(\theta_{\mathbf{a},\mathbf{b}}\right)\|\mathbf{c}\|, $$ as was claimed.
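And the final identity also checks out numerically, continuing the same sketch:

```python
cos_t = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
sin_t = np.sqrt(1 - cos_t ** 2)

assert np.isclose(np.linalg.norm(b_prime), np.linalg.norm(b) * sin_t)
assert np.isclose(np.linalg.norm(np.cross(a, b)),
                  np.linalg.norm(a) * np.linalg.norm(b) * sin_t)
```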
Best Answer
For $j = n-1, \dots, 1$ (working right to left), subtract the $j$-th column from the $(j+1)$-th one. Since adding a multiple of one column to another does not change the determinant, this leaves you with a matrix of the same determinant: a lower-triangular matrix whose entries on and below the diagonal are all $1$. Its determinant is the product of the diagonal entries, namely $1$.
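If it helps to see this concretely, here is a minimal Python/NumPy sketch of the argument: build the $\min(i,j)$ matrix, perform the column subtractions right to left (so each subtraction uses a not-yet-modified column), and confirm the result:

```python
import numpy as np

n = 6  # arbitrary sample size
A = np.minimum.outer(np.arange(1, n + 1), np.arange(1, n + 1)).astype(float)

# Subtract column j from column j+1, working right to left.
for j in range(n - 2, -1, -1):
    A[:, j + 1] -= A[:, j]

assert np.allclose(A, np.tril(np.ones((n, n))))  # lower triangle of ones
assert np.isclose(np.linalg.det(A), 1.0)         # determinant is 1
```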