The standard, algorithmic, recipe-style, don't-think-too-much-about-it way of doing it is to compute the determinant of $A-\lambda I$ to get the characteristic polynomial, use the characteristic polynomial to find the eigenvalues, and then use each eigenvalue $\lambda_i$ to compute the nullspace of $A-\lambda_iI$ to get the corresponding eigenvectors.
However, there are often sundry shortcuts. For example:
The trace of the matrix equals the sum of the eigenvalues (over the complex numbers), and the determinant equals the product of the eigenvalues. This can often help you find some, if not all, of the eigenvalues.
If every row of the matrix adds up to the same constant $c$, then $c$ is an eigenvalue, and $(1,1,\ldots,1)$ is an eigenvector for $c$.
Since the eigenvalues of $A$ and the eigenvalues of $A^t$ are the same, if every column of $A$ adds up to the same constant $c$, then $c$ is an eigenvalue.
If $A$ has eigenvalue $\lambda$ with eigenvectors $\mathbf{b}_1,\ldots,\mathbf{b}_m$, then $aA+cI$ has eigenvalue $a\lambda+c$ with eigenvectors $\mathbf{b}_1,\ldots,\mathbf{b}_m$.
If $A$ is block-diagonal or block-triangular, then the eigenvalues of $A$ are the eigenvalues of the diagonal blocks.
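The trace/determinant and row-sum shortcuts above are easy to sanity-check numerically. A minimal pure-Python sketch, using a small matrix of my own choosing (not one of the matrices from the question):

```python
# Example matrix (not from the question): every row of A sums to 4, so 4
# should be an eigenvalue with eigenvector (1, 1, 1); the trace and
# determinant should match the sum and product of all the eigenvalues.
A = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

# Row-sum shortcut: A applied to (1, 1, 1) gives 4 * (1, 1, 1).
ones = [1, 1, 1]
Av = [sum(A[i][j] * ones[j] for j in range(3)) for i in range(3)]
print(Av)  # [4, 4, 4]

# A = J + I where J is the all-ones matrix (eigenvalues 3, 0, 0),
# so by the shift rule the eigenvalues of A are 4, 1, 1.
eigenvalues = [4, 1, 1]

# Trace check: sum of diagonal entries equals sum of eigenvalues.
trace = sum(A[i][i] for i in range(3))
print(trace == sum(eigenvalues))  # True

# Determinant by cofactor expansion along the first row; it should
# equal the product of the eigenvalues, 4 * 1 * 1 = 4.
det = (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
     - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
     + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))
print(det)  # 4
```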
For instance, your second matrix is the same as
$$\left(\begin{array}{cccc}
1&1&1&1\\
1&1&1&1\\
1&1&1&1\\
1&1&1&1
\end{array}\right) + 3I.$$
The all-ones matrix has eigenvalues $0$ (because it is not invertible) and $4$ (because every row adds up to $4$), so your matrix will certainly have eigenvalues $0+3=3$ and $4+3=7$. That gives you two of the eigenvalues. By inspection, the all-ones matrix has eigenvector $(1,1,1,1)$ associated to $4$, and eigenvectors $(1,-1,0,0)$, $(1,0,-1,0)$, and $(1,0,0,-1)$ associated to $0$; that accounts for all four, so these are also eigenvectors of your original matrix, associated to $7$ (the first one) and to $3$ (the second through fourth).
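These claims are quick to verify by direct multiplication; a minimal pure-Python check (no libraries assumed):

```python
# The matrix from the question: B = (all-ones 4x4) + 3I, i.e. 4s on
# the diagonal and 1s everywhere else.
B = [[4 if i == j else 1 for j in range(4)] for i in range(4)]

def matvec(M, v):
    """Multiply matrix M by vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# (1, 1, 1, 1) should be an eigenvector for the eigenvalue 7 = 4 + 3.
print(matvec(B, [1, 1, 1, 1]))  # [7, 7, 7, 7]

# Each of these should be an eigenvector for the eigenvalue 3 = 0 + 3.
for v in ([1, -1, 0, 0], [1, 0, -1, 0], [1, 0, 0, -1]):
    print(matvec(B, v) == [3 * x for x in v])  # True, three times
```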
Your first matrix is block-triangular (in fact, triangular), so its eigenvalues are the eigenvalues of the diagonal blocks (here, the $1\times 1$ blocks, each equal to $2$). So the only eigenvalue is $2$. There is an obvious eigenvector, $(0,0,1)$, and since the rank of $A-2I$ is $2$, the nullity is $1$, so that is (up to scaling) the only eigenvector.
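The rank/nullity argument can be checked mechanically. Since the question's matrix is not reproduced here, the sketch below uses a hypothetical lower-triangular $3\times 3$ matrix with $2$s on the diagonal as a stand-in:

```python
# Hypothetical stand-in (not the matrix from the question): a
# lower-triangular matrix with 2s on the diagonal.
A = [[2, 0, 0],
     [1, 2, 0],
     [0, 1, 2]]

# (0, 0, 1) is an eigenvector for the eigenvalue 2.
v = [0, 0, 1]
print([sum(A[i][j] * v[j] for j in range(3)) for i in range(3)])  # [0, 0, 2]

# Rank of A - 2I via Gaussian elimination; the nullity is 3 - rank.
M = [[A[i][j] - (2 if i == j else 0) for j in range(3)] for i in range(3)]
rank = 0
for col in range(3):
    pivot = next((r for r in range(rank, 3) if abs(M[r][col]) > 1e-12), None)
    if pivot is None:
        continue
    M[rank], M[pivot] = M[pivot], M[rank]
    for r in range(rank + 1, 3):
        factor = M[r][col] / M[rank][col]
        M[r] = [M[r][j] - factor * M[rank][j] for j in range(3)]
    rank += 1
print(rank, 3 - rank)  # rank 2, nullity 1: a one-dimensional eigenspace
```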
Most of these are heuristics, as opposed to algorithms. Of course, the "algorithmic" nature of "compute determinant, find characteristic polynomial, find eigenvalues, find eigenvectors" also includes a hidden heuristic component, in that factoring the polynomial often uses heuristics.
Finding all solutions of a general transcendental equation is a nontrivial task. As you have more than one equation, this even looks hopeless ;-) So unless you know more about the equations and the allowed range for the variables (real, complex, from a compact set, ...), Mathematica (and this forum) cannot help.
For your specific problem, Mathematica can help. Bessel functions of half-integer order can be reduced to trigonometric functions. Mathematica will perform this reduction for you if you write 1/2 instead of 0.5. So for $L=0$, you obtain that the determinant is proportional to
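(For reference, the reductions in question are the classical half-integer-order identities, stated here for context rather than taken from the question:
$$J_{1/2}(x)=\sqrt{\frac{2}{\pi x}}\,\sin x,\qquad J_{-1/2}(x)=\sqrt{\frac{2}{\pi x}}\,\cos x,$$
with higher half-integer orders following from the recurrence $J_{\nu+1}(x)=\frac{2\nu}{x}J_\nu(x)-J_{\nu-1}(x)$.)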
$\sin[ (A_1 -A_2)k]$
and therefore the solutions are at $k = n\pi/(A_1-A_2)$ with $n\in \mathbb{Z}$. For $L=1,2,\ldots$ the business gets trickier and trickier, but numerical solutions are always obtainable.
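To illustrate the numerical route, here is a minimal bisection sketch (a stand-in for FindRoot) applied to the $L=0$ case, with hypothetical sample values $A_1=1.5$, $A_2=0.5$ of my own choosing; it should recover the $n=1$ root $k=\pi/(A_1-A_2)=\pi$:

```python
import math

# L = 0 case: the determinant is proportional to sin((A1 - A2) k),
# with roots at k = n*pi/(A1 - A2). A1 and A2 are hypothetical sample
# values, not taken from the question.
A1, A2 = 1.5, 0.5

def f(k):
    return math.sin((A1 - A2) * k)

# Bracket the n = 1 root: f(3) > 0 and f(3.3) < 0, so a root lies between.
lo, hi = 3.0, 3.3
for _ in range(60):
    mid = (lo + hi) / 2
    if f(lo) * f(mid) <= 0:
        hi = mid       # root is in [lo, mid]
    else:
        lo = mid       # root is in [mid, hi]
print(lo)  # ~3.14159..., i.e. pi/(A1 - A2)
```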
E.g., for $L=1$ the determinant is zero whenever $(1+A_1 A_2 k^2) \sin[(A_1-A_2)k] - (A_1 -A_2) k \cos[(A_1-A_2)k]=0$. There is a trivial solution at $k=0$. For $|k| \geq (A_1 -A_2)/(A_1 A_2)$ this equation has no solutions. So you only need to find the solutions in a finite interval (except for $A_1 = A_2$, where the determinant is identically zero). For this, FindRoot helps.
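A quick way to see what the finite-interval search looks like is to scan that interval for sign changes of the left-hand side. A sketch with hypothetical sample values $A_1=2$, $A_2=1$ (again my own, not from the question):

```python
import math

# L = 1 condition with hypothetical sample values A1 = 2, A2 = 1:
# scan the finite interval (0, (A1 - A2)/(A1*A2)] for sign changes of
# the determinant condition; any root inside would show up as one.
A1, A2 = 2.0, 1.0

def f(k):
    d = A1 - A2
    return (1 + A1 * A2 * k * k) * math.sin(d * k) - d * k * math.cos(d * k)

kmax = (A1 - A2) / (A1 * A2)                    # 0.5 for these values
ks = [kmax * i / 1000 for i in range(1, 1001)]  # exclude the trivial k = 0
sign_changes = sum(1 for a, b in zip(ks, ks[1:]) if f(a) * f(b) < 0)
print(sign_changes)  # 0: no nontrivial root in this interval for these values
```

If a sign change were found, a bisection (or FindRoot with the bracketing pair as a starting point) would then pin the root down.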
To settle this:
Mathematica is quite capable of computing the eigenvalues of matrix pencils (i.e., the generalized eigenproblem). Eigenvalues[], Eigenvectors[], and Eigensystem[], as well as CharacteristicPolynomial[] and SchurDecomposition[], are all able to handle matrix pencils, as long as the matrices contain inexact elements.
Eigenvalues[] and its cognate functions are capable of returning just the first few or last few eigencomponents if given a second integer parameter: Eigenvalues[{matA, matB}, 3], for instance, means "return the three largest (in magnitude) generalized eigenvalues", and Eigenvalues[{matA, matB}, -1] means "return the tiniest one".
SchurDecomposition[] is an eigenvalue-revealing decomposition. For a pencil $(\mathbf A,\mathbf B)$, SchurDecomposition[] finds four matrices $\mathbf Q,\mathbf S,\mathbf P,\mathbf T$ such that $\mathbf A=\mathbf Q\mathbf S\mathbf P^{\dagger}$ and $\mathbf B=\mathbf Q\mathbf T\mathbf P^{\dagger}$, with $\mathbf Q,\mathbf P$ unitary/orthogonal, $\mathbf S,\mathbf T$ upper (quasi-)triangular (depending on the setting of the option RealBlockDiagonalForm), and $\dagger$ denoting the Hermitian (conjugate) transpose.
There are applications where it is better to have a Schur decomposition than an eigendecomposition (and a Schur decomposition certainly takes less effort to compute, and is less susceptible to numerical instability). Which of the two you should use depends on what you really want to do, which you haven't mentioned...
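To make "generalized eigenvalues of a pencil" concrete, here is a pure-Python sketch of what Eigenvalues[{matA, matB}] computes, restricted to the $2\times 2$ case where $\det(\mathbf A-\lambda\mathbf B)=0$ is just a quadratic (the matrices are hypothetical sample data):

```python
import math

# Generalized eigenvalues of the pencil (A, B): the roots of
# det(A - t B) = 0. For 2x2 matrices this is a quadratic in t.
A = [[3.0, 1.0],
     [1.0, 2.0]]
B = [[1.0, 0.0],
     [0.0, 2.0]]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# det(A - t B) = det(B) t^2
#              - (A00*B11 + A11*B00 - A01*B10 - A10*B01) t + det(A)
a = det2(B)
b = -(A[0][0] * B[1][1] + A[1][1] * B[0][0]
      - A[0][1] * B[1][0] - A[1][0] * B[0][1])
c = det2(A)
disc = math.sqrt(b * b - 4 * a * c)
eigs = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])

# Each generalized eigenvalue t should make A - t B singular.
for t in eigs:
    M = [[A[i][j] - t * B[i][j] for j in range(2)] for i in range(2)]
    print(abs(det2(M)) < 1e-9)  # True, twice
```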