Your matrix $A = (a_{ij})$ is
an upper triangular matrix
( $a_{ij} = 0$ whenever $i > j$ )
and a Toeplitz matrix
( $a_{ij}$ depends only on $i-j$ ) at the same time.
I cannot find any reference online that teaches you how to evaluate its inverse efficiently.
I hope these keywords can help you in your own search.
If you just want an inverse without too many other concerns, it is actually
pretty easy to work out the inverse ourselves.
Let $\eta$ be the $n \times n$ matrix with $1$ on its superdiagonal and $0$ otherwise. i.e.
$$\eta = (\eta_{ij}),\quad \eta_{ij} = \begin{cases}1,& i - j = -1\\0,& \text{otherwise}\end{cases}$$
We have $\eta^n = 0$ and we can express $A$ as a polynomial in $\eta$.
$$A = x_1 I + x_2 \eta + x_3 \eta^2 + \cdots + x_n \eta^{n-1}$$
$A$ is invertible if and only if $x_1$ is non-zero. When $A$ is invertible,
$A^{-1}$ is also an upper triangular Toeplitz matrix. We can also represent it as a polynomial in $\eta$.
Introduce numbers $\displaystyle\;\alpha_i = \frac{x_{i+1}}{x_1}$ and $\beta_i$ ( $i = 1,\ldots,n-1$ ) such that
$$\begin{align}
A &= x_1 \left(I + \alpha_1 \eta + \alpha_2 \eta^2 + \cdots + \alpha_{n-1} \eta^{n-1}\right)\\
A^{-1} &= x_1^{-1} \left(I + \beta_1 \eta + \beta_2 \eta^2 + \cdots + \beta_{n-1} \eta^{n-1}\right)
\end{align}
$$
The condition $A^{-1} A = I$ can be expanded into the following set of relations. They allow you to compute the $\beta_k$ recursively.
$$\begin{align}
-\beta_1 &= \alpha_1\\
-\beta_2 &= \alpha_1 \beta_1 + \alpha_2\\
&\;\vdots\\
-\beta_k &= \alpha_1 \beta_{k-1} + \alpha_2 \beta_{k-2} + \cdots + \alpha_k\\
&\;\vdots
\end{align}
$$
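The recursion above is easy to implement. Below is a minimal sketch in Python/numpy (my own scaffolding, not part of the answer — the function name and the test matrix are invented for illustration): it computes the $\beta_k$ from the $\alpha_k$ and verifies the result on a $5 \times 5$ example.

```python
import numpy as np

def inverse_triangular_toeplitz(x):
    """Invert the upper triangular Toeplitz matrix A = sum_k x[k] * eta^k.

    x[0] is the diagonal entry and must be non-zero.
    Returns the coefficients of A^{-1} in the same convention.
    """
    n = len(x)
    alpha = [x[k] / x[0] for k in range(1, n)]   # alpha_1, ..., alpha_{n-1}
    beta = []
    for k in range(1, n):
        # -beta_k = alpha_1 beta_{k-1} + ... + alpha_{k-1} beta_1 + alpha_k
        s = alpha[k - 1]
        for j in range(1, k):
            s += alpha[j - 1] * beta[k - 1 - j]
        beta.append(-s)
    # A^{-1} = x_1^{-1} (I + beta_1 eta + ... + beta_{n-1} eta^{n-1})
    return [1 / x[0]] + [b / x[0] for b in beta]

# Quick check on a random 5x5 example.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
x[0] = 2.0                                   # ensure invertibility
n = len(x)
A = sum(x[k] * np.eye(n, k=k) for k in range(n))   # eta^k has ones on the k-th superdiagonal
c = inverse_triangular_toeplitz(x)
A_inv = sum(c[k] * np.eye(n, k=k) for k in range(n))
assert np.allclose(A_inv @ A, np.eye(n))
```

The inverse is built in $O(n^2)$ arithmetic operations, without ever forming a full matrix inverse.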
When $n$ is small and you want each $\beta_k$ as an explicit function of the $\alpha_j$,
there is actually a trick to get it. You can ask a CAS to compute the Taylor
expansion of the reciprocal of the following polynomial in $t$:
$$\frac{1}{1 + \alpha_1 t + \alpha_2 t^2 + \cdots + \alpha_{n-1} t^{n-1}}
= 1 + \beta_1 t + \beta_2 t^2 + \cdots + \beta_{n-1} t^{n-1} + O(t^n)$$
The coefficients of $t^k$ ($1 \le k < n$) in the resulting Taylor expansion will
be the expression you want for $\beta_k$. e.g.
$$\begin{align}
\beta_1 &= -\alpha_1,\\
\beta_2 &= \alpha_1^2 - \alpha_2,\\
\beta_3 &= -\alpha_1^3 + 2\alpha_1\alpha_2 - \alpha_3,\\
\beta_4 &= \alpha_1^4 - 3\alpha_1^2\alpha_2 + \alpha_2^2 + 2\alpha_1 \alpha_3 - \alpha_4\\
&\;\vdots
\end{align}$$
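This trick can be reproduced with any CAS; here is a sketch using sympy (the answer does not prescribe a particular CAS, so sympy is my own choice). It expands the reciprocal series and reads off the coefficients:

```python
import sympy as sp

n = 5
t = sp.symbols('t')
alphas = sp.symbols(f'alpha1:{n}')            # alpha_1, ..., alpha_{n-1}
p = 1 + sum(a * t**(i + 1) for i, a in enumerate(alphas))
# Taylor expansion of 1/p around t = 0, up to (but not including) t^n
expansion = sp.series(1 / p, t, 0, n).removeO()
betas = [sp.expand(expansion.coeff(t, k)) for k in range(1, n)]
for k, b in enumerate(betas, start=1):
    print(f'beta_{k} =', b)    # beta_1 = -alpha_1, beta_2 = alpha_1^2 - alpha_2, ...
```

The printed coefficients agree with the expressions listed above.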
For aesthetic considerations suggested by Michael Hoppe, I'll rename $K$ into $n$ and $y_{K+1}$ into $x_{n+1}$, so the matrix whose determinant you're seeking is
$$A=\begin{pmatrix}
x_1 & 0 & \dots & 0 & y_1 \\
0 & x_2 & \dots & 0 & y_2 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \dots & x_n & y_n \\
y_1 & y_2 & \dots & y_n & x_{n+1}
\end{pmatrix}.
$$
First method. Use determinant expansion with respect to the last column to get
$$\mathrm{det}(A) = x_1 \cdots x_n x_{n+1} + \sum_{i=1}^n(-1)^{n+1+i} y_i \,\mathrm{det}(A_i)$$
where $A_i$ is the matrix $A$ deprived of its last column and $i$-th row. For example,
$$A_1 = \begin{pmatrix}0& x_2 &0& ... &0 \\
0 & 0 & x_3 & ... & 0 \\
\vdots & & & \ddots& \vdots\\
0 & & ... & & x_n\\
y_1 & &... && y_n \\
\end{pmatrix}. $$
Using row expansion for $A_i$, it is easy to see that $\mathrm{det}(A_i)$ is equal to $(-1)^{n+i} y_i \prod_{j \neq i,\, j\leqslant n} x_j$, which yields
\begin{align*}\mathrm{det}(A) &= \prod_{i=1}^{n+1} x_i - \sum_{i=1}^n y_i^2 \prod_{j \neq i,\, j\leqslant n}x_j \\
&= \prod_{i=1}^{n}x_i \left( x_{n+1} - \sum_{i=1}^n \frac{y_i^2}{x_i}\right),
\end{align*}
the second line assuming every $x_i$ with $i \leqslant n$ is non-zero.
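A quick numerical check of this closed form (Python/numpy is my own choice for illustration; the sample entries are arbitrary):

```python
import numpy as np

n = 4
x = np.array([2.0, 3.0, -1.0, 4.0, 5.0])   # x_1, ..., x_{n+1} (sample values)
y = np.array([1.0, -2.0, 3.0, 0.5])        # y_1, ..., y_n

A = np.diag(x)
A[:n, n] = y                               # last column
A[n, :n] = y                               # last row

# prod(x_1..x_n) * (x_{n+1} - sum_i y_i^2 / x_i)
closed_form = np.prod(x[:n]) * (x[n] - np.sum(y**2 / x[:n]))
assert np.isclose(np.linalg.det(A), closed_form)
```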
Edit (second method, hint). This last expression suggests another method (maybe it's not workable): suppose that no $x_i$ is zero. Write $X = \mathrm{diag}(x_1, ..., x_{n+1})$ and $Y = A - X$, so that $A = X+Y = X(\mathrm{Id}+X^{-1}Y)$. Then $\mathrm{det}(A) = \mathrm{det}(X)\, \mathrm{det}(\mathrm{Id} + X^{-1}Y)$. Now all you have to do is compute $\mathrm{det}(\mathrm{Id} + X^{-1}Y)$. Maybe there's a simple way of doing this (I don't know).
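As a small sanity check of this factorization (numpy and the sample entries are my own scaffolding, not part of the answer), note that $A = X + Y$ gives $\mathrm{det}(A) = \mathrm{det}(X)\,\mathrm{det}(\mathrm{Id}+X^{-1}Y)$:

```python
import numpy as np

n = 4
x = np.array([2.0, 3.0, -1.0, 4.0, 5.0])   # x_1, ..., x_{n+1} (sample values)
y = np.array([1.0, -2.0, 3.0, 0.5])        # y_1, ..., y_n

A = np.diag(x)
A[:n, n] = y
A[n, :n] = y

X = np.diag(x)                             # diagonal part of A
Y = A - X                                  # border part
# A = X + Y = X (Id + X^{-1} Y)  =>  det(A) = det(X) det(Id + X^{-1} Y)
lhs = np.linalg.det(A)
rhs = np.linalg.det(X) * np.linalg.det(np.eye(n + 1) + np.linalg.inv(X) @ Y)
assert np.isclose(lhs, rhs)
```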
Best Answer
I guess that OP can refer to ``Eigenvalues of a matrix with only one non-zero row and non-zero column.''
Alternatively, we can directly compute the determinant $\det(\lambda I - A)$. We can rewrite the matrix $\lambda I - A$ as a block matrix:
\begin{equation} \lambda I - A = \begin{bmatrix} \lambda & 0 & 0 & \ldots & -x_{1} \\ 0 & \lambda & 0 & \ldots & -x_{2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ -x_{1} & -x_{2} & -x_{3} & \ldots & \lambda-x_{n}\end{bmatrix} = \begin{bmatrix} D & u \\ u^\top & \lambda-x_n \end{bmatrix} \end{equation}
where $u = -(x_1,x_2,\dots,x_{n-1})^\top$ and $D=\operatorname{diag}(\lambda, \dots, \lambda) = \lambda I_{n-1}$.
By the Schur complement formula, the determinant is
\begin{equation} \begin{aligned} \operatorname{det}(\lambda I - A) &= \operatorname{det}(D)\left(\lambda-x_n-u^\top D^{-1}u\right) \\ &= \lambda^{n-1}\left(\lambda-x_n-\lambda^{-1}\sum_{i=1}^{n-1}x_i^2\right). \end{aligned} \end{equation}
Setting the determinant to zero, we can compute the two non-zero eigenvalues:
\begin{equation} \lambda_{1,2} = \frac{x_n \pm \sqrt{x_n^2+4\sum_{i=1}^{n-1}x_i^2}}{2}. \end{equation}
Writing $w = (x_1,\dots,x_{n-1})^\top$, the corresponding eigenvectors are (before normalization)
\begin{equation} v_1 = \begin{bmatrix} \frac{w}{\lambda_1} \\ 1 \end{bmatrix}, \quad v_2 = \begin{bmatrix} \frac{w}{\lambda_2} \\ 1 \end{bmatrix} \end{equation}
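The eigenvalue formula and the eigenvector claim can be checked numerically; the sketch below (Python/numpy, with sample entries of my own) compares the predicted non-zero eigenvalues against `np.linalg.eigvalsh` and verifies one eigenvector directly.

```python
import numpy as np

n = 4
u = np.array([1.0, 2.0, 2.0])       # x_1, ..., x_{n-1} (sample values)
xn = 3.0                            # x_n

A = np.zeros((n, n))
A[:n - 1, n - 1] = u                # last column
A[n - 1, :n - 1] = u                # last row
A[n - 1, n - 1] = xn

s = np.sum(u**2)
# lambda = (x_n +/- sqrt(x_n^2 + 4 * sum x_i^2)) / 2
predicted = sorted([(xn - np.sqrt(xn**2 + 4 * s)) / 2,
                    (xn + np.sqrt(xn**2 + 4 * s)) / 2])
nonzero = sorted(e for e in np.linalg.eigvalsh(A) if abs(e) > 1e-9)
assert np.allclose(nonzero, predicted)

# Verify the eigenvector (u / lambda, 1) for the larger eigenvalue.
lam = predicted[1]
v1 = np.append(u / lam, 1.0)
assert np.allclose(A @ v1, lam * v1)
```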