Adding or subtracting one row of a matrix to or from another does not change the determinant, so we may assume each column of the matrix has at most one entry equal to $1$.
Swapping two rows of a matrix only changes the sign of the determinant, so if we perform row swaps until the resulting matrix is diagonal, we'll have determined the determinant up to sign.
So now we have a diagonal matrix whose diagonal entries are either 1 or 0.
The determinant of this matrix must be $0$ or $1$; and hence, the determinant of the original matrix must be $0$, $1$, or $-1$.
(The $-1$ possibility can arise: start with the identity matrix and interchange the last two rows. The 0 possibility can arise: start with a matrix whose first column is all $1$'s. And, of course, the identity matrix shows that $1$ is a possible value of the determinant.)
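The three examples above can be checked directly. A minimal Python sketch (the helper `det` and the sample matrices are my own, not from the answer; for the $0$ case I picked a $0$–$1$ matrix whose first column is all $1$'s and which has two equal rows, forcing a zero determinant):

```python
# Throwaway check of the three example determinants (names are mine).

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
swapped  = [[1, 0, 0], [0, 0, 1], [0, 1, 0]]   # identity with last two rows interchanged
ones_col = [[1, 1, 0], [1, 1, 0], [1, 0, 1]]   # first column all 1's; rows 1 and 2 equal

print(det(identity), det(swapped), det(ones_col))  # 1 -1 0
```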
For aesthetic reasons suggested by Michael Hoppe, I'll rename $K$ as $n$ and $y_{K+1}$ as $x_{n+1}$, so the matrix whose determinant you're looking for is
$$A=\begin{pmatrix}
x_1 & 0 & \dots & 0 & y_1 \\
0 & x_2 & \dots & 0 & y_2 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \dots & x_n & y_n \\
y_1 & y_2 & \dots & y_n & x_{n+1}
\end{pmatrix}.
$$
First method. Use cofactor expansion along the last column to get
$$\mathrm{det}(A) = x_1 \cdots x_n x_{n+1} + \sum_{i=1}^n(-1)^{n+1+i} y_i\, \mathrm{det}(A_i)$$
where $A_i$ is the matrix $A$ deprived of its last column and $i$-th row. For example,
$$A_1 = \begin{pmatrix}
0 & x_2 & 0 & \dots & 0 \\
0 & 0 & x_3 & \dots & 0 \\
\vdots & \vdots & & \ddots & \vdots \\
0 & 0 & 0 & \dots & x_n \\
y_1 & y_2 & y_3 & \dots & y_n
\end{pmatrix}. $$
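For instance, the only nonzero entry in the first column of $A_1$ is $y_1$ in the last row, so $\mathrm{det}(A_1) = (-1)^{n+1} y_1 x_2 \cdots x_n$. A throwaway Python check on sample data (the names `det`, `a1`, `xs`, `ys` are mine, not from the answer):

```python
# Check det(A_1) = (-1)^{n+1} y_1 x_2 ... x_n on concrete integers.

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

n = 4
xs = [2, 3, 5, 7]    # x_1 .. x_n
ys = [1, -2, 4, 6]   # y_1 .. y_n

# A_1: rows x_2 e_2, ..., x_n e_n, then the row (y_1, ..., y_n)
a1 = [[xs[i] if j == i else 0 for j in range(n)] for i in range(1, n)]
a1.append(ys[:])

expected = (-1) ** (n + 1) * ys[0] * xs[1] * xs[2] * xs[3]
print(det(a1) == expected)  # True
```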
Expanding $A_i$ along its last row, it is easy to see that $\mathrm{det}(A_i)$ is equal to $(-1)^{n+i} y_i \prod_{j \neq i} x_j$, which yields
\begin{align*}\mathrm{det}(A) &= \prod_{i=1}^{n+1} x_i - \sum_{i=1}^n y_i^2 \prod_{j \neq i, j\leqslant n}x_j \\
&= \prod_{i=1}^{n}x_i \left( x_{n+1} - \sum_{i=1}^n \frac{y_i^2}{x_i}\right),
\end{align*}
where the second equality assumes each $x_i$ with $i \leqslant n$ is nonzero (the first line holds in general).
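As a sanity check, here is a short Python sketch (the names `det`, `arrowhead`, and the sample values are mine) comparing a direct cofactor determinant of $A$ with the closed form $\prod_{i=1}^{n+1} x_i - \sum_{i=1}^n y_i^2 \prod_{j \neq i,\, j \leqslant n} x_j$ on sample data:

```python
# Compare a brute-force determinant of the arrowhead matrix A with the
# closed form derived above (all names here are my own).
from math import prod

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def arrowhead(xs, ys, x_last):
    """The (n+1) x (n+1) matrix A: diag(xs) bordered by ys and x_{n+1}."""
    n = len(xs)
    rows = [[xs[i] if j == i else 0 for j in range(n)] + [ys[i]]
            for i in range(n)]
    rows.append(list(ys) + [x_last])
    return rows

xs, ys, x_last = [2, 3, 5], [1, -2, 4], 7
closed_form = prod(xs) * x_last - sum(
    ys[i] ** 2 * prod(xs[:i] + xs[i + 1:]) for i in range(len(xs)))

print(det(arrowhead(xs, ys, x_last)), closed_form)  # 59 59
```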
Edit (second method, hint). This last expression suggests another method (maybe it doesn't work out): suppose that no $x_i$ is zero. Write $X = \mathrm{diag}(x_1, \dots, x_{n+1})$ and $Y = A - X$, so that $A = X+Y = X(\mathrm{Id}+X^{-1}Y)$. Then $\mathrm{det}(A) = \mathrm{det}(X)\, \mathrm{det}(\mathrm{Id} + X^{-1}Y)$. Now, all you have to do is compute $\mathrm{det}(\mathrm{Id} + X^{-1}Y)$. Maybe there's a simple way of doing this (I don't know).
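Since $A = X(\mathrm{Id}+X^{-1}Y)$, the identity $\mathrm{det}(A) = \mathrm{det}(X)\,\mathrm{det}(\mathrm{Id}+X^{-1}Y)$ can at least be verified numerically. A sketch with exact rational arithmetic (all names are mine, not from the answer):

```python
# Check det(A) = det(X) * det(Id + X^{-1} Y) on sample data, exactly.
from fractions import Fraction
from math import prod

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

xs = [Fraction(k) for k in (2, 3, 5, 7)]   # x_1 .. x_{n+1}, all nonzero
ys = [Fraction(k) for k in (1, -2, 4)]     # y_1 .. y_n
n = len(ys)

# A as above; X = diag(x_1, ..., x_{n+1}); Y = A - X (the off-diagonal part)
a = [[xs[i] if j == i else Fraction(0) for j in range(n)] + [ys[i]]
     for i in range(n)]
a.append(ys + [xs[n]])
y_mat = [[Fraction(0) if i == j else a[i][j] for j in range(n + 1)]
         for i in range(n + 1)]

# X is diagonal, so X^{-1}Y just divides row i of Y by x_i;
# Id + X^{-1}Y then puts 1's back on the diagonal.
m = [[Fraction(1 if i == j else 0) + y_mat[i][j] / xs[i]
      for j in range(n + 1)] for i in range(n + 1)]

print(det(a) == prod(xs) * det(m))  # True
```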
Best Answer
The matrix $$\begin{pmatrix}1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9\end{pmatrix}$$ works just fine: its determinant is $0$.
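A quick check that this matrix is indeed singular (the helper `det3` is my own name):

```python
# 3x3 determinant by the rule of Sarrus.
def det3(m):
    return (m[0][0] * m[1][1] * m[2][2] + m[0][1] * m[1][2] * m[2][0]
            + m[0][2] * m[1][0] * m[2][1] - m[0][2] * m[1][1] * m[2][0]
            - m[0][0] * m[1][2] * m[2][1] - m[0][1] * m[1][0] * m[2][2])

print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 0
```

(Each row is an arithmetic progression: row 1 minus twice row 2 plus row 3 is the zero row.)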