Calculus – How to Find the Equation of a Decision Boundary in Logistic Regression

calculus, logistic regression, machine learning, statistics

I'm new to machine learning and currently working on logistic regression, but I don't know how to deal with this problem. Consider logistic regression for a dataset $\{(x_i,y_i)\}_{i=1}^n\ (x_i \in \mathbb R^d,\ y_i \in \{+1,-1\})$. Let $\Phi(x)=(\phi_1(x),\dots,\phi_p(x))^T$ be a $p$-dimensional vector of basis functions of $x\in\mathbb R^d$, and let $\theta\in\mathbb R^p$ be a parameter vector. A probabilistic model is defined as $$p(y|x)=\frac{1}{1+\exp(-y\theta^T\Phi(x))}$$
$$(p(+1|x)=\frac{1}{1+\exp(-\theta^T\Phi(x))}=\frac{\exp(\theta^T\Phi(x))}{1+\exp(\theta^T\Phi(x))},p(-1|x)=\frac{1}{1+\exp(\theta^T\Phi(x))})$$
and logistic regression estimates the parameter $\theta$ by maximizing the log-likelihood $L(\theta)=\sum_{i=1}^n\log p(y_i|x_i,\theta)$.
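As a concrete illustration of this setup, here is a minimal sketch (not from the original post) that maximizes $L(\theta)$ by plain gradient ascent, using the gradient $\nabla L(\theta)=\sum_i y_i\,\Phi(x_i)\,\sigma(-y_i\theta^T\Phi(x_i))$ where $\sigma$ is the logistic function. The affine feature map `phi` and the hyperparameters are assumptions chosen for the example:

```python
import math

def sigmoid(z):
    # numerically stable logistic function sigma(z) = 1 / (1 + exp(-z))
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def phi(x):
    # hypothetical feature map Phi(x): affine features (1, x_1, ..., x_d)
    return [1.0] + list(x)

def fit_logistic(data, p, lr=0.1, iters=2000):
    """Maximize L(theta) = sum_i log p(y_i | x_i, theta) by gradient ascent.

    data: list of (x, y) pairs with y in {+1, -1}; p: feature dimension.
    """
    theta = [0.0] * p
    for _ in range(iters):
        grad = [0.0] * p
        for x, y in data:
            f = phi(x)
            # grad of log sigma(y theta^T Phi(x)) is y Phi(x) sigma(-y theta^T Phi(x))
            margin = y * sum(t * fj for t, fj in zip(theta, f))
            w = sigmoid(-margin)
            for j in range(p):
                grad[j] += y * w * f[j]
        theta = [t + lr * g for t, g in zip(theta, grad)]
    return theta

def p_plus(theta, x):
    # p(+1 | x) = sigma(theta^T Phi(x))
    return sigmoid(sum(t * fj for t, fj in zip(theta, phi(x))))

# toy 1-d dataset: negative labels on the left, positive on the right
data = [((-2.0,), -1), ((-1.0,), -1), ((1.0,), +1), ((2.0,), +1)]
theta = fit_logistic(data, p=2)
```

After fitting, `p_plus(theta, (2.0,))` is close to 1 and `p_plus(theta, (-2.0,))` is close to 0, as expected for this separable data.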

Then, how can I find an equation representing the decision boundary which satisfies $p(+1|x)=p(-1|y)$?

Best Answer

The decision boundary follows from your last sentence:

Then, how can I found an equation representing a decision boundary which satisfy $p(+1|x)= p(−1|x)$?

(correcting $p(-1|y)$ to $p(-1|x)$). The decision boundary is given by the set $$\left\{x\in\mathbb{R}^d:p(+1|x) = p(-1|x)\right\}.$$ Expanding the condition, and using that $\exp$ is injective, \begin{align*} \frac{1}{1+\exp(-\theta^T\Phi(x))} &= \frac{1}{1+\exp(\theta^T\Phi(x))} \\ 1+\exp(-\theta^T\Phi(x)) &= 1+\exp(\theta^T\Phi(x)) \\ -\theta^T\Phi(x) &= \theta^T\Phi(x)\\ \Rightarrow \theta^T\Phi(x) &= 0. \end{align*} And so the decision boundary is given by the set $$\{x\in\mathbb{R}^d:\theta^T\Phi(x) = 0\}.$$
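To see the conclusion numerically, here is a short sketch (my own illustration, not from the answer) that checks $p(+1|x)=1/2$ at points on the set $\theta^T\Phi(x)=0$. The parameter values and the affine feature map $\Phi(x)=(1,x_1,x_2)^T$ are hypothetical:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical fitted parameters, with affine features Phi(x) = (1, x1, x2)
theta = [1.0, 2.0, -1.0]

def p_plus(x):
    # p(+1 | x) = sigma(theta^T Phi(x))
    z = theta[0] + theta[1] * x[0] + theta[2] * x[1]
    return sigmoid(z)

# here theta^T Phi(x) = 0 is the line 1 + 2*x1 - x2 = 0, i.e. x2 = 1 + 2*x1
boundary_probs = []
for x1 in (-1.0, 0.0, 3.0):
    x = (x1, 1.0 + 2.0 * x1)   # a point on the decision boundary
    boundary_probs.append(p_plus(x))
```

Every entry of `boundary_probs` equals 0.5: on the boundary both classes are equally likely, and with these affine features the boundary is a hyperplane (for a nonlinear $\Phi$ it is a curved surface in $\mathbb R^d$).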
