Solved – How to model a hidden Markov model with a changing transition probability

baum-welch, classification, hidden markov model

I have a series of observations, each of which falls into one of two outcomes, 0 or 1. Each observation has an associated observation time, as well as additional features I can gather for it. I am modeling this with two hidden states, A and B, each of which has some probability of emitting 0 or 1, and with some unknown transition probability between the states. This transition probability varies with time and is correlated with the observation features.

How would I go about modeling this? My experience with HMMs is with fixed transition probabilities (e.g. with the Viterbi algorithm). Given a new observation, I want to be able to predict the hidden state as well as the transition probability. I would also like to generalize this model / use it as a prior for other, similar models, each with a different set of observations.
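For concreteness, here is a minimal sketch of the generative process I have in mind; the logistic link from the features to the transition probability and all of the parameter values are just placeholders for how the dependence could look, not something I have settled on:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hidden states (A = 0, B = 1), each with its own probability of emitting a 1.
emit_p = np.array([0.2, 0.8])      # P(obs = 1 | A), P(obs = 1 | B)
w_stay = np.array([1.0, -0.5])     # placeholder weights: features -> logit of staying put

def simulate(features):
    """features: (T, d) array of per-observation features."""
    T = features.shape[0]
    states = np.zeros(T, dtype=int)
    obs = np.zeros(T, dtype=int)
    states[0] = rng.integers(2)
    obs[0] = rng.random() < emit_p[states[0]]
    for t in range(1, T):
        # The transition probability varies over time because it depends on the features.
        p_stay = 1.0 / (1.0 + np.exp(-features[t] @ w_stay))
        states[t] = states[t - 1] if rng.random() < p_stay else 1 - states[t - 1]
        obs[t] = rng.random() < emit_p[states[t]]
    return states, obs

states, obs = simulate(rng.normal(size=(100, 2)))
```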

Edit: I have discovered that what I am looking for is a variant of Baum-Welch that uses additional feature data, not just the observation sequences. How can I use my additional data in predicting the states?

Best Answer

What you are after, I believe, is a maximum entropy (i.e. logistic regression) Markov model: you predict the transition probability using logistic regression on the previous state and the observations. There is apparently a way of training these without knowing the hidden states, presumably by expectation maximisation.
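A minimal sketch of how that could look, assuming two states, Bernoulli emissions, and a logistic link from the features to the probability of switching state (the parameterisation and function names are my own, not a standard library API). It covers the E-step: forward-backward with a transition matrix recomputed at every step from the features. An M-step would then re-estimate the emission probabilities from the posteriors and refit the logistic-regression weights from the expected transitions, iterating the two Baum-Welch style.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def transition_matrix(feat, W):
    """2x2 transition matrix at one time step.
    W[k] are logistic-regression weights for leaving state k given the features."""
    p_switch = sigmoid(feat @ W.T)                 # P(switch | prev state k), k = 0, 1
    return np.array([[1 - p_switch[0], p_switch[0]],
                     [p_switch[1], 1 - p_switch[1]]])

def forward_backward(obs, feats, emit_p, W, init=np.array([0.5, 0.5])):
    """E-step: posterior state probabilities under feature-dependent transitions."""
    obs = np.asarray(obs)
    T = len(obs)
    lik = np.where(obs[:, None] == 1, emit_p, 1 - emit_p)   # (T, 2) emission likelihoods
    alpha = np.zeros((T, 2)); beta = np.ones((T, 2))
    alpha[0] = init * lik[0]; alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        A = transition_matrix(feats[t], W)                   # transitions change every step
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        A = transition_matrix(feats[t + 1], W)
        beta[t] = A @ (lik[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta                                     # posterior state probabilities
    return gamma / gamma.sum(axis=1, keepdims=True)
```

For prediction on a new observation you would build the transition matrix from its features, propagate the last filtered distribution (alpha) one step forward, and read off both the predicted state and the transition probability you used.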
