
# Algorithm for Determining the Maximum Likelihood Estimate of Hidden Markov Models Having Tied Parameters

IP.com Disclosure Number: IPCOM000034342D
Original Publication Date: 1989-Feb-01
Included in the Prior Art Database: 2005-Jan-27
Document File: 3 page(s) / 16K

IBM

## Related People

Brown, PF: AUTHOR [+4]

## Abstract

A technique is described whereby the maximum likelihood algorithms for the tied parameters, based on complete data, are used to produce maximum likelihood algorithms based on incomplete data (output sequence only) and the EM algorithm. In the use of interpolated estimation of Markov source parameters from sparse data in a hidden Markov model [*], with multinomial outputs, the pooling of counts for tied probabilities defines a valid maximum likelihood algorithm for the tied model based on incomplete data. The algorithm is generally based on the Baum-Welch re-estimation algorithm, which is a particular instance of the general EM algorithm as is discussed in statistical literature.
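The pooled-count idea can be illustrated with a small sketch. Here the expected output counts, the number of states, and the tie classes are assumed example values, not taken from the disclosure; the M-step pools the expected counts within each tie class before normalizing, so tied states receive one shared multinomial estimate:

```python
import numpy as np

# Hypothetical expected output counts E[n(state, symbol)] from one E-step
# (rows: 4 states, columns: 3 output symbols).  Values are assumed.
expected_counts = np.array([
    [2.0, 1.0, 1.0],
    [4.0, 2.0, 2.0],
    [1.0, 3.0, 0.0],
    [3.0, 9.0, 0.0],
])

# Tying: states 0 and 1 share one output distribution, states 2 and 3 another.
ties = [[0, 1], [2, 3]]

# M-step with tying: pool the expected counts within each tie class,
# normalize once, and assign the shared estimate to every member state.
b = np.empty_like(expected_counts)
for group in ties:
    pooled = expected_counts[group].sum(axis=0)   # pool counts across tied states
    b[group] = pooled / pooled.sum()              # shared multinomial estimate

print(b)
```

Untying is the special case of singleton tie classes, in which the pooling step is the ordinary Baum-Welch count normalization per state.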

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 53% of the total text.

Page 1 of 3


A technique is described whereby the maximum likelihood algorithms for the tied parameters, based on complete data, are used to produce maximum likelihood algorithms based on incomplete data (output sequence only) and the EM algorithm. In the use of interpolated estimation of Markov source parameters from sparse data in a hidden Markov model [*], with multinomial outputs, the pooling of counts for tied probabilities defines a valid maximum likelihood algorithm for the tied model based on incomplete data. The algorithm is generally based on the Baum-Welch re-estimation algorithm, which is a particular instance of the general EM algorithm as discussed in the statistical literature.

When the outputs have a parametric conditional distribution of the exponential form (including multinomial, Gaussian, correlated Gaussian, multivariate Gaussian, correlated multivariate Gaussian time series, and others), it may be necessary to tie the parameters in a variety of ways. The concept described herein provides a means of obtaining the maximum likelihood estimates whenever the tying is such that the "spirit" of the EM formulation is satisfied, or whenever the complete data maximum likelihood problem is tractable. The algorithm is based on the following observations:

Observation 1 -

When the output distribution is of the exponential family, then so is the complete data distribution. This follows immediately upon observing that the marginal distribution of the hidden state sequence is itself of the exponential form. Let

X = (S, Y) = (hidden state sequence, observable sequence)   (1)

be the complete data. Then X has a probability element of the form

f_r(x) = c(r) n(x) e^(r^T c(x))   (2)

where T denotes transpose, r is a vector of unknown parameters of known length k, and c is a vector-valued integrable function of length k. The components of c, that is, c_1(x), ..., c_k(x), are the complete data sufficient statistics for the problem. The components of r are related to the usual parameters, such as transition probabilities and conditional low-order moments of the observable, in a one-to-one way.
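As a concrete check of the form (2), the multinomial output case can be written this way with natural parameters r_i = log p_i, sufficient statistics c(x) equal to the cell counts, base measure n(x) the multinomial coefficient, and c(r) = 1 when the p_i sum to one. The cell probabilities and counts below are assumed example values:

```python
import math
import numpy as np

# Sketch: the multinomial distribution in the exponential form
# f_r(x) = c(r) n(x) e^(r^T c(x)) of equation (2).  Example values assumed.
p = np.array([0.5, 0.3, 0.2])        # multinomial cell probabilities
counts = np.array([2, 1, 1])         # c(x): the counts are the sufficient statistics
n = int(counts.sum())

r = np.log(p)                        # natural parameters r_i = log p_i
n_x = math.factorial(n) / math.prod(math.factorial(int(k)) for k in counts)  # base measure n(x)
f = 1.0 * n_x * math.exp(r @ counts)  # c(r) = 1 here because the p_i sum to 1

direct = n_x * np.prod(p ** counts)   # ordinary multinomial probability, for comparison
print(round(f, 6), round(direct, 6))
```

The two evaluations agree, which is all the exponential-family rewriting claims.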

Observation 2 -

Expectation (conditional or not) is a linear operator on the space of random variables.

Observation 3 -

The complete data log likelihood function for this problem is (up to an additive constant)

r^T c(x) + log c(r)   (3)

which is a linear function of the random vector c.
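Observations 2 and 3 together mean the E-step needs only the conditional expectation of the sufficient statistics: taking the expectation of (3) given the observed data commutes with the linear function of c. A small numerical sketch, with an assumed toy posterior over two hidden completions (not from the disclosure):

```python
import numpy as np

# Because r^T c(x) + log c(r) is linear in c, its conditional expectation
# given the observed data y only needs E[c(X) | y].  Toy values assumed.
r = np.array([0.2, -0.5])
log_c_r = -1.3                        # log c(r), a constant in x
c_values = np.array([[1.0, 3.0],      # c(x1) for hidden completion x1
                     [2.0, 1.0]])     # c(x2) for hidden completion x2
post = np.array([0.4, 0.6])           # posterior P(x_i | y)

# Expectation of the linear function, computed two ways:
lhs = post @ (c_values @ r + log_c_r)   # E[r^T c(X) + log c(r) | y]
rhs = r @ (post @ c_values) + log_c_r   # r^T E[c(X) | y] + log c(r)
print(np.isclose(lhs, rhs))             # the two agree by linearity
```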



Observation 4 -

The EM algorithm produces a sequence of iterates r_n which converge to the desired maximum likelihood estimate based on incomplete data:

r̂_incomplete(y) = lim_{n→∞} r_n   (4)

If the parameter r is unconstrained (no tying) in the sense that the parameter space of its possible values includes a k-dimensional rectangle (a "regular exponential family"), then the maximization step of the EM algorithm can be accomplished by solving for r_{n+1} a system of equalities between the condi...
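The convergence of the iterates in (4) can be seen on a classic incomplete-multinomial example often used to illustrate EM (the genetic-linkage counts; this example is not taken from the disclosure). The observed counts y = (125, 18, 20, 34) come from cells with probabilities (1/2 + t/4, (1-t)/4, (1-t)/4, t/4), where the first observed cell is the sum of two hidden cells with probabilities 1/2 and t/4, so the complete data are multinomial and the M-step is a closed-form count ratio:

```python
# EM iterates r_n converging to the incomplete-data MLE, as in equation (4).
y = (125, 18, 20, 34)
t = 0.5                                     # initial iterate r_0
for _ in range(100):
    # E-step: expected split of the first observed cell into its hidden part
    x12 = y[0] * (t / 4) / (0.5 + t / 4)    # E[count in hidden t/4 cell | y, t]
    # M-step: maximize the complete-data multinomial likelihood in t
    t = (x12 + y[3]) / (x12 + y[1] + y[2] + y[3])
print(round(t, 4))                          # prints 0.6268
```

The limit agrees with the root of the incomplete-data likelihood equation, which is what (4) asserts in general for the regular exponential family.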