Abstract
We propose a novel method to obtain the N-best list of hypotheses in a hidden Markov model (HMM). We show that all the information needed to compute the N-best list from the HMM trellis graph is encapsulated in entities that can be computed in a single forward-backward iteration, the same pass that usually yields the most likely state sequence. The hypothesis list can then be extracted sequentially from these entities without referring back to the original HMM data. Furthermore, our approach can yield significant savings in computational time compared to traditional methods.
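For context, the sketch below shows the standard N-best (list) Viterbi baseline that the abstract contrasts with: it keeps the top-N partial paths per trellis state and backtracks at the end. This is not the forward-backward-based method proposed in the paper; the function name, the toy model, and all parameters are illustrative assumptions.

```python
# Minimal sketch of standard N-best Viterbi decoding on an HMM trellis.
# NOT the forward-backward-based method of the paper; shown only as the
# traditional baseline it is compared against.  All names are hypothetical.
import numpy as np

def n_best_viterbi(pi, A, B, obs, n):
    """Return up to n most likely state sequences and their probabilities.

    pi  : (S,)   initial state distribution
    A   : (S, S) transition probabilities, A[i, j] = P(state j | state i)
    B   : (S, V) emission probabilities,  B[s, v] = P(symbol v | state s)
    obs : list of observation indices
    n   : number of hypotheses to return
    """
    S, T = len(pi), len(obs)
    # paths[t][s]: up to n entries (score, prev_state, prev_rank),
    # sorted by decreasing score.
    paths = [[[] for _ in range(S)] for _ in range(T)]
    for s in range(S):
        paths[0][s] = [(pi[s] * B[s, obs[0]], None, None)]
    for t in range(1, T):
        for s in range(S):
            cands = []
            for ps in range(S):
                for rank, (score, _, _) in enumerate(paths[t - 1][ps]):
                    cands.append((score * A[ps, s] * B[s, obs[t]], ps, rank))
            paths[t][s] = sorted(cands, reverse=True)[:n]
    # Gather the best terminal entries over all states, then backtrack.
    finals = sorted(
        ((score, s, rank)
         for s in range(S)
         for rank, (score, _, _) in enumerate(paths[T - 1][s])),
        reverse=True)[:n]
    results = []
    for score, s, rank in finals:
        seq, t = [], T - 1
        while s is not None:
            seq.append(s)
            _, s, rank = paths[t][s][rank]  # follow backpointer
            t -= 1
        results.append((list(reversed(seq)), score))
    return results

# Toy 2-state example (purely illustrative numbers).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
for seq, p in n_best_viterbi(pi, A, B, obs=[0, 1, 1], n=3):
    print(seq, p)
```

The per-state top-N bookkeeping above scales the work of a single Viterbi pass by roughly a factor of N; the paper's contribution is to obtain the list from quantities already available after one forward-backward iteration.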
| Original language | English |
| --- | --- |
| Pages (from-to) | 1280-1285 |
| Number of pages | 6 |
| Journal | IJCAI International Joint Conference on Artificial Intelligence |
| State | Published - 2001 |
| Externally published | Yes |
| Event | 17th International Joint Conference on Artificial Intelligence, IJCAI 2001 - Seattle, WA, United States |
| Event duration | 4 Aug 2001 → 10 Aug 2001 |