Article

Variable n-grams and extensions for conversational speech language modeling

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/89.817454

Keywords

language modeling; spontaneous speech; variable n-grams

Abstract

Recent progress in variable n-gram language modeling provides an efficient representation of n-gram models and makes training of higher-order n-grams possible. In this paper, we apply the variable n-gram design algorithm to conversational speech, extending the algorithm to learn skips and context-dependent classes so that it can handle conversational speech characteristics such as filler words, repetitions, and other disfluencies. Experiments show that the extended variable n-gram yields a language model that captures 4-gram context with less than half the parameters of a standard trigram, while also improving test-set perplexity and recognition accuracy.

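The abstract describes a variable n-gram: an n-gram model in which the context length retained for prediction varies across histories, so frequent histories keep long (up to 4-gram) contexts while rare ones are pruned away, giving far fewer parameters than a full fixed-order model. The Python sketch below is a minimal, hypothetical illustration of that idea only; it uses simple count-threshold pruning in place of the paper's design algorithm, omits smoothing and the skip/class extensions for disfluencies, and all names (train_variable_ngram, predict, min_count) are invented for this example.

from collections import defaultdict

def train_variable_ngram(sentences, max_order=4, min_count=2):
    # Count next-word distributions for every context length from 0 (unigram)
    # up to max_order - 1 preceding words.
    counts = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        tokens = ["<s>"] * (max_order - 1) + sent + ["</s>"]
        for i in range(max_order - 1, len(tokens)):
            word = tokens[i]
            for k in range(max_order):
                context = tuple(tokens[i - k:i])
                counts[context][word] += 1
    # Keep a context only if it occurs often enough; rare long histories are
    # pruned, so different branches of the model end up with different orders.
    return {ctx: dict(dist) for ctx, dist in counts.items()
            if sum(dist.values()) >= min_count}

def predict(model, history, max_order=4):
    # Back off to the longest retained context that matches the history.
    for k in range(max_order - 1, -1, -1):
        context = tuple(history[-k:]) if k else ()
        if context in model:
            dist = model[context]
            total = sum(dist.values())
            return {w: c / total for w, c in dist.items()}, context
    return {}, ()

data = [["i", "mean", "i", "think", "so"],
        ["i", "think", "so", "too"],
        ["you", "know", "i", "think", "so"]]
model = train_variable_ngram(data)
print(predict(model, ["you", "know", "i", "think"]))
# Backs off from the unseen 3-word history to the retained ("i", "think") context.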