Journal
SPEECH COMMUNICATION
Volume 38, Issue 1-2, Pages 19-28
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/S0167-6393(01)00041-3
Keywords
language model training; perplexity; correlation with word error rate
Many groups have investigated the relationship between word error rate and the perplexity of language models. This issue is of central interest because perplexity optimization can be performed independently of a recognizer, and in most cases simple perplexity optimization procedures can be found. Moreover, many tasks in language model training, such as the optimization of word classes, may use perplexity as the target function, yielding explicit optimization formulas that are not available when error rates are used as the target. This paper first presents some theoretical arguments for a close relationship between perplexity and word error rate. Thereafter, the notion of the uncertainty of a measurement is introduced and is then used to test the hypothesis that word error rate and perplexity are correlated by a power law. There is no evidence to reject this hypothesis. (C) 2002 Elsevier Science B.V. All rights reserved.
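The power-law hypothesis described above states that word error rate and perplexity are related as WER ≈ a · PP^b, which becomes a linear relation in log-log space. A minimal sketch of how such a fit could be carried out is shown below; the perplexity formula is standard, but the function names and the synthetic data points are illustrative assumptions, not the paper's actual experimental values.

```python
import math

def perplexity(log_probs):
    # Standard definition: exp of the negative mean per-word
    # natural log probability assigned by the language model.
    return math.exp(-sum(log_probs) / len(log_probs))

def fit_power_law(pps, wers):
    # Fit WER ~ a * PP^b by ordinary least squares in log-log space,
    # where the relation log(WER) = log(a) + b * log(PP) is linear.
    xs = [math.log(p) for p in pps]
    ys = [math.log(w) for w in wers]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical illustration: synthetic WER values generated from an
# exact power law with a = 0.05 and b = 0.3 (not data from the paper).
pps = [50.0, 100.0, 200.0, 400.0]
wers = [0.05 * p ** 0.3 for p in pps]
a, b = fit_power_law(pps, wers)
```

Because the synthetic points lie exactly on a power law, the log-log regression recovers the parameters a and b up to floating-point precision; with real recognizer measurements, the residuals of this fit are what the paper's uncertainty analysis would assess.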