Journal
IEEE ACCESS
Volume 7, Pages 52769-52777
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2911320
Keywords
Tibetan question answering; hybrid network; convolutional neural network; long short-term memory network; language model
Funding
- National Nature Science Foundation of China [61501529, 61331013]
- National Language Committee Project of China [ZDI125-36]
Abstract
Research on question answering (QA) with deep learning methods is currently a hot topic in natural language processing. Most of this research has focused on English or Chinese, for which large-scale open corpora such as WikiQA or DoubanQA are available. How to apply deep learning methods to QA in low-resource languages such as Tibetan therefore remains a challenge. In this paper, we propose a hybrid network model for Tibetan QA that combines a convolutional neural network (CNN) with a long short-term memory (LSTM) network to extract effective features from small-scale corpora. Because Tibetan has strong grammatical rules, we use a language model to decode the output of the LSTM layer, which makes the answers more accurate and fluent. We also add batch normalization to accelerate deep-network training and prevent overfitting. Experiments show that the ACC@1 of the proposed model on Tibetan QA is 126.2% higher than that of the baseline model.
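To make the hybrid architecture concrete, the following is a minimal NumPy sketch of the pipeline the abstract describes: a 1-D convolution extracts local features from token embeddings, batch normalization standardizes those features, and an LSTM cell summarizes the sequence. All dimensions, initializations, and the forward-only formulation are hypothetical illustrations, not the paper's actual configuration (which also includes language-model decoding of the LSTM output).

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, W, b):
    # Valid 1-D convolution. x: (T, d_in), W: (k, d_in, d_out), b: (d_out,)
    k = W.shape[0]
    T_out = x.shape[0] - k + 1
    return np.array([np.tensordot(x[t:t + k], W, axes=([0, 1], [0, 1])) + b
                     for t in range(T_out)])

def batch_norm(h, eps=1e-5):
    # Simplified batch normalization over time steps (no learned scale/shift).
    mu, var = h.mean(axis=0), h.var(axis=0)
    return (h - mu) / np.sqrt(var + eps)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm(x, Wx, Wh, b):
    # Single-layer LSTM; gates packed as [i, f, g, o]. Returns final hidden state.
    d_h = Wh.shape[0]
    h, c = np.zeros(d_h), np.zeros(d_h)
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)       # cell state update
        h = o * np.tanh(c)               # hidden state output
    return h

# Hypothetical sizes: 10 tokens, 16-dim embeddings, 8 conv filters, width-3 kernel, 8 hidden units.
T, d_emb, d_conv, d_h, k = 10, 16, 8, 8, 3
x = rng.normal(size=(T, d_emb))                    # stand-in token embeddings
Wc = rng.normal(size=(k, d_emb, d_conv)) * 0.1
bc = np.zeros(d_conv)
Wx = rng.normal(size=(d_conv, 4 * d_h)) * 0.1
Wh = rng.normal(size=(d_h, 4 * d_h)) * 0.1
bl = np.zeros(4 * d_h)

feats = batch_norm(conv1d(x, Wc, bc))              # CNN features, batch-normalized
h_final = lstm(feats, Wx, Wh, bl)                  # LSTM summary of the question
print(h_final.shape)                               # (8,)
```

In a full QA system this final hidden state would feed an answer-scoring or decoding stage; per the abstract, the authors additionally rescore the LSTM output with a language model to exploit Tibetan's strong grammatical regularities.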