Proceedings Paper

An Analysis of BERT in Document Ranking

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/3397271.3401325

Keywords

neural networks; explainability; document ranking

Funding

  1. National Key Research and Development Program of China [2018YFC0831700]
  2. Natural Science Foundation of China [61732008, 61532011, 61902209]
  3. Beijing Academy of Artificial Intelligence (BAAI)

Abstract

Although BERT has shown its effectiveness in a number of IR-related tasks, especially document ranking, the understanding of its internal mechanism remains insufficient. To increase the explainability of the ranking process performed by BERT, we investigate a state-of-the-art BERT-based ranking model, focusing on its attention mechanism and interaction behavior. First, we examine how the attention distribution evolves and find that at each step BERT dumps redundant attention weights on tokens with high document frequency (such as periods). This may pose a threat to model robustness and should be considered in future studies. Second, we study how BERT models interactions between query and document, and find that BERT aggregates document information into query token representations through these interactions, but extracts query-independent representations for document tokens. This suggests that BERT could be transformed into a more efficient representation-focused model. These findings help us better understand the ranking process performed by BERT and may inspire future improvements.
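The diagnostic described in the abstract, measuring how much attention mass lands on high-document-frequency tokens such as periods, can be sketched with a toy example. This is an illustrative sketch only, not the paper's code or a real BERT model: the `scaled_dot_product_attention` and `attention_received` helpers and the random query/key projections are assumptions for demonstration; on a real model one would aggregate the actual per-layer attention matrices the same way.

```python
import numpy as np

def scaled_dot_product_attention(Q, K):
    """Softmax-normalized attention weights; each row is a query position."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

def attention_received(weights):
    """Average attention each key position receives across all query positions."""
    return weights.mean(axis=0)

# Toy sequence in the [CLS] query [SEP] document [SEP] layout studied in the paper.
tokens = ["[CLS]", "what", "is", "bert", "[SEP]",
          "bert", "is", "a", "model", ".", "[SEP]"]
rng = np.random.default_rng(0)
Q = rng.normal(size=(len(tokens), 8))  # toy query projections (not real weights)
K = rng.normal(size=(len(tokens), 8))  # toy key projections (not real weights)

W = scaled_dot_product_attention(Q, K)      # (seq_len, seq_len) attention matrix
received = attention_received(W)            # attention mass each token receives
period_share = received[tokens.index(".")]  # the period's share of attention
```

Summing the received attention per token type over many inputs is one way to surface the redundant mass that the paper reports accumulating on punctuation.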
