Proceedings Paper

DG-Trans: Automatic Code Summarization via Dynamic Graph Attention-based Transformer

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/QRS54544.2021.00088

Keywords

automatic code summarization; graph neural network; dynamic graph attention; subword sequence

Funding

  1. Science and Technology Development Fund of Macau [0047/2020/A1]
  2. China Postdoctoral Science Foundation [2017M621247]


The paper introduces a novel code summarization model named Dynamic Graph attention-based Transformer (DG-Trans), which addresses the limited structural information and Out-Of-Vocabulary problems of existing methods. Extensive experiments demonstrate that DG-Trans outperforms state-of-the-art models, raising average BLEU and ROUGE-L scores by 8.39% and 8.86%, respectively.
Automatic code summarization, which aims to automatically generate natural-language descriptions of source code, is an important topic in software engineering. Most existing methods apply Graph Neural Networks (GNNs) to the Abstract Syntax Tree (AST) of the code to achieve summarization. However, these methods face two major challenges: 1) they can capture only limited structural information of the source code; and 2) they do not effectively mitigate the Out-Of-Vocabulary (OOV) problem by reducing vocabulary size. To address these problems, in this paper we propose a novel code summarization model named Dynamic Graph attention-based Transformer (DG-Trans for short), which captures rich information from the code's subword sequence and fuses a dynamic graph attention mechanism with the Transformer architecture. Extensive experiments show that DG-Trans outperforms state-of-the-art models (such as Ast-Attendgru, Transformer, and CodeGNN), improving BLEU and ROUGE-L scores by 8.39% and 8.86% on average, respectively.
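The core building block the abstract describes, attention over an AST, can be illustrated with a minimal graph-attention layer whose attention weights are computed only along graph edges. This NumPy sketch is illustrative only and is not the paper's actual DG-Trans layer; the function name, shapes, and the LeakyReLU slope are assumptions, following the standard graph-attention formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(H, A, W, a):
    """One graph-attention layer restricted to AST edges (illustrative sketch).

    H: (n, d) node features; A: (n, n) adjacency matrix (1 = edge, incl. self-loops);
    W: (d, d_out) projection matrix; a: (2*d_out,) attention parameter vector.
    Returns (n, d_out) aggregated node features.
    """
    Z = H @ W                           # project node features
    n = Z.shape[0]
    logits = np.zeros((n, n))
    for i in range(n):                  # e_ij = LeakyReLU(a^T [z_i || z_j])
        for j in range(n):
            e = np.concatenate([Z[i], Z[j]]) @ a
            logits[i, j] = e if e > 0 else 0.2 * e
    # mask out non-neighbours so attention follows the graph structure
    logits = np.where(A > 0, logits, -1e9)
    alpha = softmax(logits, axis=1)     # per-node attention over its neighbours
    return alpha @ Z                    # aggregate neighbour features

# Toy 3-node "AST": node 1 is the parent of nodes 0 and 2 (with self-loops).
H = np.eye(3)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
out = graph_attention(H, A, np.eye(3), np.ones(6))
```

Because the logits are masked with the adjacency matrix before the softmax, each node attends only to its AST neighbours; a "dynamic" variant as in the paper would additionally recompute edge weights per layer.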


