A question title serves to concisely and readably describe a problem encountered in code. Previous studies often used end-to-end sequence-to-sequence models to generate question titles from source code. However, long-term dependencies are often difficult to capture, which may result in an incomplete representation of the source code. To address this issue, we propose a Transformer for Generating Code Title (hereinafter referred to as TGCT) model. Specifically, the TGCT model uses a positional encoding mechanism that models pairwise relationships between source tokens by applying relative position representations. Multiple self-attention components are also used to capture long-term dependencies in the code. Comprehensive experiments are conducted on datasets from five programming languages, namely Python, Java, JavaScript, C#, and SQL, and the results show that TGCT generally outperforms state-of-the-art models on the BLEU and ROUGE metrics. In addition, a cross-sectional comparison experiment was conducted to verify the effects of different model parameters, dataset sizes, the positional encoding mechanism, and the self-attention mechanism on model results. [ABSTRACT FROM AUTHOR]
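The abstract does not give implementation details, but the core idea it names, self-attention whose scores incorporate relative position representations between token pairs, can be sketched minimally. The function and parameter names below are hypothetical, and the formulation follows the common "clipped relative distance" variant of relative position self-attention rather than the paper's exact architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_self_attention(x, wq, wk, wv, rel_k, max_dist):
    """Single-head self-attention with relative position representations.

    x:       (seq_len, d_model) token embeddings (e.g., code tokens)
    wq/wk/wv:(d_model, d_head) query/key/value projections
    rel_k:   (2*max_dist+1, d_head) embeddings for clipped relative
             distances in [-max_dist, max_dist]
    """
    n = x.shape[0]
    q, k, v = x @ wq, x @ wk, x @ wv              # (n, d_head) each
    d_head = q.shape[-1]
    # Clipped relative distance j - i for every query/key pair,
    # shifted into [0, 2*max_dist] to index rel_k.
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_dist, max_dist) + max_dist  # (n, n)
    a_k = rel_k[idx]                               # (n, n, d_head)
    # Content-content score plus content-position score per pair.
    scores = (q @ k.T + np.einsum('id,ijd->ij', q, a_k)) / np.sqrt(d_head)
    return softmax(scores) @ v                     # (n, d_head)

# Toy usage on random data standing in for embedded code tokens.
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))
wq, wk, wv = (rng.standard_normal((8, 4)) for _ in range(3))
rel_k = rng.standard_normal((2 * 2 + 1, 4))
out = relative_self_attention(x, wq, wk, wv, rel_k, max_dist=2)
print(out.shape)  # (5, 4)
```

Because the pairwise term `a_k` depends only on the (clipped) offset between positions, every token attends over the whole sequence in one step, which is what lets such models capture long-range dependencies better than a recurrent encoder.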
Copyright of Software: Practice & Experience is the property of Wiley-Blackwell and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)