Paper Title

Predicting Issue Types with seBERT

Authors

Trautsch, Alexander, Herbold, Steffen

Abstract

Pre-trained transformer models are the current state-of-the-art for natural language processing. seBERT is such a model that was developed based on the BERT architecture, but trained from scratch with software engineering data. We fine-tuned this model for the NLBSE challenge for the task of issue type prediction. Our model dominates the baseline fastText for all three issue types in both recall and precision to achieve an overall F1-score of 85.7%, which is an increase of 4.1% over the baseline.
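The "overall F1-score" reported for a multi-class task like this is typically an average of the per-class F1 scores, each derived from that class's precision and recall. A minimal sketch of that computation, using hypothetical toy labels for the three issue types (not the paper's actual data or results):

```python
def precision_recall_f1(y_true, y_pred, label):
    """Compute one-vs-rest precision, recall, and F1 for a single class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == label)
    pred_pos = sum(1 for p in y_pred if p == label)    # predicted as this class
    actual_pos = sum(1 for t in y_true if t == label)  # truly this class
    precision = tp / pred_pos if pred_pos else 0.0
    recall = tp / actual_pos if actual_pos else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical example labels (illustration only, not the paper's data).
y_true = ["bug", "bug", "feature", "question", "feature", "bug"]
y_pred = ["bug", "feature", "feature", "question", "feature", "bug"]

labels = ["bug", "feature", "question"]
f1s = [precision_recall_f1(y_true, y_pred, label)[2] for label in labels]
macro_f1 = sum(f1s) / len(f1s)  # unweighted (macro) average over the classes
```

This macro average weights each issue type equally; a weighted average by class frequency is another common convention, so which one a paper reports matters when classes are imbalanced.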
