Paper Title

Attributed Abnormality Graph Embedding for Clinically Accurate X-Ray Report Generation

Paper Authors

Sixing Yan, William K. Cheung, Keith Chiu, Terence M. Tong, Charles K. Cheung, Simon See

Paper Abstract

Automatic generation of medical reports from X-ray images can assist radiologists in performing the time-consuming yet important reporting task. However, achieving clinically accurate generated reports remains challenging. Modeling the underlying abnormalities using a knowledge graph approach has been found promising for enhancing clinical accuracy. In this paper, we introduce a novel fine-grained knowledge graph structure called an attributed abnormality graph (ATAG). The ATAG consists of interconnected abnormality nodes and attribute nodes, allowing it to better capture abnormality details. In contrast to existing methods in which the abnormality graph is constructed manually, we propose a methodology to automatically construct the fine-grained graph structure based on annotations and medical reports in X-ray datasets, together with the RadLex radiology lexicon. We then learn the ATAG embedding using a deep model with an encoder-decoder architecture for report generation. In particular, graph attention networks are explored to encode the relationships among the abnormalities and their attributes. A gating mechanism is adopted and integrated with various decoders for the generation. We carry out extensive experiments on the benchmark datasets and show that the proposed ATAG-based deep model outperforms the SOTA methods by a large margin and can improve the clinical accuracy of the generated reports.

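To make the abstract's encoder description more concrete, below is a minimal, illustrative sketch (not the authors' released implementation) of how abnormality and attribute nodes could be encoded with a single-head graph attention layer and then fused with an image feature through a gating mechanism before decoding. The class names, dimensions, and toy adjacency matrix are assumptions made purely for this example.

```python
# Minimal sketch, assuming PyTorch; shapes and the toy graph are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGraphAttentionLayer(nn.Module):
    """Single-head graph attention over a fixed adjacency mask."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, node_feats, adj):
        # node_feats: (N, in_dim); adj: (N, N) binary adjacency with self-loops
        h = self.proj(node_feats)                                  # (N, out_dim)
        n = h.size(0)
        h_i = h.unsqueeze(1).expand(n, n, -1)                      # (N, N, out_dim)
        h_j = h.unsqueeze(0).expand(n, n, -1)                      # (N, N, out_dim)
        e = F.leaky_relu(self.attn(torch.cat([h_i, h_j], dim=-1)), 0.2).squeeze(-1)
        e = e.masked_fill(adj == 0, float("-inf"))                 # attend only along edges
        alpha = torch.softmax(e, dim=-1)                           # per-node attention weights
        return F.relu(alpha @ h)                                   # aggregated node embeddings


class GatedGraphEncoder(nn.Module):
    """Encodes an attributed abnormality graph and gates it against a visual feature."""

    def __init__(self, node_dim, hidden_dim):
        super().__init__()
        self.gat = SimpleGraphAttentionLayer(node_dim, hidden_dim)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, node_feats, adj, visual_feat):
        graph_emb = self.gat(node_feats, adj).mean(dim=0)          # pooled graph embedding
        g = torch.sigmoid(self.gate(torch.cat([graph_emb, visual_feat])))
        return g * graph_emb + (1 - g) * visual_feat               # gated context for a decoder


if __name__ == "__main__":
    # Toy graph: 2 abnormality nodes each linked to 1 attribute node, self-loops included.
    adj = torch.tensor([[1., 0., 1., 0.],
                        [0., 1., 0., 1.],
                        [1., 0., 1., 0.],
                        [0., 1., 0., 1.]])
    node_feats = torch.randn(4, 64)       # e.g. learned node embeddings
    visual_feat = torch.randn(128)        # e.g. pooled CNN feature of the X-ray image
    fused = GatedGraphEncoder(64, 128)(node_feats, adj, visual_feat)
    print(fused.shape)                    # torch.Size([128])
```

The gated fusion at the end mirrors the idea of letting the decoder weigh graph-derived abnormality context against visual evidence; the actual model in the paper integrates such a gate with several decoder variants.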