Paper Title

Realistic Image Normalization for Multi-Domain Segmentation

Authors

Pierre-Luc Delisle, Benoit Anctil-Robitaille, Christian Desrosiers, Herve Lombaert

Abstract

Image normalization is a building block in medical image analysis. Conventional approaches are customarily applied on a per-dataset basis. This strategy, however, prevents current normalization algorithms from fully exploiting the complex joint information available across multiple datasets. Consequently, ignoring such joint information has a direct impact on the performance of segmentation algorithms. This paper proposes to revisit the conventional image normalization approach by instead learning a common normalizing function across multiple datasets. Jointly normalizing multiple datasets is shown to yield consistent normalized images as well as improved image segmentation. To do so, a fully automated adversarial and task-driven normalization approach is employed, as it facilitates the training of realistic and interpretable images while keeping performance on par with the state of the art. The adversarial training of our network aims at finding the optimal transfer function to improve both the segmentation accuracy and the generation of realistic images. We evaluated the performance of our normalizer on both infant and adult brain images from the iSEG, MRBrainS and ABIDE datasets. Results reveal the potential of our normalization approach for segmentation, with Dice improvements of up to 57.5% over our baseline. Our method can also enhance data availability by increasing the number of samples available when learning from multiple imaging domains.
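The abstract describes a task-driven objective: the normalizer is trained both to fool a discriminator (realism) and to support segmentation (accuracy, measured by Dice). The sketch below is our own illustrative formulation, not the authors' code; `dice_score`, `combined_loss` and the trade-off weight `lam` are hypothetical names, and the actual paper uses full network training rather than this toy arithmetic.

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    """Dice overlap between two binary masks -- the segmentation
    metric reported in the abstract (higher is better, max 1.0)."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def combined_loss(seg_loss, adv_loss, lam=0.1):
    """Illustrative task-driven objective: minimize the segmentation
    loss while also minimizing an adversarial realism term.
    `lam` is a hypothetical trade-off weight, not from the paper."""
    return seg_loss + lam * adv_loss

# Toy check: two identical 2x2 masks give a Dice score of 1.0.
a = np.array([[1, 0], [1, 1]])
print(round(dice_score(a, a), 4))  # → 1.0
```

Under this formulation, jointly normalizing several datasets simply means the same normalizer parameters are updated against `combined_loss` terms computed on samples drawn from all domains, rather than fitting one normalizer per dataset.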
