Paper Title
Incorporating Gradient Similarity for Robust Time Delay Estimation in Ultrasound Elastography
Paper Authors
Paper Abstract
Energy-based ultrasound elastography techniques minimize a regularized cost function consisting of data and continuity terms to obtain local displacement estimates based on the local time-delay estimation (TDE) between radio-frequency (RF) frames. The data term in existing techniques accounts only for amplitude similarity and is therefore not sufficiently robust to outlier samples in the RF frames under consideration. This drawback creates noticeable artifacts in the strain image. To resolve this issue, we propose to formulate the data function as a linear combination of amplitude and gradient similarity constraints. We estimate an adaptive weight for each similarity term via an iterative scheme. Finally, we optimize the non-linear cost function efficiently by converting the problem to a sparse system of linear equations, which is solved for millions of variables. We call our technique rGLUE: robust data term in GLobal Ultrasound Elastography. rGLUE has been validated on simulation, phantom, in vivo liver, and breast datasets. In all of our experiments, rGLUE substantially outperforms recent elastography methods both visually and quantitatively. For the simulated, phantom, and in vivo datasets, respectively, rGLUE improves signal-to-noise ratio (SNR) by 107%, 18%, and 23% and contrast-to-noise ratio (CNR) by 61%, 19%, and 25% over GLUE, a recently published elastography algorithm.
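Since no implementation accompanies the abstract, the sketch below is a minimal 1-D toy of the described pipeline only: a data term combining amplitude and gradient residuals with adaptive per-sample weights, linearized at each iteration into a sparse system of linear equations (as in GLUE-style solvers). The toy signals, the Cauchy-style reweighting rule, and the regularization weight `alpha` are illustrative assumptions, not the authors' rGLUE formulation.

```python
# Minimal 1-D sketch of a GLUE-style regularized TDE with a combined
# amplitude + gradient data term and adaptive weights, in the spirit of
# rGLUE. Toy data, the reweighting rule, and all parameter values are
# illustrative assumptions, not the authors' implementation.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8 * np.pi, n)
x = np.arange(n, dtype=float)
true_d = 2.0 * np.sin(t / 4)                 # ground-truth axial shift (samples)

I1 = np.sin(t) + 0.05 * rng.standard_normal(n)                 # pre-deformation RF line
I2 = np.interp(x + true_d, x, np.sin(t)) + 0.05 * rng.standard_normal(n)
I2[::200] += 3.0                                               # inject outlier samples

g1 = np.gradient(I1)                   # gradient of the pre-frame
g2 = np.gradient(I2)                   # gradient of the post-frame
h2 = np.gradient(g2)                   # second derivative (gradient-term Jacobian)

d = np.zeros(n)                        # displacement estimate
alpha = 20.0                           # continuity (regularization) weight
L = sparse.diags([-1.0, 1.0], [0, 1], shape=(n - 1, n))        # first differences

for _ in range(10):                    # Gauss-Newton / IRLS iterations
    r_amp = np.interp(x + d, x, I2) - I1       # amplitude residual
    r_grd = np.interp(x + d, x, g2) - g1       # gradient residual
    J_a = np.interp(x + d, x, g2)              # Jacobian of amplitude term
    J_g = np.interp(x + d, x, h2)              # Jacobian of gradient term
    # Adaptive per-sample weights: large residuals (outliers) are down-weighted.
    w_a = 1.0 / (1.0 + (r_amp / (np.std(r_amp) + 1e-12)) ** 2)
    w_g = 1.0 / (1.0 + (r_grd / (np.std(r_grd) + 1e-12)) ** 2)
    # Sparse normal equations for the update delta:
    # (diag(w_a J_a^2 + w_g J_g^2) + alpha L^T L) delta = -(data gradient) - alpha L^T L d
    A = sparse.diags(w_a * J_a**2 + w_g * J_g**2) + alpha * (L.T @ L)
    rhs = -(w_a * J_a * r_amp + w_g * J_g * r_grd) - alpha * (L.T @ (L @ d))
    d = d + spsolve(A.tocsc(), rhs)            # one sparse linear solve per iteration

print("displacement RMSE (samples):", np.sqrt(np.mean((d - true_d) ** 2)))
```

The reweighting step is the point of the sketch: samples whose amplitude or gradient residual is large relative to its spread contribute little to the normal equations, so isolated outliers cannot pull the displacement field, while the continuity term `alpha * L.T @ L` keeps the estimate smooth, mirroring the robustness-to-outliers argument of the abstract.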