Paper Title

Representation Learning via Cauchy Convolutional Sparse Coding

Authors

Mayo, Perla, Karakuş, Oktay, Holmes, Robin, Achim, Alin

Abstract

In representation learning, Convolutional Sparse Coding (CSC) enables unsupervised learning of features by jointly optimising both an \(\ell_2\)-norm fidelity term and a sparsity enforcing penalty. This work investigates using a regularisation term derived from an assumed Cauchy prior for the coefficients of the feature maps of a CSC generative model. The sparsity penalty term resulting from this prior is solved via its proximal operator, which is then applied iteratively, element-wise, on the coefficients of the feature maps to optimise the CSC cost function. The performance of the proposed Iterative Cauchy Thresholding (ICT) algorithm in reconstructing natural images is compared against the common choice of \(\ell_1\)-norm optimised via soft and hard thresholding. ICT outperforms IHT and IST in most of these reconstruction experiments across various datasets, with an average PSNR of up to 11.30 and 7.04 above ISTA and IHT respectively.
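The abstract describes solving the Cauchy sparsity penalty via its proximal operator, applied element-wise to the feature-map coefficients. A minimal sketch of what such a Cauchy proximal operator could look like is given below; it assumes the penalty takes the form \(\mu \log(\gamma^2 + u^2)\), so each scalar subproblem reduces to finding the real roots of a cubic. The function name `cauchy_prox` and the root-selection strategy are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def cauchy_prox(x, gamma, mu):
    """Element-wise proximal operator of the Cauchy penalty (illustrative sketch).

    For each scalar xi, minimises
        0.5 * (u - xi)**2 + mu * log(gamma**2 + u**2),
    whose stationary points satisfy the cubic
        u**3 - xi*u**2 + (gamma**2 + 2*mu)*u - gamma**2*xi = 0.
    """
    def scalar_prox(xi):
        # Real roots of the cubic stationarity condition.
        roots = np.roots([1.0, -xi, gamma**2 + 2.0 * mu, -gamma**2 * xi])
        real = roots[np.abs(roots.imag) < 1e-8].real
        # Among the real roots, keep the one with the lowest objective value.
        obj = 0.5 * (real - xi) ** 2 + mu * np.log(gamma**2 + real**2)
        return real[np.argmin(obj)]

    return np.vectorize(scalar_prox)(np.asarray(x, dtype=float))
```

In an ICT-style scheme, this operator would replace the soft-thresholding step of ISTA: after each gradient step on the \(\ell_2\) fidelity term, `cauchy_prox` is applied element-wise to the feature-map coefficients, shrinking them toward zero while leaving zero fixed.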
