Paper Title

Dense Hebbian neural networks: a replica symmetric picture of unsupervised learning

Authors

Elena Agliari, Linda Albanese, Francesco Alemanno, Andrea Alessandrelli, Adriano Barra, Fosca Giannotti, Daniele Lotito, Dino Pedreschi

Abstract

We consider dense, associative neural networks trained with no supervision and we investigate their computational capabilities analytically, via a statistical-mechanics approach, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset and the network storage, valid in the limit of large network size and structureless datasets. Moreover, we establish a bridge between the macroscopic observables standardly used in statistical mechanics and the loss functions typically used in machine learning. As technical remarks, on the analytic side, we implement large-deviation and stability analyses within Guerra's interpolation to tackle the non-Gaussian distributions involved in the post-synaptic potentials while, on the computational side, we insert the Plefka approximation into the Monte Carlo scheme to speed up the evaluation of the synaptic tensors, overall obtaining a novel and broad approach for investigating neural networks in general.
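
For a concrete feel of the setup described above, below is a minimal Monte Carlo sketch of unsupervised Hebbian learning in a dense associative network. Everything here is an illustrative assumption: the interaction degree p = 4, the sizes N, K, M, the dataset quality r, and the inverse temperature beta are not taken from the paper, and the field evaluation uses the exact factorized (Mattis-overlap) form of the dense Hebbian Hamiltonian rather than the paper's Plefka-accelerated scheme.

```python
# Minimal, illustrative sketch (NOT the paper's exact model or its
# Plefka-accelerated scheme): unsupervised Hebbian storage in a dense
# p = 4 associative network, sampled with Glauber dynamics.
import numpy as np

rng = np.random.default_rng(0)

N, K, M = 200, 3, 20   # units, archetypes, noisy examples per archetype
r = 0.8                # dataset quality: P(chi = +1) = (1 + r) / 2
p = 4                  # interaction degree of the dense network
beta = 2.0             # inverse temperature of the Glauber dynamics

# Archetypes xi^mu and noisy examples eta^{mu,a}_i = chi^{mu,a}_i * xi^mu_i
xi = rng.choice([-1, 1], size=(K, N))
chi = np.where(rng.random((K, M, N)) < (1 + r) / 2, 1, -1)
eta = (chi * xi[:, None, :]).reshape(K * M, N)  # flatten (mu, a) pairs

# The dense Hebbian Hamiltonian factorizes over Mattis overlaps,
#   H(sigma) = -(N / p) * sum_{mu,a} m_{mu,a}^p,  m_{mu,a} = eta . sigma / N,
# so the full p-index synaptic tensor is never built; the local field is
#   h_i = sum_{mu,a} m_{mu,a}^(p-1) * eta^{mu,a}_i
# (the O(1/N) self-interaction of unit i through m is ignored).
sigma = xi[0] * rng.choice([-1, 1], size=N, p=[0.3, 0.7])  # corrupted archetype 0
m = eta @ sigma / N                                        # overlaps with all examples

for sweep in range(50):
    for i in rng.permutation(N):
        h = (m ** (p - 1)) @ eta[:, i]                     # local field at unit i
        s_new = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1
        if s_new != sigma[i]:
            m += 2 * s_new * eta[:, i] / N                 # incremental overlap update
            sigma[i] = s_new

print("retrieval overlap with archetype 0:", xi[0] @ sigma / N)
```

With an informative enough dataset (large M and r), the printed overlap should relax toward 1, mirroring the retrieval region of the phase diagram. As a toy instance of the observable/loss bridge mentioned in the abstract: for ±1 configurations the per-pattern quadratic loss (1/4N)·||xi − sigma||² equals (1 − m)/2, an affine function of the Mattis magnetization m; the paper's bridge is in this spirit, though its exact form may differ.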
