Paper Title

Understanding Hyperdimensional Computing for Parallel Single-Pass Learning

Paper Authors

Tao Yu, Yichi Zhang, Zhiru Zhang, Christopher De Sa

Paper Abstract

Hyperdimensional computing (HDC) is an emerging learning paradigm that computes with high dimensional binary vectors. It is attractive because of its energy efficiency and low latency, especially on emerging hardware -- but HDC suffers from low model accuracy, with little theoretical understanding of what limits its performance. We propose a new theoretical analysis of the limits of HDC via a consideration of what similarity matrices can be "expressed" by binary vectors, and we show how the limits of HDC can be approached using random Fourier features (RFF). We extend our analysis to the more general class of vector symbolic architectures (VSA), which compute with high-dimensional vectors (hypervectors) that are not necessarily binary. We propose a new class of VSAs, finite group VSAs, which surpass the limits of HDC. Using representation theory, we characterize which similarity matrices can be "expressed" by finite group VSA hypervectors, and we show how these VSAs can be constructed. Experimental results show that our RFF method and group VSA can both outperform the state-of-the-art HDC model by up to 7.6\% while maintaining hardware efficiency.
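As a rough illustration of the RFF idea the abstract refers to, the sketch below is not the authors' implementation but a minimal, self-contained example: real-valued random Fourier features whose inner products approximate a Gaussian kernel, plus a sign-quantized variant that yields ±1 hypervectors in the style of binary HDC encodings. The dimensions, bandwidth, and test points are illustrative assumptions, and the binarized similarity approximates a distorted (arccosine-type) kernel rather than the Gaussian itself.

```python
# Minimal sketch (assumed setup, not the paper's code): random Fourier features
# approximating a Gaussian kernel, and a sign-quantized binary-hypervector variant.
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 16, 10_000, 1.0                    # input dim, hypervector dim, kernel bandwidth (illustrative)

W = rng.normal(0.0, 1.0 / sigma, size=(D, d))    # random projection directions
b = rng.uniform(0.0, 2.0 * np.pi, size=D)        # random phases

def rff(x):
    """Real-valued RFF embedding: rff(x) @ rff(y) ~ exp(-||x-y||^2 / (2 sigma^2))."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

def binary_hv(x):
    """Sign-quantized RFF, giving a +/-1 hypervector as in binary HDC-style encodings."""
    return np.sign(np.cos(W @ x + b))

x, y = rng.normal(size=d), rng.normal(size=d)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))
print("Gaussian kernel:       ", exact)
print("RFF estimate:          ", rff(x) @ rff(y))
print("binary HV similarity:  ", (binary_hv(x) @ binary_hv(y)) / D)  # related but distorted kernel
```

Increasing D tightens the concentration of both estimates around their target kernels, which is the sense in which high-dimensional (hyper)vectors can "express" a given similarity matrix.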
