Paper Title

SADAM: Stochastic Adam, A Stochastic Operator for First-Order Gradient-based Optimizer

Authors

Wei Zhang, Yu Bao

Abstract

In this work, to help efficiently escape stationary and saddle points, we propose, analyze, and generalize a stochastic strategy, applied as an operator to first-order gradient descent algorithms, that increases target accuracy and reduces time consumption. Unlike existing algorithms, the proposed stochastic strategy does not require any batching or sampling techniques, enabling efficient implementation and maintaining the underlying first-order optimizer's convergence rate, while providing a substantial improvement in target accuracy when optimizing the target functions. In short, the proposed strategy is generalized, applied to Adam, and validated on biomedical signal decomposition using Deep Matrix Fitting, against four peer optimizers. The validation results show that the proposed stochastic strategy can be easily generalized to first-order optimizers and efficiently improves target accuracy.
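The abstract does not spell out the operator's form, but its core idea — a batch-free stochastic operator composed with a first-order optimizer's update to escape stationary and saddle points — can be sketched as below. This is a minimal illustrative sketch, not the paper's method: the "perturb the iterate when the gradient nearly vanishes" rule, and the names `stochastic_operator`, `sigma`, and `tol`, are assumptions introduced here for illustration.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One standard Adam update (Kingma & Ba, 2015)."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)          # bias-corrected first moment
    v_hat = v / (1 - b2**t)          # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

def stochastic_operator(theta, grad, sigma=1e-2, tol=1e-6):
    """Hypothetical stochastic operator (an assumption, not the paper's
    exact rule): perturb the iterate when the gradient is nearly zero,
    i.e. near a stationary or saddle point. No batching or sampling of
    the data is involved, only a perturbation of the current iterate."""
    if np.linalg.norm(grad) < tol:
        theta = theta + sigma * np.random.randn(*theta.shape)
    return theta

# Usage sketch: minimize f(x, y) = x^2 - y^2, which has a saddle at the
# origin. Plain Adam started exactly at the saddle never moves, since the
# gradient there is zero; the stochastic operator kicks the iterate off it.
f_grad = lambda th: np.array([2.0 * th[0], -2.0 * th[1]])
theta = np.zeros(2)                  # start exactly at the saddle point
m, v = np.zeros(2), np.zeros(2)
for t in range(1, 201):
    theta = stochastic_operator(theta, f_grad(theta))   # escape if stuck
    theta, m, v = adam_step(theta, f_grad(theta), m, v, t)
```

Because the operator only acts on the iterate itself, it composes with any first-order update rule (SGD, momentum, RMSProp, Adam) without altering that optimizer's per-step cost, which is consistent with the abstract's claim that the base convergence rate is maintained.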
