Paper Title
Level Constrained First Order Methods for Function Constrained Optimization
Paper Authors
Paper Abstract
We present a new feasible proximal gradient method for constrained optimization in which both the objective and the constraint functions are given by the sum of a smooth, possibly nonconvex function and a convex simple function. The algorithm converts the original problem into a sequence of convex subproblems. Formulating each subproblem requires the evaluation of at most one gradient value of the original objective and constraint functions, and either exact or approximate subproblem solutions can be computed efficiently in many cases. An important feature of the algorithm is the constraint level parameter: by carefully increasing this level for each subproblem, we provide a simple way to overcome the challenge of bounding the Lagrangian multipliers and show that the algorithm follows a strictly feasible solution path until it converges to a stationary point. We develop a simple, proximal-gradient-descent-type analysis showing that the complexity bound of this new algorithm is comparable to that of gradient descent in the unconstrained setting, which is new in the literature. Exploiting this design and analysis technique, we extend the algorithm to more challenging constrained optimization problems in which 1) the objective is a stochastic or finite-sum function, and 2) structured nonsmooth functions replace the smooth components of both the objective and constraint functions. Complexity results for these problems also appear to be new in the literature. Finally, our method can be applied to convex function-constrained problems, for which we show complexities similar to those of the proximal gradient method.
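The abstract describes the mechanism (one gradient of the objective and constraint per iteration, a convex proximal subproblem, and a constraint level that rises toward zero) but not the exact update. Below is a minimal sketch, in Python, of how such a level-constrained proximal gradient loop could look. The toy problem, the proximal parameter beta, the level schedule eta_k, and the use of SciPy's SLSQP solver for the convex subproblem are all illustrative assumptions, not the paper's specification.

```python
# A minimal sketch of the level-constrained proximal gradient idea described
# in the abstract. The problem instance, the level schedule eta_k, and the
# choice of SLSQP as subproblem solver are illustrative assumptions only.
import numpy as np
from scipy.optimize import minimize

beta = 10.0      # proximal parameter; assumed to dominate the gradients'
                 # Lipschitz constants so the quadratic model majorizes g
max_iter = 50

def f(x):        # smooth objective component (toy example)
    return 0.5 * np.sum((x - 2.0) ** 2)

def grad_f(x):
    return x - 2.0

def g(x):        # smooth constraint component, enforced as g(x) <= 0 (toy)
    return np.sum(x ** 2) - 1.0

def grad_g(x):
    return 2.0 * x

def subproblem(xk, eta_k):
    """Solve the convex subproblem built from one gradient of f and of g:

        min_x  <grad_f(xk), x - xk> + (beta/2) ||x - xk||^2
        s.t.   g(xk) + <grad_g(xk), x - xk> + (beta/2) ||x - xk||^2 <= eta_k
    """
    gf, gg = grad_f(xk), grad_g(xk)

    def obj(x):
        d = x - xk
        return gf @ d + 0.5 * beta * (d @ d)

    def con(x):  # SLSQP expects the constraint value to be >= 0
        d = x - xk
        return eta_k - (g(xk) + gg @ d + 0.5 * beta * (d @ d))

    res = minimize(obj, xk, constraints=[{"type": "ineq", "fun": con}],
                   method="SLSQP")
    return res.x

x = np.zeros(3)                 # strictly feasible start: g(x) = -1 < 0
for k in range(max_iter):
    eta_k = -1.0 / (k + 2)      # hypothetical level schedule rising to 0
    x_new = subproblem(x, eta_k)
    if np.linalg.norm(x_new - x) < 1e-8:
        break
    x = x_new

print("x =", x, " g(x) =", g(x))
```

The role of the increasing level is visible in this sketch: because beta is large enough that the quadratic constraint model upper-bounds g, each subproblem solution satisfies g(x_{k+1}) <= eta_k < 0, and since eta_{k+1} >= eta_k, the current iterate remains feasible for the next subproblem. The iterates therefore trace a strictly feasible path, matching the behavior the abstract attributes to the level parameter.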