Paper Title

Reusing Preconditioners in Projection based Model Order Reduction Algorithms

Paper Authors

Singh, Navneet Pratap; Ahuja, Kapil

Paper Abstract

Dynamical systems are pervasive in almost all engineering and scientific applications. Simulating such systems is computationally very intensive. Hence, Model Order Reduction (MOR) is used to reduce them to a lower dimension. Most MOR algorithms require solving sequences of large, sparse linear systems. Since direct methods for solving such systems do not scale well in time with respect to the input dimension, efficient preconditioned iterative methods are commonly used. In one of our previous works, we showed substantial improvements by reusing preconditioners for parametric MOR (Singh et al., 2019). There, we had proposed techniques for both the non-parametric and the parametric cases, but had applied them only to the latter. We make four main contributions here. First, we demonstrate that preconditioners can be reused more effectively in the non-parametric case than in the parametric one because of the lack of parameters in the former. Second, we show that reusing preconditioners is an art that needs to be fine-tuned for the underlying MOR algorithm. Third, we describe the pitfalls in the algorithmic implementation of reusing preconditioners. Fourth, and finally, we demonstrate this theory on a real-life industrial problem (of size 1.2 million), where savings of up to 64% in the total computation time are obtained by reusing preconditioners. In absolute terms, this is a saving of 5 days.
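To make the "sequence of linear systems" concrete: moment-matching MOR methods repeatedly solve shifted systems of the form (s_i E − A) x = b at a set of expansion points s_i, and the reuse idea is to pay the preconditioner setup cost once and apply the same preconditioner across later shifts. The following is a minimal sketch of that idea using SciPy's spilu and gmres, not the authors' algorithm from the paper; the matrix, shifts, and tolerances are all illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Hypothetical stand-in for a large sparse dynamical system
# E x'(t) = A x(t) + b u(t); real problems reach millions of unknowns.
n = 2000
A = sp.diags([-2.0, 1.0, 1.0], [0, -1, 1], shape=(n, n), format="csc")
E = sp.identity(n, format="csc")
b = np.ones(n)

# Assumed expansion points s_i at which the MOR algorithm solves
# the shifted systems (s_i * E - A) x = b.
shifts = [1.0, 1.5, 2.0, 2.5]

# Factor an incomplete-LU preconditioner ONCE, at the first shift...
ilu = spla.spilu((shifts[0] * E - A).tocsc(), drop_tol=1e-4)
prec = spla.LinearOperator((n, n), matvec=ilu.solve)

# ...and reuse it for every later system in the sequence, so the
# expensive factorization cost is not paid again per shift.
for s in shifts:
    K = (s * E - A).tocsc()
    x, info = spla.gmres(K, b, M=prec)
    print(f"shift {s}: {'converged' if info == 0 else f'info={info}'}")
```

A stale preconditioner works here because neighboring shifted matrices differ only mildly, so the first factorization still clusters the spectrum for later solves; the paper's contribution is about when and how such reuse pays off inside full MOR algorithms.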
