Paper title
A supervised learning approach involving active subspaces for an efficient genetic algorithm in high-dimensional optimization problems
Paper authors
Paper abstract
In this work, we present an extension of the genetic algorithm (GA) which exploits the supervised learning technique called active subspaces (AS) to evolve the individuals in a lower dimensional space. In many cases, GA in fact requires more function evaluations than other optimization methods to converge to the global optimum. Thus, complex and high-dimensional functions may prove extremely demanding (from a computational viewpoint) to optimize with the standard algorithm. To address this issue, we propose to linearly map the input parameter space of the original function onto its AS before the evolution, performing the mutation and mate processes in a lower dimensional space. In this contribution, we describe the novel method called ASGA, presenting differences and similarities with the standard GA method. We test the proposed method over n-dimensional benchmark functions -- Rosenbrock, Ackley, Bohachevsky, Rastrigin, Schaffer N. 7, and Zakharov -- and finally we apply it to an aeronautical shape optimization problem.
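To make the mechanism described in the abstract concrete, the following is a minimal, self-contained sketch of the general idea (not the authors' ASGA implementation): an active subspace is estimated from gradient samples, and crossover and mutation act on the reduced coordinates before offspring are mapped back to the full space. The objective (a toy ridge function), the helper names (estimate_active_subspace, evolve_in_subspace), and all parameter values are illustrative assumptions; in particular, the naive linear back-mapping used here is only a placeholder for how ASGA returns individuals to the original parameter space.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_pop, k = 20, 50, 1

# Hypothetical ridge objective: its active subspace is the single direction a.
a = rng.standard_normal(dim)


def ridge(x):
    return (x @ a - 1.0) ** 2


def ridge_grad(x):
    return 2.0 * (x @ a - 1.0)[..., None] * a


def estimate_active_subspace(grads, k):
    """Eigendecompose the uncentered covariance of sampled gradients and
    return the leading k eigenvectors (columns) spanning the active subspace."""
    C = grads.T @ grads / len(grads)
    eigvec = np.linalg.eigh(C)[1]          # eigenvalues come in ascending order
    return eigvec[:, ::-1][:, :k]


def evolve_in_subspace(pop, W, sigma=0.1):
    """One generation: project individuals onto the active subspace, apply a
    blend crossover and Gaussian mutation there, then map offspring back."""
    reduced = pop @ W
    partners = reduced[rng.permutation(len(reduced))]
    alpha = rng.uniform(size=reduced.shape)
    children = alpha * reduced + (1.0 - alpha) * partners     # blend crossover
    children += sigma * rng.standard_normal(children.shape)   # Gaussian mutation
    return children @ W.T                                     # naive back-mapping


pop = rng.uniform(-1.0, 1.0, size=(n_pop, dim))
for generation in range(30):
    # Gradients are sampled at the current population (an assumption of this sketch).
    W = estimate_active_subspace(ridge_grad(pop), k)
    offspring = evolve_in_subspace(pop, W)
    merged = np.vstack([pop, offspring])
    pop = merged[np.argsort(ridge(merged))][:n_pop]           # elitist selection
print("best objective value:", ridge(pop[0]))
```

Since the genetic operators here work on k coordinates instead of the full dim-dimensional vectors, each generation explores a much smaller search space, which is the source of the efficiency gain claimed for ASGA over the standard GA.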