Paper title
Genetic-algorithm-optimized neural networks for gravitational wave classification
Paper authors
Paper abstract
Gravitational-wave detection strategies are based on a signal analysis technique known as matched filtering. Despite the success of matched filtering, its computational cost has motivated recent interest in developing deep convolutional neural networks (CNNs) for signal detection. Designing these networks remains a challenge, as most procedures adopt a trial-and-error strategy to set the hyperparameter values. We propose a new method for hyperparameter optimization based on genetic algorithms (GAs). We compare six different GA variants and explore different choices for the GA-optimized fitness score. We show that the GA can discover high-quality architectures both when the initial hyperparameter seed values are far from a good solution and when refining already-good networks. For example, when starting from the architecture proposed by George and Huerta, the network optimized over the 20-dimensional hyperparameter space has 78% fewer trainable parameters while achieving an 11% increase in accuracy on our test problem. Using genetic algorithm optimization to refine an existing network should be especially useful if the problem context (e.g., the statistical properties of the noise, the signal model, etc.) changes and the network needs to be rebuilt. In all of our experiments, we find that the GA discovers significantly less complicated networks than the seed network, suggesting it can be used to prune wasteful network structures. While we have restricted our attention to CNN classifiers, our GA hyperparameter optimization strategy can be applied within other machine learning settings.
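To make the idea concrete, below is a minimal sketch (not the authors' code) of a plain genetic algorithm searching a small CNN hyperparameter space. The search space, operator settings, and the `fitness` surrogate are all illustrative assumptions; in the paper's setting the fitness of each individual would come from training a CNN classifier with those hyperparameters and scoring it (e.g., by validation accuracy, possibly penalized by the number of trainable parameters).

```python
# Hedged sketch of GA-based hyperparameter optimization for a CNN classifier.
# Hyperparameter names, value ranges, and the surrogate fitness are hypothetical.
import random

# Hypothetical hyperparameter space: name -> allowed values.
SPACE = {
    "num_conv_layers": [1, 2, 3, 4],
    "filters_per_layer": [8, 16, 32, 64],
    "kernel_size": [3, 5, 7, 9, 11],
    "dense_units": [16, 32, 64, 128],
    "learning_rate_exp": [-5, -4, -3, -2],   # learning rate = 10**x
}

def random_individual():
    """One candidate architecture: a random choice for every hyperparameter."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    # Placeholder only: in practice, build and train a CNN with these
    # hyperparameters and return, e.g., its validation accuracy, optionally
    # combined with a penalty on the number of trainable parameters.
    surrogate_acc = 1.0 / (1.0 + abs(ind["learning_rate_exp"] + 3))
    size_penalty = 1e-3 * ind["num_conv_layers"] * ind["filters_per_layer"]
    return surrogate_acc - size_penalty

def crossover(a, b):
    # Uniform crossover: each gene is inherited from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    # With probability `rate`, resample a gene from its allowed values.
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(pop_size=20, generations=10, elite=2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:elite]                                 # elitism
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(scored[:pop_size // 2], 2)     # truncation selection
            next_pop.append(mutate(crossover(p1, p2)))
        pop = next_pop
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("Best hyperparameters found:", best)
```

The abstract's comparison of six GA variants would correspond to swapping the selection, crossover, and mutation operators in a loop like this one; the "different choices for the fitness score" correspond to different definitions of `fitness`, for example accuracy alone versus accuracy traded off against network size.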